On the hybridization of geometric semantic GP with gradient-based optimizers
Genetic Programming and Evolvable Machines (IF 2.6) Pub Date: 2023-10-28, DOI: 10.1007/s10710-023-09463-1
Gloria Pietropolli, Luca Manzoni, Alessia Paoletti, Mauro Castelli

Geometric semantic genetic programming (GSGP) is a popular form of GP in which the effect of crossover and mutation can be expressed as geometric operations on a semantic space. A recent study showed that GSGP can be hybridized with Adam, a standard gradient-based optimizer commonly used in training artificial neural networks. We expand upon that work by considering more gradient-based optimizers, a deeper investigation of their parameters, different ways of performing the hybridization, and a more comprehensive set of benchmark problems. With the correct choice of hyperparameters, this hybridization improves the performance of GSGP and allows it to reach the same fitness values with fewer fitness evaluations.
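To make the framing above concrete, the sketch below illustrates, under assumptions not taken from the paper, how geometric semantic mutation can be expressed directly on semantics vectors and how a gradient-based optimizer such as Adam can tune the mutation step: because the offspring's semantics are differentiable with respect to the step `ms`, Adam can adjust `ms` to reduce the training error. All names (`gsm_semantics`, `adam_step`, the toy data) are hypothetical illustrations, not the authors' implementation.

```python
# Minimal sketch (not the authors' code): geometric semantic mutation applied
# to semantics vectors, with an Adam-style update of the mutation step `ms`.
import numpy as np

def gsm_semantics(parent_sem, r1_sem, r2_sem, ms):
    """Geometric semantic mutation seen in semantic space:
    offspring semantics = parent + ms * (r1 - r2)."""
    return parent_sem + ms * (r1_sem - r2_sem)

def mse(sem, target):
    """Mean squared error of a semantics vector against the targets."""
    return np.mean((sem - target) ** 2)

def adam_step(ms, grad, state, lr=0.01, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam update of the scalar mutation step `ms`."""
    state["t"] += 1
    state["m"] = b1 * state["m"] + (1 - b1) * grad
    state["v"] = b2 * state["v"] + (1 - b2) * grad ** 2
    m_hat = state["m"] / (1 - b1 ** state["t"])
    v_hat = state["v"] / (1 - b2 ** state["t"])
    return ms - lr * m_hat / (np.sqrt(v_hat) + eps)

rng = np.random.default_rng(0)
target = rng.normal(size=50)                        # toy training targets
parent = rng.normal(size=50)                        # semantics of the parent
r1, r2 = rng.normal(size=50), rng.normal(size=50)   # semantics of random trees

ms = 0.1
state = {"t": 0, "m": 0.0, "v": 0.0}
for _ in range(100):
    child = gsm_semantics(parent, r1, r2, ms)
    # d MSE / d ms = 2 * mean((child - target) * (r1 - r2))
    grad = 2.0 * np.mean((child - target) * (r1 - r2))
    ms = adam_step(ms, grad, state)

print("tuned ms:", ms, "MSE:", mse(gsm_semantics(parent, r1, r2, ms), target))
```

The same idea extends to tuning many such coefficients over an entire GSGP run; the paper investigates which optimizers, hyperparameters, and ways of interleaving the gradient steps with the evolutionary search make this hybridization pay off.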




Updated: 2023-10-30