Constant optimization and feature standardization in multiobjective genetic programming
Genetic Programming and Evolvable Machines (IF 2.6) Pub Date: 2021-08-19, DOI: 10.1007/s10710-021-09410-y
Peter Rockett

This paper extends the numerical tuning of tree constants in genetic programming (GP) to the multiobjective domain. Using ten real-world benchmark regression datasets and employing Bayesian comparison procedures, we first consider the effects of feature standardization (without constant tuning) and conclude that standardization generally produces lower test errors; contrary to other recently published work, however, we find a much less clear trend for tree sizes. In addition, we consider the effects of constant tuning, both with and without feature standardization, and observe that constant tuning (1) invariably improves test error and (2) usually decreases tree size. Combined with standardization, constant tuning produces the best test error results; tree sizes, however, are increased. We also examine the effects of applying constant tuning only once, at the end of a conventional GP run, which turns out to be surprisingly promising. Finally, we consider the merits of using numerical procedures to tune tree constants and observe that for around half the datasets evolutionary search alone is superior, whereas for the remaining half parameter tuning is superior. We identify a number of open research questions that arise from this work.
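
To make the two techniques concrete, below is a minimal, hypothetical Python sketch of (a) z-score feature standardization and (b) numerical tuning of the constants embedded in a fixed GP expression tree using a local optimizer. The example tree, synthetic dataset, and SciPy-based tuning routine are illustrative assumptions only, not the procedure used in the paper.

# Hypothetical sketch: feature standardization + numerical tuning of the
# constants of a fixed GP expression tree. All names and the example tree
# are assumptions for illustration.
import numpy as np
from scipy.optimize import minimize

def standardize(X):
    """Z-score each feature column (zero mean, unit variance)."""
    mu, sigma = X.mean(axis=0), X.std(axis=0)
    sigma[sigma == 0.0] = 1.0          # guard against constant features
    return (X - mu) / sigma

# Example GP individual: f(x) = c0 * x0 + sin(c1 * x1), with tunable c0, c1.
def tree_eval(consts, X):
    c0, c1 = consts
    return c0 * X[:, 0] + np.sin(c1 * X[:, 1])

def mse(consts, X, y):
    """Training error as a function of the tree's constants only."""
    return np.mean((tree_eval(consts, X) - y) ** 2)

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2)) * [10.0, 0.1] + [5.0, 2.0]   # unscaled features
y = 3.0 * (X[:, 0] - 5.0) / 10.0 + np.sin(0.5 * X[:, 1])

Xs = standardize(X)                        # feature standardization
init = np.array([1.0, 1.0])                # constants as left by evolution
result = minimize(mse, init, args=(Xs, y), method="Nelder-Mead")
print("tuned constants:", result.x, "tuned MSE:", result.fun)

In this sketch the tree structure is frozen and only its constants are adjusted, which mirrors the idea of treating constant tuning as a numerical optimization nested inside the evolutionary search, or applied once at the end of a conventional GP run as discussed in the abstract.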




Updated: 2021-08-19