Automatic Differentiation-Based Multi-Start for Gradient-Based Optimization Methods
Mathematics (IF 2.4) Pub Date: 2024-04-17, DOI: 10.3390/math12081201
Francesco Della Santa
In global optimization problems, diversification approaches are often necessary to overcome convergence toward local optima. One such approach is the multi-start method, in which a set of different starting configurations is considered and the best local minimum returned by the multiple optimization procedures is designated as the (possible) global optimum. Because multi-start requires running many optimization procedures, parallelization is crucial for its efficiency. In this work, we present a new multi-start approach for gradient-based optimization methods that exploits reverse-mode Automatic Differentiation to perform efficiently. In particular, at each step, this Automatic Differentiation-based method computes the N gradients of the N optimization procedures extremely quickly, exploiting the implicit parallelization guaranteed by the computational-graph representation of the multi-start problem. The practical advantages of the proposed method are illustrated by analyzing its time complexity from a theoretical point of view and by showing numerical examples where the speed-up over classic parallelization methods ranges from ×40 to ×100. Moreover, we show that our AD-based multi-start approach can be implemented with tailored shallow Neural Networks, taking advantage of the built-in optimization procedures of Deep Learning frameworks.
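The batched-gradient idea behind the method can be sketched in a few lines of a Deep Learning framework. The following is a minimal sketch only, not the paper's implementation: it assumes PyTorch as the AD engine, and the Rastrigin test function, the number of starts N, the learning rate, and the iteration count are all illustrative choices rather than values from the paper.

```python
import torch

def rastrigin(x):
    # Classic multimodal test function; x has shape (N, d).
    return 10 * x.shape[-1] + (x ** 2 - 10 * torch.cos(2 * torch.pi * x)).sum(dim=-1)

N, d = 64, 2  # N parallel starts in d dimensions

# All N starting points live in one (N, d) tensor, so the whole
# multi-start is represented by a single computational graph.
X = torch.empty(N, d).uniform_(-5.12, 5.12).requires_grad_(True)

opt = torch.optim.SGD([X], lr=0.01)
for _ in range(2000):
    opt.zero_grad()
    # Summing the N objective values gives a scalar whose gradient
    # w.r.t. X stacks the N individual gradients: one reverse-mode
    # sweep therefore advances all N descent procedures at once.
    rastrigin(X).sum().backward()
    opt.step()

# The best local minimum over the N runs is the global-optimum candidate.
with torch.no_grad():
    vals = rastrigin(X)
    best = vals.argmin()
    print("best candidate:", X[best].tolist(), "f =", vals[best].item())
```

The key design point is the `.sum()`: since each starting point influences only its own objective value, the gradient of the summed loss with respect to the stacked tensor X is exactly the stack of the N per-start gradients, which is the implicit parallelization the abstract refers to. The tailored shallow-Neural-Network implementation mentioned in the abstract packages the same computation so that the built-in optimizers of Deep Learning frameworks can drive it directly.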
