Globally linearly convergent nonlinear conjugate gradients without Wolfe line search
Numerical Algorithms (IF 2.1), Pub Date: 2024-02-09, DOI: 10.1007/s11075-024-01764-5
Arnold Neumaier, Morteza Kimiaei, Behzad Azmi

This paper introduces a measure of zigzagging strength and a minimal zigzagging direction. Based on these, a new nonlinear conjugate gradient (CG) method is proposed that works with line searches not satisfying the Wolfe conditions. Global convergence to a stationary point is proved for differentiable objective functions with Lipschitz continuous gradient, and global linear convergence is proved when this stationary point is a strong local minimizer. For approximating a stationary point, an \(\mathcal{O}(\varepsilon ^{-2})\) complexity bound is derived for the number of function and gradient evaluations. This bound improves to \(\mathcal{O}(\log \varepsilon ^{-1})\) for objective functions having a strong minimizer and no other stationary points. For strictly convex quadratic functions in n variables, the new method terminates in at most n iterations. Numerical results on the unconstrained CUTEst test problems suggest that the new method is competitive with the best state-of-the-art nonlinear CG methods proposed in the literature.
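The paper's specific zigzagging measure and update rule are not reproduced in this abstract. As a rough illustration of the setting it addresses, the sketch below shows a *generic* nonlinear CG iteration (Polak-Ribière+ update) paired with a plain Armijo backtracking line search, i.e. a line search that enforces only sufficient decrease and not the Wolfe curvature condition; this is not the authors' method, only a minimal baseline of the same algorithmic family.

```python
import numpy as np

def nonlinear_cg(f, grad, x0, tol=1e-8, max_iter=500):
    """Generic nonlinear CG with a Polak-Ribiere+ beta and an Armijo
    backtracking line search (sufficient decrease only, no Wolfe
    curvature condition). Illustrative sketch, not the paper's method."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # start with steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking: halve the step until sufficient decrease holds
        t, fx, slope = 1.0, f(x), g.dot(d)
        while f(x + t * d) > fx + 1e-4 * t * slope:
            t *= 0.5
            if t < 1e-16:
                break
        x_new = x + t * d
        g_new = grad(x_new)
        # Polak-Ribiere+ coefficient, clipped at zero (implicit restart)
        beta = max(0.0, g_new.dot(g_new - g) / g.dot(g))
        d = -g_new + beta * d
        # safeguard: fall back to steepest descent if d is not a descent direction
        if g_new.dot(d) >= 0:
            d = -g_new
        x, g = x_new, g_new
    return x
```

On a strictly convex quadratic \(f(x)=\tfrac12 x^TAx-b^Tx\) with exact line searches, linear CG terminates in at most n steps; the abstract's claim is that the new method retains this finite-termination property.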




Updated: 2024-02-10