Factor-\(\sqrt{2}\) Acceleration of Accelerated Gradient Methods
Applied Mathematics and Optimization (IF 1.8) Pub Date: 2023-08-23, DOI: 10.1007/s00245-023-10047-9
Chanwoo Park, Jisun Park, Ernest K. Ryu

The optimized gradient method (OGM) provides a factor-\(\sqrt{2}\) speedup over Nesterov's celebrated accelerated gradient method in the convex (but non-strongly convex) setup. However, this improved acceleration mechanism has not been well understood; prior analyses of OGM relied on a computer-assisted proof methodology, so the proofs were opaque to humans despite being verifiable and correct. In this work, we present a new analysis of OGM based on a Lyapunov function and linear coupling. These analyses are developed and presented without the assistance of computers and are understandable by humans. Furthermore, we generalize OGM's acceleration mechanism and obtain a factor-\(\sqrt{2}\) speedup in other setups: acceleration with a simpler rational stepsize, the strongly convex setup, and the mirror descent setup.
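
To make the factor-\(\sqrt{2}\) claim concrete: since accelerated methods converge at rate \(O(1/N^2)\), a worst-case bound whose constant is roughly half as large means the same accuracy is reached with about \(1/\sqrt{2}\) as many iterations. Below is a minimal numerical sketch, not taken from the paper, comparing Nesterov's accelerated gradient method with OGM in its commonly cited Kim–Fessler form on an assumed L-smooth convex quadratic; the test problem, iteration count, and helper names are illustrative assumptions made for this sketch.

```python
# Sketch (illustrative assumptions, not the paper's experiments):
# Nesterov's accelerated gradient method (AGM) vs. the optimized gradient
# method (OGM, standard Kim-Fessler form) on f(x) = 0.5 * x^T A x, whose
# minimum value is 0, so f(y) is the suboptimality gap directly.
import numpy as np

def make_problem(n=50, seed=0):
    rng = np.random.default_rng(seed)
    M = rng.standard_normal((n, n))
    A = M.T @ M / n                     # positive semidefinite Hessian
    L = np.linalg.eigvalsh(A)[-1]       # smoothness constant = largest eigenvalue
    f = lambda x: 0.5 * x @ (A @ x)
    grad = lambda x: A @ x
    return f, grad, L, rng.standard_normal(n)

def agm(f, grad, L, x0, N):
    """Nesterov's method: 1/L gradient step, then momentum via the t_k sequence."""
    x, y, t = x0.copy(), x0.copy(), 1.0
    for _ in range(N):
        y_next = x - grad(x) / L
        t_next = (1 + np.sqrt(1 + 4 * t * t)) / 2
        x = y_next + ((t - 1) / t_next) * (y_next - y)
        y, t = y_next, t_next
    return f(y)

def ogm(f, grad, L, x0, N):
    """OGM: same gradient step, plus an extra momentum term (y_{k+1} - x_k)
    and a modified final parameter, which yields the better worst-case bound."""
    x, y, th = x0.copy(), x0.copy(), 1.0
    for i in range(N):
        y_next = x - grad(x) / L
        if i < N - 1:
            th_next = (1 + np.sqrt(1 + 4 * th * th)) / 2
        else:
            th_next = (1 + np.sqrt(1 + 8 * th * th)) / 2
        x = (y_next
             + ((th - 1) / th_next) * (y_next - y)
             + (th / th_next) * (y_next - x))
        y, th = y_next, th_next
    return f(y)

if __name__ == "__main__":
    f, grad, L, x0 = make_problem()
    N = 100
    # On a given instance the gap ratio varies, but OGM's gap is typically
    # smaller, consistent with its roughly 2x better worst-case constant.
    print("AGM suboptimality:", agm(f, grad, L, x0, N))
    print("OGM suboptimality:", ogm(f, grad, L, x0, N))
```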



Updated: 2023-08-24