Abstract
In this paper we consider the problem of finding the optimal step length for the Newton method on the class of self-concordant functions, with the decrease in function value as the criterion. We formulate this problem as an optimal control problem and use optimal control theory to solve it.
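For orientation, the baseline against which such an optimal step length is measured is the classical damped Newton method for self-concordant functions, which uses the step length 1/(1 + λ(x)), where λ(x) is the Newton decrement. The sketch below illustrates this classical scheme (not the optimal step derived in the paper) on a one-dimensional example; the function f(x) = x − ln x, its derivatives, and the function name `damped_newton_1d` are illustrative choices, not taken from the paper.

```python
import math

def damped_newton_1d(grad, hess, x0, iters=15):
    """Classical damped Newton method with step length 1/(1 + lam),
    where lam = |f'(x)| / sqrt(f''(x)) is the Newton decrement (1-D case)."""
    x = x0
    for _ in range(iters):
        g, h = grad(x), hess(x)
        lam = abs(g) / math.sqrt(h)          # Newton decrement
        x = x - (1.0 / (1.0 + lam)) * g / h  # damped Newton step
    return x

# Example: f(x) = x - ln(x) is self-concordant on x > 0 with minimizer x = 1.
x_star = damped_newton_1d(grad=lambda x: 1 - 1 / x,
                          hess=lambda x: 1 / x**2,
                          x0=0.5)
```

The damping factor 1/(1 + λ) guarantees that every iterate stays in the domain and that the function value decreases by at least λ − ln(1 + λ); the question addressed in the paper is which step length maximizes that decrease.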
Ivanova, A., Hildebrand, R. Optimal step length for the maximal decrease of a self-concordant function by the Newton method. Optim Lett 18, 847–854 (2024). https://doi.org/10.1007/s11590-023-02035-3