
A fast primal-dual algorithm via dynamical system with variable mass for linearly constrained convex optimization

  • Original Paper
  • Optimization Letters

Abstract

We aim to solve linearly constrained convex optimization problems whose objective function is the sum of a differentiable function and a non-differentiable function. We first propose an inertial continuous primal-dual dynamical system with variable mass for linearly constrained convex optimization problems with differentiable objective functions. The dynamical system consists of a second-order differential equation with variable mass for the primal variable and a first-order differential equation for the dual variable. The fast convergence properties of the proposed dynamical system are established by constructing a suitable energy function. We then extend the results to the case where the objective function is non-differentiable and present a new accelerated primal-dual algorithm. When both the variable mass and the time scaling satisfy certain conditions, the new algorithm is proved to achieve fast convergence rates for the objective function residual and the feasibility violation. Preliminary numerical results on the \(\ell_1\)–\(\ell_2\) minimization problem demonstrate the effectiveness of our algorithm.
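As orientation for the reader, the following is a minimal sketch, not the exact system analyzed in the paper, of the type of dynamics the abstract describes. Consider the linearly constrained problem \(\min_{x} f(x)\) subject to \(Ax=b\), with Lagrangian \(\mathcal{L}(x,\lambda)=f(x)+\langle \lambda, Ax-b\rangle\). A primal-dual system combining a second-order equation with variable mass \(m(t)\) and viscous damping \(\gamma(t)\) for the primal variable with a first-order equation for the dual variable, under a time scaling \(\beta(t)\), can be written as

\[
\begin{aligned}
m(t)\,\ddot{x}(t) + \gamma(t)\,\dot{x}(t) &= -\beta(t)\bigl(\nabla f(x(t)) + A^{\top}\lambda(t)\bigr),\\
\dot{\lambda}(t) &= \beta(t)\bigl(Ax(t)-b\bigr).
\end{aligned}
\]

The coefficients \(m(t)\), \(\gamma(t)\), \(\beta(t)\) and the precise coupling (for instance, through an augmented Lagrangian or an extrapolated primal argument) are design choices here; the conditions under which such dynamics yield fast decay of the objective residual and the feasibility violation are those established in the paper. The \(\ell_1\)–\(\ell_2\) test problem mentioned above is commonly written, for a parameter \(\rho>0\) (our notation), as \(\min_{x}\,\|x\|_{1}+\tfrac{\rho}{2}\|x\|_{2}^{2}\) subject to \(Ax=b\).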

Acknowledgements

This work was supported by the National Natural Science Foundation of China (Grants 12071108 and 11671116).

Author information

Corresponding author

Correspondence to Xinwei Liu.

Ethics declarations

Data availability

All data generated or analyzed during this study are included in this article and are also available from the corresponding author on reasonable request.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

Jiang, Z., Wang, D. & Liu, X. A fast primal-dual algorithm via dynamical system with variable mass for linearly constrained convex optimization. Optim Lett (2024). https://doi.org/10.1007/s11590-023-02091-9

  • DOI: https://doi.org/10.1007/s11590-023-02091-9
