
Generalized Newton Method with Positive Definite Regularization for Nonsmooth Optimization Problems with Nonisolated Solutions

Published in Journal of Optimization Theory and Applications

Abstract

We propose a coderivative-based generalized regularized Newton method with a positive definite regularization term (GRNM-PD) for solving \(C^{1,1}\) optimization problems. In GRNM-PD, a general symmetric positive definite matrix is used to regularize the generalized Hessian, in contrast to the recently proposed GRNM, which uses the identity matrix. Our approach achieves global convergence and a fast local convergence rate even for problems with nonisolated solutions. To this end, we introduce the p-order semismooth\({}^*\) property, which plays the same role in our analysis as Lipschitz continuity of the Hessian does in the \(C^2\) case. Imposing only metric q-subregularity of the gradient at a solution, we establish global convergence of the proposed algorithm as well as its local convergence rate, which can be superlinear, quadratic, or even higher than quadratic, depending on an algorithmic parameter \(\rho \) and the regularity parameters p and q. In particular, choosing \(\rho \) to be one, we achieve a quadratic local convergence rate under metric subregularity and the strong semismooth\({^*}\) property. The algorithm is applied to a class of nonsmooth convex composite minimization problems through the machinery of the forward–backward envelope. The greater flexibility in the choice of regularization matrices leads to notable improvements in practical performance. Numerical experiments on box-constrained quadratic programming problems demonstrate the efficiency of our algorithm.
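To make the regularization scheme concrete, here is a minimal smooth-case sketch of one such iteration: a Levenberg–Marquardt-style step \((\nabla^2 f(x^k) + \mu_k B)\,d^k = -\nabla f(x^k)\) with weight \(\mu_k = c\Vert \nabla f(x^k)\Vert ^\rho \) (the choice stated in footnote 2 below) and a user-supplied positive definite matrix \(B\). This is an illustration under simplifying assumptions, not the paper's GRNM-PD: the coderivative-based generalized Hessian and the line search are omitted, and all names here are invented for the sketch.

```python
import numpy as np

def regularized_newton(grad, hess, x0, c=1.0, rho=1.0, B=None,
                       tol=1e-10, max_iter=100):
    """Smooth-case sketch of a regularized Newton iteration.

    Solves (H_k + mu_k * B) d = -g_k with mu_k = c * ||g_k||^rho,
    where B is a symmetric positive definite matrix (B = I recovers
    identity regularization). Illustrative only: no line search, and
    hess() stands in for the paper's generalized Hessian.
    """
    x = np.asarray(x0, dtype=float)
    if B is None:
        B = np.eye(x.size)
    for _ in range(max_iter):
        g = grad(x)
        gnorm = np.linalg.norm(g)
        if gnorm < tol:
            break
        mu = c * gnorm ** rho                       # regularization weight
        d = np.linalg.solve(hess(x) + mu * B, -g)   # regularized Newton step
        x = x + d                                   # full step taken here
    return x
```

On the singular quadratic \(f(x) = \tfrac{1}{2}x_1^2\), whose minimizers form the nonisolated set \(\{x_1 = 0\}\), the regularized system stays solvable and the iterates drive the gradient to zero while the free coordinate \(x_2\) is left unchanged.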


Notes

  1. Although [3, Theorem 4.2] is stated for globally defined functions, its proof also works in the local case.

  2. If \(\rho \ne 0\), this follows from the definition \(\mu _k:=c\Vert \nabla \varphi _\gamma (x^k)\Vert ^\rho \) and the fact that \(x^k\) converges to a stationary point of the FBE \(\varphi _\gamma \), which is the same as a global minimizer of the BQP (by Proposition 5.1). If \(\rho =0\), then \(\mu _k=c\) and the assumption \(\mu _k\approx 0\) may not be true. However, in our numerical experiments, we only used nonzero values of \(\rho \).

  3. This is proved in Lemma 4.3.

  4. https://github.com/MatOpt/SuiteLasso

  5. As implemented in SLEP: http://yelabs.net/software/SLEP/

  6. https://web.stanford.edu/~boyd/papers/admm/lasso/lasso.html

  7. https://www.gurobi.com
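Footnote 2 refers to stationary points of the forward–backward envelope (FBE) \(\varphi _\gamma \) for the box-constrained QP. As a hedged sketch (the function names and the specific problem data are illustrative, not from the paper), the FBE value and the forward–backward residual for a box-constrained QP \(\min \tfrac{1}{2}x^\top Qx + q^\top x\) over \([l,u]\) can be computed as follows; points with zero residual are exactly the fixed points of the forward–backward map, i.e., the stationary points of \(\varphi _\gamma \).

```python
import numpy as np

def fbe_and_residual(Q, q, l, u, x, gamma):
    """FBE value and forward-backward residual for a box-constrained QP.

    T(x) = proj_[l,u](x - gamma * grad f(x)) is the forward-backward map;
    phi_gamma(x) = f(x) + <grad f(x), T(x)-x> + ||T(x)-x||^2/(2*gamma).
    Sketch only: requires gamma < 1/||Q|| for the usual FBE properties.
    """
    g = Q @ x + q                        # grad f(x)
    z = np.clip(x - gamma * g, l, u)     # forward-backward step T(x)
    f = 0.5 * x @ Q @ x + q @ x
    phi = f + g @ (z - x) + (z - x) @ (z - x) / (2.0 * gamma)
    return phi, (x - z) / gamma          # residual vanishes at solutions
```

Iterating \(x \leftarrow T(x)\) on a small strongly convex example drives this residual to zero at the constrained minimizer, matching the equivalence of FBE stationary points and BQP solutions invoked in footnote 2.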

References

  1. Beck, A., Teboulle, M.: A fast iterative shrinkage-thresholding algorithm for linear inverse problems. SIAM J. Imaging Sci. 2(1), 183–202 (2009)

  2. Chang, C.C., Lin, C.J.: LIBSVM: a library for support vector machines. ACM Trans. Intell. Syst. Technol. 2(3), 1–27 (2011)

  3. Chieu, N., Chuong, T., Yao, J.C., Yen, N.: Characterizing convexity of a function by its Fréchet and limiting second-order subdifferentials. Set-Valued Var. Anal. 19(1), 75–96 (2011)

  4. Conn, A.R., Gould, N.I.M., Toint, P.L.: Testing a class of methods for solving minimization problems with simple bounds on the variables. Math. Comput. 50, 399–430 (1988)

  5. Dan, H., Yamashita, N., Fukushima, M.: Convergence properties of the inexact Levenberg–Marquardt method under local error bound conditions. Optim. Methods Softw. 17(4), 605–626 (2002)

  6. Ding, C., Sun, D., Sun, J., Toh, K.C.: Spectral operators of matrices: semismoothness and characterizations of the generalized Jacobian. SIAM J. Optim. 30(1), 630–659 (2020)

  7. Drusvyatskiy, D., Mordukhovich, B.S., Nghia, T.T.: Second-order growth, tilt stability, and metric regularity of the subdifferential. J. Convex Anal. 21(4), 1165–1192 (2014)

  8. Facchinei, F., Pang, J.S.: Finite-dimensional variational inequalities and complementarity problems. Springer, Berlin (2003)

  9. Fan, J.Y., Yuan, Y.X.: On the quadratic convergence of the Levenberg–Marquardt method without nonsingularity assumption. Computing 74, 23–39 (2005)

  10. Fischer, A.: Local behavior of an iterative framework for generalized equations with nonisolated solutions. Math. Program. 94(1), 91–124 (2002)

  11. Fischer, A., Shukla, P., Wang, M.: On the inexactness level of robust Levenberg–Marquardt methods. Optimization 59(2), 273–287 (2010)

  12. Gabay, D., Mercier, B.: A dual algorithm for the solution of nonlinear variational problems via finite element approximation. Comput. Math. Appl. 2(1), 17–40 (1976)

  13. Gfrerer, H.: On directional metric regularity, subregularity and optimality conditions for nonsmooth mathematical programs. Set-Valued Var. Anal. 21, 151–176 (2013)

  14. Gfrerer, H., Outrata, J.V.: On a semismooth* Newton method for solving generalized equations. SIAM J. Optim. 31(1), 489–517 (2021)

  15. Ginchev, I., Mordukhovich, B.S.: On directionally dependent subdifferentials. C. R. Acad. Bulg. Sci. 64, 497–508 (2011)

  16. Glowinski, R., Marroco, A.: Sur l’approximation, par éléments finis d’ordre un, et la résolution, par pénalisation-dualité d’une classe de problèmes de Dirichlet non linéaires. Revue Française d’Automatique, Informatique, Recherche Opérationnelle. Analyse Numérique 9(2), 41–76 (1975)

  17. Izmailov, A.F., Solodov, M.V.: Newton-type methods for optimization and variational problems. Springer, Cham (2014)

  18. Khanh, P.D., Mordukhovich, B., Phat, V.T.: A generalized Newton method for subgradient systems. Math. Oper. Res. (2022). https://doi.org/10.1287/moor.2022.1320

  19. Khanh, P.D., Mordukhovich, B.S., Phat, V.T., Tran, D.B.: Generalized damped Newton algorithms in nonsmooth optimization via second-order subdifferentials. J. Global Optim. 86(1), 93–122 (2023)

  20. Khanh, P.D., Mordukhovich, B.S., Phat, V.T., Tran, D.B.: Globally convergent coderivative-based generalized Newton methods in nonsmooth optimization. Math. Program. (2023). https://doi.org/10.1007/s10107-023-01980-2

  21. Kummer, B.: Newton’s method for non-differentiable functions. In: Guddat, J. (ed.) Advances in mathematical optimization, pp. 114–124. Akademie-Verlag, Berlin (1988)

  22. Li, D.H., Fukushima, M., Qi, L., Yamashita, N.: Regularized Newton methods for convex minimization problems with singular solutions. Comput. Optim. Appl. 28(2), 131–147 (2004)

  23. Li, X., Sun, D., Toh, K.C.: A highly efficient semismooth Newton augmented Lagrangian method for solving LASSO problems. SIAM J. Optim. 28(1), 433–458 (2018)

  24. Li, X., Sun, D., Toh, K.C.: On efficiently solving the subproblems of a level-set method for fused LASSO problems. SIAM J. Optim. 28(2), 1842–1866 (2018)

  25. Li, X., Sun, D., Toh, K.C.: An asymptotically superlinearly convergent semismooth Newton augmented Lagrangian method for linear programming. SIAM J. Optim. 30(3), 2410–2440 (2020)

  26. Lin, M., Liu, Y.J., Sun, D., Toh, K.C.: Efficient sparse semismooth Newton methods for the clustered LASSO problem. SIAM J. Optim. 29(3), 2026–2052 (2019)

  27. Luo, Z., Sun, D., Toh, K.C., Xiu, N.: Solving the OSCAR and SLOPE models using a semismooth Newton-based augmented Lagrangian method. J. Mach. Learn. Res. 20(106), 1–25 (2019)

  28. Mifflin, R.: Semismooth and semiconvex functions in constrained optimization. SIAM J. Control. Optim. 15(6), 959–972 (1977)

  29. Mordukhovich, B.S.: Sensitivity analysis in nonsmooth optimization. Theor. Aspects Ind. Des. 58, 32–46 (1992)

  30. Mordukhovich, B.S.: Variational analysis and generalized differentiation, I: Basic Theory, II: Applications, vol. 331. Springer, Berlin (2006)

  31. Mordukhovich, B.S.: Variational analysis and applications. Springer, Cham (2018)

  32. Mordukhovich, B.S., Ouyang, W.: Higher-order metric subregularity and its applications. J. Global Optim. 63(4), 777–795 (2015)

  33. Mordukhovich, B.S., Rockafellar, R.T.: Second-order subdifferential calculus with applications to tilt stability in optimization. SIAM J. Optim. 22(3), 953–986 (2012)

  34. Mordukhovich, B.S., Sarabi, M.E.: Generalized Newton algorithms for tilt-stable minimizers in nonsmooth optimization. SIAM J. Optim. 31(2), 1184–1214 (2021)

  35. Mordukhovich, B.S., Yuan, X., Zeng, S., Zhang, J.: A globally convergent proximal Newton-type method in nonsmooth convex optimization. Math. Program. 198(1), 899–936 (2023)

  36. Moré, J.J., Toraldo, G.: On the solution of large quadratic programming problems with bound constraints. SIAM J. Optim. 1(1), 93–113 (1991)

  37. Nesterov, Y.: Lectures on convex optimization. Springer, Berlin (2018)

  38. Pang, J.S., Qi, L.: A globally convergent Newton method for convex SC\(^1\) minimization problems. J. Optim. Theory Appl. 85(3), 633–648 (1995)

  39. Patrinos, P., Bemporad, A.: Proximal Newton methods for convex composite optimization. In: 52nd IEEE Conference on Decision and Control, pp. 2358–2363. IEEE (2013)

  40. Poliquin, R.A., Rockafellar, R.T.: Prox-regular functions in variational analysis. Trans. Am. Math. Soc. 348, 1805–1838 (1996)

  41. Polyak, R.A.: Regularized Newton method for unconstrained convex optimization. Math. Program. 120(1), 125–145 (2009)

  42. Qi, L., Sun, J.: A nonsmooth version of Newton’s method. Math. Program. 58(1), 353–367 (1993)

  43. Rockafellar, R.T.: Convex analysis, vol. 18. Princeton University Press, Princeton (1970)

  44. Rockafellar, R.T., Wets, R.J.-B.: Variational analysis. Springer, Berlin (2009)

  45. Sun, D.: A further result on an implicit function theorem for locally Lipschitz functions. Oper. Res. Lett. 28(4), 193–198 (2001)

  46. Sun, D., Sun, J.: Semismooth matrix-valued functions. Math. Oper. Res. 27(1), 150–169 (2002)

  47. Tang, P., Wang, C., Sun, D., Toh, K.C.: A sparse semismooth Newton based proximal majorization-minimization algorithm for nonconvex square-root-loss regression problems. J. Mach. Learn. Res. 21(226), 1–38 (2020)

  48. Tibshirani, R.: Regression shrinkage and selection via the LASSO. J. Roy. Stat. Soc.: Ser. B (Methodol.) 58(1), 267–288 (1996)

  49. Yamashita, N., Fukushima, M.: On the rate of convergence of the Levenberg–Marquardt method. In: Alefeld, G., Chen, X. (eds.) Topics in numerical analysis: with special emphasis on nonlinear problems, pp. 239–249. Springer, Berlin (2001)

  50. Yang, L., Sun, D., Toh, K.C.: SDPNAL\(+\): a majorized semismooth Newton-CG augmented Lagrangian method for semidefinite programming with nonnegative constraints. Math. Program. Comput. 7(3), 331–366 (2015)

  51. Yue, M.C., Zhou, Z., So, A.M.C.: A family of inexact SQA methods for non-smooth convex minimization with provable convergence guarantees based on the Luo–Tseng error bound property. Math. Program. 174(1), 327–358 (2019)

  52. Zhang, Y., Zhang, N., Sun, D., Toh, K.C.: An efficient Hessian based algorithm for solving large-scale sparse group LASSO problems. Math. Program. 179(1), 223–263 (2020)

  53. Zhou, G., Qi, L.: On the convergence of an inexact Newton-type method. Oper. Res. Lett. 34(6), 647–652 (2006)


Acknowledgements

The authors are very grateful to the editor and the referees for their insightful and constructive comments, which led to improvements in the paper. This work is supported by the National Natural Science Foundation of China (Nos. 12061013 and 11601095), the Natural Science Foundation of Guangxi Province (2016GXNSFBA380185), the Training Plan of Thousands of Young and Middle-aged Backbone Teachers in Colleges and Universities of Guangxi, and the Special Foundation for Guangxi Ba Gui Scholars.

Author information

Corresponding author: Miantao Chao.


Communicated by Alexey F. Izmailov.



About this article


Cite this article

Shi, Z., Chao, M. Generalized Newton Method with Positive Definite Regularization for Nonsmooth Optimization Problems with Nonisolated Solutions. J Optim Theory Appl 201, 396–432 (2024). https://doi.org/10.1007/s10957-024-02402-9

