Adaptive Variant of the Frank–Wolfe Algorithm for Convex Optimization Problems

Published in: Programming and Computer Software

Abstract

In this paper, we investigate a variant of the Frank–Wolfe method for convex optimization problems in which the step parameter is selected adaptively, based on information about the smoothness of the objective function (the Lipschitz constant of its gradient). Theoretical estimates are given for the quality of the approximate solution produced by the method with adaptively selected parameters Lk. For problems with a convex objective function over a convex feasible set, the guaranteed convergence rate of the proposed method is sublinear. A special subclass of such problems, objective functions satisfying the gradient dominance condition, is also considered, and the convergence rate of the method with adaptively selected parameters Lk is estimated. An important feature of the result is the analysis of the case in which, upon completion of an iteration, at least a twofold reduction of the residual of the objective function can be guaranteed. At the same time, the use of adaptively selected parameters in the theoretical estimates makes the method applicable to both smooth and nonsmooth problems, provided that the iteration termination criterion is satisfied. For smooth problems, the theoretical estimates of the method can be shown to be optimal up to a constant factor. Computational experiments on a number of smooth and nonsmooth problems, including a comparison with two other algorithms, demonstrate the efficiency of the proposed algorithm.
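To illustrate the idea of adaptive selection of the smoothness parameter Lk described above, the following Python sketch shows a generic Frank–Wolfe iteration with backtracking on Lk. It is a minimal illustration under assumed update rules (halving Lk between iterations, doubling it while a quadratic upper model is violated), not the exact algorithm or stopping criterion analyzed in the paper; the oracle names f, grad_f, and lmo are placeholders.

```python
import numpy as np

def adaptive_frank_wolfe(f, grad_f, lmo, x0, L0=1.0, max_iter=1000, tol=1e-8):
    """Sketch of a Frank-Wolfe method with adaptive (backtracking) estimation
    of a Lipschitz-type parameter L_k. Illustrative only; the rules of the
    paper's algorithm may differ.

    f      : objective, f(x) -> float
    grad_f : gradient oracle, grad_f(x) -> ndarray
    lmo    : linear minimization oracle, lmo(g) -> argmin_{s in C} <g, s>
    """
    x, L = np.asarray(x0, dtype=float), L0
    for _ in range(max_iter):
        g = grad_f(x)
        s = lmo(g)                    # vertex of the feasible set returned by the LMO
        d = s - x                     # Frank-Wolfe direction
        gap = -g @ d                  # Frank-Wolfe (duality) gap, used as stopping criterion
        if gap <= tol:
            break
        L = max(L / 2.0, 1e-12)       # optimistically decrease the previous estimate
        while True:
            gamma = min(1.0, gap / (L * (d @ d)))   # step minimizing the quadratic model
            x_new = x + gamma * d
            # accept L_k if the quadratic upper model holds at the trial point
            if f(x_new) <= f(x) - gamma * gap + 0.5 * L * gamma**2 * (d @ d):
                break
            L *= 2.0                  # otherwise increase the estimate and retry
        x = x_new
    return x
```

For a feasible set such as the unit simplex, the linear minimization oracle simply returns the coordinate vector corresponding to the smallest gradient component, which is what makes Frank–Wolfe-type methods projection-free.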



Funding

This work was supported by the Russian Science Foundation, project no. 21-71-30005 (https://rscf.ru/project/21-71-30005).

Author information

Corresponding authors

Correspondence to G. V. Aivazian, F. S. Stonyakin, D. A. Pasechnyk, M. S. Alkousa, A. M. Raigorodsky or I. V. Baran.

Ethics declarations

The authors declare that they have no conflicts of interest.

Additional information

Translated by Yu. Kornienko


About this article


Cite this article

Aivazian, G.V., Stonyakin, F.S., Pasechnyk, D.A. et al. Adaptive Variant of the Frank–Wolfe Algorithm for Convex Optimization Problems. Program Comput Soft 49, 493–504 (2023). https://doi.org/10.1134/S0361768823060038
