Convergence analysis of block majorize-minimize subspace approach
Optimization Letters (IF 1.6), Pub Date: 2023-09-16, DOI: 10.1007/s11590-023-02055-z
Emilie Chouzenoux, Jean-Baptiste Fest

We consider the minimization of a differentiable function F defined on \({\mathbb {R}}^N\), with Lipschitz-continuous gradient but not necessarily convex. We propose an accelerated gradient descent approach which combines three strategies, namely (i) a variable metric derived from the majorization-minimization principle; (ii) a subspace strategy incorporating information from past iterates; (iii) a block-alternating update. Under the assumption that F satisfies the Kurdyka–Łojasiewicz property, we give conditions under which the sequence generated by the resulting block majorize-minimize subspace algorithm converges to a critical point of the objective function, and we exhibit convergence rates for its iterates.
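The interaction of ingredients (i) and (ii) can be illustrated with a minimal single-block sketch: at each iterate, a quadratic majorant of F (in some metric A(x_k)) is minimized exactly over a low-dimensional subspace built from past iterates, here the classical memory-gradient subspace spanned by the negative gradient and the previous displacement. This is an assumption-laden simplification, not the authors' full block algorithm (no block alternation is shown, and the function names `mm_subspace_minimize` and `majorant_metric` are our own); for the quadratic toy problem below, the curvature matrix H is itself a valid majorant metric.

```python
import numpy as np

def mm_subspace_minimize(grad, majorant_metric, x0, iters=20, tol=1e-12):
    """Simplified one-block MM subspace iteration: minimize the quadratic
    majorant of F exactly over the memory-gradient subspace
    span{-grad F(x_k), x_k - x_{k-1}} at each step."""
    x_prev = x0.copy()
    x = x0.copy()
    for _ in range(iters):
        g = grad(x)
        if np.linalg.norm(g) < tol:        # stationary point reached
            break
        D = np.column_stack([-g, x - x_prev])  # subspace basis D_k
        A = majorant_metric(x)                 # majorant metric A(x_k)
        B = D.T @ A @ D
        # Exact minimizer of u -> g^T D u + 0.5 u^T B u over the subspace
        # coefficients; pinv handles the rank-deficient first iteration,
        # where x_k - x_{k-1} = 0.
        u = -np.linalg.pinv(B) @ (D.T @ g)
        x_prev, x = x, x + D @ u
    return x

# Toy check on a strongly convex quadratic F(x) = 0.5 x^T H x - b^T x,
# whose Hessian H is a tight majorant metric.
rng = np.random.default_rng(0)
M = rng.standard_normal((6, 4))
H = M.T @ M + 0.1 * np.eye(4)
b = rng.standard_normal(4)
x_star = np.linalg.solve(H, b)  # exact minimizer, for comparison
x_hat = mm_subspace_minimize(lambda x: H @ x - b, lambda x: H, np.zeros(4))
print(np.linalg.norm(x_hat - x_star))
```

On a quadratic with the tight majorant, exact subspace minimization over the memory-gradient directions recovers conjugate-gradient-like behavior, so the iterates reach the minimizer in at most N steps; the point of the MM viewpoint is that the same scheme remains well defined, and (as the paper shows under the KL property) convergent, for non-convex F with a suitable majorant metric.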



Updated: 2023-09-16