Algorithms with Gradient Clipping for Stochastic Optimization with Heavy-Tailed Noise
Doklady Mathematics (IF 0.6), Pub Date: 2024-03-11, DOI: 10.1134/s1064562423701144
M. Danilova

Abstract

This article surveys the results of several research studies [12–14, 26] that gradually addressed open questions in the high-probability convergence analysis of stochastic first-order optimization methods under mild assumptions on the noise. We begin by introducing the concept of gradient clipping, which plays a pivotal role in designing stochastic methods that operate reliably under heavy-tailed noise distributions. Next, we examine the importance of obtaining high-probability convergence guarantees and their connection with in-expectation guarantees. The concluding sections present the primary findings for minimization problems and the results of numerical experiments.
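Concretely, the clipping operator used throughout this line of work is the standard norm clipping clip(g, λ) = min(1, λ/‖g‖₂) · g, which leaves short gradients untouched and rescales long ones to norm λ. The sketch below illustrates this operator together with a basic clipped-SGD iteration x_{k+1} = x_k − γ · clip(g_k, λ); the `grad_oracle` interface, the quadratic test problem, and the Student-t noise in the demo are illustrative assumptions, not the exact setup of the surveyed papers.

```python
import numpy as np

def clip(g, lam):
    """Norm clipping: returns min(1, lam / ||g||_2) * g."""
    norm = np.linalg.norm(g)
    return g if norm <= lam else (lam / norm) * g

def clipped_sgd(grad_oracle, x0, gamma, lam, n_steps):
    """Clipped-SGD sketch: x_{k+1} = x_k - gamma * clip(g_k, lam),
    where g_k = grad_oracle(x_k) is a stochastic gradient sample."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_steps):
        g = grad_oracle(x)
        x = x - gamma * clip(g, lam)
    return x

# Illustrative demo (hypothetical setup): minimize f(x) = 0.5 * ||x||^2
# under heavy-tailed gradient noise (Student-t with df=2 has infinite variance).
rng = np.random.default_rng(0)
oracle = lambda x: x + rng.standard_t(df=2, size=x.shape)
x_final = clipped_sgd(oracle, x0=np.ones(10), gamma=0.05, lam=1.0, n_steps=2000)
print(np.linalg.norm(x_final))  # small: clipping tames the rare huge noise samples
```

In the surveyed results, the clipping level and step size are not fixed constants as in this demo but are chosen as functions of the iteration horizon and the desired confidence level; the sketch only conveys the mechanics of the operator.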


