Survey and Review
SIAM Review (IF 10.2), Pub Date: 2023-05-09, DOI: 10.1137/23n975673
Marlis Hochbruck

SIAM Review, Volume 65, Issue 2, Page 329-329, May 2023.
A point process is called self-exciting if the arrival of an event increases the probability of similar events for some period of time. Typical examples include earthquakes, which frequently cause aftershocks due to increased geological tension in their region; raised intrusion rates in the vicinity of a burglary; retweets in social media incited by a provocative posting; or trading frenzies following a huge stock order. A Hawkes process is a point process that models such self-excitement among events in time. In contrast to a Markov chain, in which the probability of each event depends only on the state attained in the previous event, in a Hawkes process the probability of further arrivals remains elevated for some period of time after each arrival (a toy simulation sketch of this mechanism is given below). The first Survey and Review paper in this issue, “Hawkes Processes Modeling, Inference, and Control: An Overview,” by Rafael Lima, discusses recent advances in Hawkes process modeling and inference. Parametric, nonparametric, deep learning, and reinforcement learning approaches are covered, and current research challenges as well as the real-world limitations of each approach are addressed. The paper should be of interest to experts in the field, but it also aims to be accessible to newcomers.

The second Survey and Review paper, “Proximal Splitting Algorithms for Convex Optimization: A Tour of Recent Advances, with New Twists,” by Laurent Condat, Daichi Kitahara, Andrés Contreras, and Akira Hirabayashi, is dedicated to the solution of convex nonsmooth optimization problems in high-dimensional spaces. The objective function $f$ is assumed to be a sum of simple convex functions $f_j$, with the property that the minimization problem for each $f_j$ is easy, whereas minimizing $f$ itself is hard. For nonsmooth functions, gradient-based optimization algorithms are not applicable; in proximal algorithms, the gradient is replaced by the so-called proximity operator. While closed forms of proximity operators are known for many functions of practical interest, there is no general closed form for the proximity operator of a sum of functions. Therefore, splitting algorithms handle the proximity operators of the functions $f_j$ individually (an illustrative splitting sketch is also given below). The paper provides a constructive and self-contained introduction to the class of proximal splitting algorithms. New variants of the algorithms under consideration are developed, and existing convergence results are revisited, unified, and, in some cases, improved. Reading the paper will be rewarding for anyone interested in high-dimensional nonsmooth convex optimization.
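To make the self-excitation mechanism concrete, here is a minimal simulation sketch (an illustration, not taken from the paper) of a univariate Hawkes process with an exponentially decaying kernel, generated by Ogata's thinning algorithm; the parameter names `mu` (baseline rate), `alpha` (excitation jump), and `beta` (decay rate) are assumptions chosen for the example.

```python
import numpy as np

def simulate_hawkes(mu, alpha, beta, T, rng=None):
    """Simulate a univariate Hawkes process with intensity
    lambda(t) = mu + alpha * sum_{t_i < t} exp(-beta * (t - t_i))
    on [0, T] via Ogata's thinning algorithm (assumes alpha < beta for stability)."""
    rng = np.random.default_rng() if rng is None else rng
    events = []
    t = 0.0
    while True:
        # Between events the exponential kernel only decays, so the current
        # intensity is a valid upper bound until the next candidate arrival.
        lam_bar = mu + alpha * sum(np.exp(-beta * (t - s)) for s in events)
        t += rng.exponential(1.0 / lam_bar)      # propose next candidate time
        if t >= T:
            break
        lam_t = mu + alpha * sum(np.exp(-beta * (t - s)) for s in events)
        if rng.uniform() <= lam_t / lam_bar:     # accept with prob lambda(t)/lam_bar
            events.append(t)                     # an accepted event raises future intensity
    return np.array(events)

# Example: clustered arrivals on [0, 100] with branching ratio alpha/beta = 2/3.
times = simulate_hawkes(mu=0.5, alpha=0.8, beta=1.2, T=100.0)
print(f"simulated {times.size} events")
```

The self-excitation shows up as clustering: each accepted arrival temporarily raises the intensity, so events tend to arrive in bursts rather than at the constant rate `mu` of a plain Poisson process.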
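Likewise, to illustrate how a splitting method calls the proximity operators of the individual terms rather than that of their sum, here is a hedged sketch (again my own illustration, not an algorithm from the paper) of the classical Douglas-Rachford iteration applied to basis pursuit, minimizing $\|x\|_1$ subject to $Ax=b$: the $\ell_1$ norm has a closed-form proximity operator (soft-thresholding) and the indicator of the affine constraint has a projection, while their sum has no simple proximity operator.

```python
import numpy as np

def prox_l1(x, gamma):
    """Closed-form proximity operator of gamma * ||.||_1 (soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - gamma, 0.0)

def proj_affine(x, A, b, AAt_inv):
    """Proximity operator of the indicator of {x : A x = b}, i.e. the projection onto it."""
    return x - A.T @ (AAt_inv @ (A @ x - b))

def douglas_rachford_basis_pursuit(A, b, gamma=1.0, n_iter=1000):
    """Douglas-Rachford splitting for min ||x||_1 s.t. A x = b.
    The sum of the two terms has no simple proximity operator,
    but the iteration only ever evaluates the two operators separately."""
    AAt_inv = np.linalg.inv(A @ A.T)           # assumes A has full row rank
    z = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = proj_affine(z, A, b, AAt_inv)      # prox of the constraint term
        y = prox_l1(2.0 * x - z, gamma)        # prox of the l1 term at the reflected point
        z = z + y - x                          # Douglas-Rachford update of the driver variable
    return proj_affine(z, A, b, AAt_inv)       # feasible point; the x-iterates converge to a solution

# Small synthetic example: basis pursuit from random measurements of a sparse vector.
rng = np.random.default_rng(0)
A = rng.standard_normal((25, 80))
x_true = np.zeros(80)
x_true[rng.choice(80, 5, replace=False)] = rng.standard_normal(5)
b = A @ x_true
x_hat = douglas_rachford_basis_pursuit(A, b)
print("feasibility:", np.linalg.norm(A @ x_hat - b))
print("recovery error:", np.linalg.norm(x_hat - x_true))
```

The design point is exactly the one stressed in the editorial: both proximity operators are cheap in closed form, and the splitting scheme alternates between them instead of requiring the (unavailable) proximity operator of the full objective.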


Updated: 2023-05-08