Topological regularization via persistence-sensitive optimization
Computational Geometry (IF 0.6), Pub Date: 2024-02-28, DOI: 10.1016/j.comgeo.2024.102086
Arnur Nigmetov, Aditi Krishnapriyan, Nicole Sanderson, Dmitriy Morozov

Optimization, a key tool in machine learning and statistics, relies on regularization to reduce overfitting. Traditional regularization methods control a norm of the solution to ensure its smoothness. Recently, topological methods have emerged as a way to provide a more precise and expressive control over the solution, relying on persistent homology to quantify and reduce its roughness. All such existing techniques back-propagate gradients through the persistence diagram, which is a summary of the topological features of a function. Their downside is that they provide information only at the critical points of the function. We propose a method that instead builds on persistence-sensitive simplification and translates the required changes to the persistence diagram into changes on large subsets of the domain, including both critical and regular points. This approach enables a faster and more precise topological regularization, the benefits of which we illustrate with experimental evidence.
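To make the notion of "roughness" quantified by persistent homology concrete, the sketch below (not the authors' implementation, and not taken from the paper) computes 0-dimensional sublevel-set persistence of a function sampled on a 1D grid with a union-find pass, then sums the persistences below a noise threshold, the kind of score a topological regularizer penalizes. The names persistence_pairs_1d, topological_roughness, and the threshold eps are illustrative choices.

```python
import numpy as np

def persistence_pairs_1d(f):
    """(birth, death) pairs of 0-dim sublevel-set persistence for a
    function sampled on a path graph (1D grid)."""
    n = len(f)
    order = np.argsort(f)      # process vertices from lowest to highest value
    parent = {}                # union-find forest over already-seen vertices
    birth = {}                 # component representative -> birth value

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    pairs = []
    for i in order:
        parent[i] = i
        birth[i] = f[i]
        for j in (i - 1, i + 1):             # neighbors on the path graph
            if 0 <= j < n and j in parent:
                ri, rj = find(i), find(j)
                if ri != rj:
                    # the younger component (larger birth value) dies at f[i]
                    young, old = (ri, rj) if birth[ri] > birth[rj] else (rj, ri)
                    if f[i] > birth[young]:  # skip trivial zero-persistence pairs
                        pairs.append((birth[young], f[i]))
                    parent[young] = old
    return pairs                             # the essential class never dies

def topological_roughness(f, eps):
    """Total persistence below eps: the 'noise' a regularizer would remove."""
    return sum(d - b for b, d in persistence_pairs_1d(f) if d - b < eps)

x = np.linspace(0, 1, 200)
f = np.sin(6 * np.pi * x) + 0.3 * np.random.default_rng(0).standard_normal(200)
print(topological_roughness(f, eps=0.5))
```

Back-propagating such a score through the persistence diagram only moves the two critical values defining each low-persistence pair, whereas the persistence-sensitive simplification approach described in the abstract also prescribes target values for the regular points between them.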

Updated: 2024-02-28