Kernel $$\ell^1$$-norm principal component analysis for denoising
Optimization Letters ( IF 1.6 ) Pub Date : 2023-09-25 , DOI: 10.1007/s11590-023-02051-3
Xiao Ling , Anh Bui , Paul Brooks

In this paper we describe a method for denoising data with kernel principal component analysis (KPCA) that recovers preimages of the intrinsic variables in the feature space using a single line search along the gradient descent direction of the squared projection error. The method combines a projection-free preimage estimation algorithm with an \(\ell^1\)-norm KPCA. These two stages offer distinct advantages over other KPCA preimage methods: they are insensitive to outliers and computationally efficient. The method can improve the results of a range of unsupervised learning tasks, such as denoising and clustering. Numerical experiments on the Amsterdam Library of Object Images and on synthetic data demonstrate that the proposed method achieves lower mean squared error than its \(\ell^2\)-norm analogue. The proposed method is applied to several datasets and the results are reported.
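For context, the baseline that the paper improves on is standard \(\ell^2\)-norm KPCA denoising, in which noisy points are projected onto the leading kernel principal components and then mapped back to input space via a learned preimage map. The sketch below is a hypothetical illustration of that baseline using scikit-learn's `KernelPCA` (with `fit_inverse_transform=True` for the preimage step); it is not the authors' \(\ell^1\)-norm, projection-free algorithm, and the dataset and parameter choices are assumptions for demonstration only.

```python
import numpy as np
from sklearn.decomposition import KernelPCA

# Hypothetical example data (not from the paper): points on a circle
# corrupted with Gaussian noise.
rng = np.random.default_rng(0)
t = rng.uniform(0, 2 * np.pi, 200)
clean = np.c_[np.cos(t), np.sin(t)]
noisy = clean + rng.normal(scale=0.2, size=clean.shape)

# Standard l2-norm KPCA denoising: project onto the leading kernel
# principal components; fit_inverse_transform=True learns a ridge
# regression from feature space back to input space (the preimage map).
kpca = KernelPCA(n_components=4, kernel="rbf", gamma=2.0,
                 fit_inverse_transform=True, alpha=0.1)
denoised = kpca.inverse_transform(kpca.fit_transform(noisy))

# Compare reconstruction error against the clean signal.
mse_noisy = np.mean((noisy - clean) ** 2)
mse_denoised = np.mean((denoised - clean) ** 2)
print(mse_noisy, mse_denoised)
```

The paper replaces both stages of this pipeline: the \(\ell^2\) component fit is swapped for an \(\ell^1\)-norm KPCA (robust to outliers), and the learned inverse map is replaced by a projection-free preimage estimate obtained from a single line search along the gradient descent direction of the squared projection error.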




Updated: 2023-09-27