A distributed learning based on robust diffusion SGD over adaptive networks with noisy output data
Journal of Parallel and Distributed Computing ( IF 3.8 ) Pub Date : 2024-03-26 , DOI: 10.1016/j.jpdc.2024.104883
Fatemeh Barani , Abdorreza Savadi , Hadi Sadoghi Yazdi

Outliers and noise are unavoidable factors that can severely degrade the performance of distributed learning algorithms. Developing a robust algorithm is vital in applications such as system identification and stock-market forecasting, where noise on the desired signals can strongly bias the solutions. In this paper, we propose a Robust Diffusion Stochastic Gradient Descent (RDSGD) algorithm based on the pseudo-Huber loss function, which can significantly suppress the effect of Gaussian and non-Gaussian noise on estimation performance in adaptive networks. The performance and convergence behavior of RDSGD are assessed in the presence of α-stable and mixed-Gaussian noise in stationary and non-stationary environments. Simulation results show that the proposed algorithm achieves both a higher convergence rate and lower steady-state misadjustment than conventional diffusion algorithms and several robust algorithms.
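To illustrate the two ingredients the abstract names, the sketch below combines the pseudo-Huber loss with an adapt-then-combine diffusion update over a small network. This is a minimal illustrative sketch, not the authors' RDSGD: it assumes a scalar linear model, uniform combination weights, and a fixed step size, whereas the paper works with the general adaptive-network setting; the function names and parameters are hypothetical.

```python
import math

def pseudo_huber_loss(r: float, delta: float = 1.0) -> float:
    """Pseudo-Huber loss: quadratic for small residuals, linear for large ones."""
    return delta ** 2 * (math.sqrt(1.0 + (r / delta) ** 2) - 1.0)

def pseudo_huber_grad(r: float, delta: float = 1.0) -> float:
    """Derivative w.r.t. the residual; its magnitude is bounded by delta,
    so a single outlier cannot dominate a gradient step."""
    return r / math.sqrt(1.0 + (r / delta) ** 2)

def diffusion_sgd(neighbors, data, mu=0.2, delta=1.0, iters=300):
    """Adapt-then-combine diffusion SGD for a scalar linear model d = u * w.

    neighbors: dict node -> list of neighbor nodes (each list includes the node itself)
    data:      dict node -> list of (u, d) samples, cycled over
    Returns the per-node estimates of w.
    """
    w = {k: 0.0 for k in neighbors}
    for t in range(iters):
        # Adapt: each node takes a robust local stochastic-gradient step.
        psi = {}
        for k in neighbors:
            u, d = data[k][t % len(data[k])]
            err = d - u * w[k]
            psi[k] = w[k] + mu * u * pseudo_huber_grad(err, delta)
        # Combine: each node averages the intermediate estimates of its neighborhood.
        for k, nbrs in neighbors.items():
            w[k] = sum(psi[j] for j in nbrs) / len(nbrs)
    return w

# Two connected nodes observing noiseless data generated with w0 = 2.0.
net = {0: [0, 1], 1: [0, 1]}
samples = {0: [(1.0, 2.0)], 1: [(0.5, 1.0)]}
estimates = diffusion_sgd(net, samples)
```

The bounded gradient is what gives the robustness: under heavy-tailed (e.g. α-stable) noise, a plain least-squares gradient scales with the residual and lets one outlier derail the update, while the pseudo-Huber gradient saturates near `delta`.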

Updated: 2024-03-26