A privacy-preserving federated learning framework for blockchain networks
Cluster Computing (IF 4.4), Pub Date: 2024-03-02, DOI: 10.1007/s10586-024-04273-1
Youssif Abuzied, Mohamed Ghanem, Fadi Dawoud, Habiba Gamal, Eslam Soliman, Hossam Sharara, Tamer ElBatt

In this paper, we introduce a scalable, privacy-preserving federated learning framework, coined FLoBC, based on the concept of the distributed ledgers underlying blockchains. This work is motivated by the rapid growth of data worldwide, especially decentralized data, which calls for scalable, decentralized machine learning models capable of preserving the privacy of the participating users' data. Towards this objective, we first motivate and define the problem scope. We then introduce the proposed FLoBC system architecture, which hinges on a number of key pillars, namely parallelism, decentralization, and node update synchronization. In particular, we examine a number of known node update synchronization policies and analyze their performance merits and design trade-offs. Finally, we compare the proposed federated learning system to a centralized learning system baseline to demonstrate its performance merits. Our main finding is that the proposed decentralized learning framework achieves performance comparable to a classic centralized learning system while distributing the model training process across multiple nodes without sharing their actual data. This provides a scalable, privacy-preserving solution for training a variety of large machine learning models.
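To illustrate the training pattern the abstract describes (multiple nodes contributing model updates without ever sharing their raw data), the following is a minimal FedAvg-style sketch in Python. It is an illustrative assumption rather than FLoBC's actual implementation: the paper's architecture additionally involves a blockchain ledger and configurable node update synchronization policies, whereas this sketch uses a single synchronous aggregation step on a toy linear-regression task, and the function names (`local_update`, `federated_round`) are hypothetical.

```python
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One trainer's local gradient-descent pass on its private data
    (linear regression is used here purely for illustration)."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)  # MSE gradient
        w -= lr * grad
    return w

def federated_round(global_w, trainers):
    """Synchronous round: every trainer submits a model update computed on
    its own data, and the aggregator averages them into a new global model."""
    updates = [local_update(global_w, X, y) for X, y in trainers]
    return np.mean(updates, axis=0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    true_w = np.array([2.0, -1.0])

    # Each trainer holds its own private partition; raw data never leaves it.
    trainers = []
    for _ in range(4):
        X = rng.normal(size=(50, 2))
        y = X @ true_w + rng.normal(scale=0.1, size=50)
        trainers.append((X, y))

    w = np.zeros(2)
    for _ in range(20):
        w = federated_round(w, trainers)
    print("learned weights:", w)  # approaches [2, -1]
```

In the synchronous policy sketched above, the aggregator waits for every trainer before averaging; the paper's discussion of node update synchronization policies concerns the trade-offs of relaxing exactly this waiting behavior.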

Graphical abstract



