
Joint Participant Selection and Learning Optimization for Federated Learning of Multiple Models in Edge Cloud

  • Regular Paper
  • Published in Journal of Computer Science and Technology

Abstract

To overcome the long latency and privacy concerns of cloud computing, edge computing combined with distributed machine learning such as federated learning (FL) has gained much attention in academia and industry. Most existing work on FL over the edge focuses on optimizing the training of a single shared global model in edge systems. However, as FL applications proliferate in edge systems, multiple FL models from different applications may be trained concurrently in the same shared edge cloud. Such concurrent training leads to competition for both computing and network resources at the edge, which in turn degrades each model's training performance. Therefore, in this paper, considering a multi-model FL scenario, we formulate a joint participant selection and learning optimization problem in a shared edge cloud. This joint optimization determines the FL participants and the learning schedule for each FL model so that the total training cost of all FL models in the edge cloud is minimized. We propose a multi-stage optimization framework that decouples the original problem into two or three subproblems, which are solved separately and iteratively. Extensive evaluation has been conducted with real-world FL datasets and models. The results show that our proposed algorithms reduce the total cost efficiently compared with prior algorithms.
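To make the decouple-and-iterate idea concrete, here is a minimal, purely illustrative sketch (not the paper's actual formulation): two FL models share a pool of edge devices, a device's per-round cost grows with how many models load it, and each model's participant set is greedily reselected while the other model's choices are held fixed. All costs, the load model, and the selection rule below are hypothetical assumptions for illustration.

```python
import random

random.seed(0)

# Hypothetical setup: two FL models share an edge cloud of devices.
# Each device has a compute cost and a network cost per training round.
NUM_DEVICES = 10
MODELS = {"A": 3, "B": 3}  # participants required per model

compute_cost = [random.uniform(1, 5) for _ in range(NUM_DEVICES)]
network_cost = [random.uniform(1, 5) for _ in range(NUM_DEVICES)]


def round_cost(device, load):
    # Cost grows when a device is shared by several models,
    # modeling the edge resource competition described above.
    return (compute_cost[device] + network_cost[device]) * load


def select_participants(models, num_devices, iters=5):
    """Alternating heuristic: fix the other models' participant sets,
    then greedily reselect one model's participants against the
    resulting device load; repeat for a few rounds."""
    selection = {m: set(range(k)) for m, k in models.items()}  # naive start
    for _ in range(iters):
        for m, k in models.items():
            load = [1] * num_devices  # 1 = this model's own share
            for other, devs in selection.items():
                if other != m:
                    for d in devs:
                        load[d] += 1
            ranked = sorted(range(num_devices),
                            key=lambda d: round_cost(d, load[d]))
            selection[m] = set(ranked[:k])
    return selection


def total_cost(selection, num_devices):
    load = [0] * num_devices
    for devs in selection.values():
        for d in devs:
            load[d] += 1
    return sum(round_cost(d, load[d])
               for d in range(num_devices) if load[d])


sel = select_participants(MODELS, NUM_DEVICES)
print(sel, round(total_cost(sel, NUM_DEVICES), 2))
```

The paper's framework additionally optimizes the learning schedule of each model; this sketch only shows the participant-selection half of the alternation under a toy cost model.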



Author information

Correspondence to Yu Wang.

Supplementary Information: ESM 1 (PDF 2760 kb)


Cite this article

Wei, X., Liu, J. & Wang, Y. Joint Participant Selection and Learning Optimization for Federated Learning of Multiple Models in Edge Cloud. J. Comput. Sci. Technol. 38, 754–772 (2023). https://doi.org/10.1007/s11390-023-3074-4
