Regular Paper

Joint Participant Selection and Learning Optimization for Federated Learning of Multiple Models in Edge Cloud

Wireless Networking and Sensing (Wang) Laboratory, Department of Computer and Information Sciences, Temple University, Philadelphia, PA 19122, U.S.A.

A preliminary version of the paper was published in the Proceedings of MASS 2022.


Abstract

To overcome the long latency and privacy concerns of cloud computing, edge computing, together with distributed machine learning techniques such as federated learning (FL), has gained much attention in both academia and industry. Most existing work on FL over the edge focuses on optimizing the training of a single shared global model. However, as FL applications in edge systems proliferate, multiple FL models from different applications may be trained concurrently in a shared edge cloud. Such concurrent training leads to competition for edge resources (both computing and network), which in turn degrades the training performance of each model. Therefore, considering a multi-model FL scenario, we formulate a joint participant selection and learning optimization problem in a shared edge cloud. The joint optimization determines the participants and the learning schedule of each FL model so that the total training cost of all FL models in the edge cloud is minimized. We propose a multi-stage optimization framework that decouples the original problem into two or three subproblems, which are then solved separately and iteratively. Extensive evaluations with real-world FL datasets and models show that our proposed algorithms reduce the total training cost effectively compared with prior algorithms.
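To make the decoupling idea concrete, below is a minimal Python sketch of an alternating multi-stage optimization loop: fix the learning schedule and solve a participant-selection subproblem, then fix the participants and solve a scheduling subproblem, iterating until the total cost converges. Every name, cost term, and solver choice here (greedy selection, inverse-cost schedule weighting) is a hypothetical assumption for illustration, not the paper's actual formulation or algorithm.

```python
# Hypothetical sketch of alternating (multi-stage) optimization for
# multi-model FL in a shared edge cloud. Cost model and subproblem
# solvers are illustrative assumptions only.
import itertools
import random

random.seed(0)

NUM_EDGES = 4                 # edge nodes sharing compute/network capacity
NUM_MODELS = 3                # concurrent FL models
PARTICIPANTS_PER_MODEL = 2

# Assumed per-(model, edge) unit training cost and per-edge capacity.
cost = [[random.uniform(1.0, 5.0) for _ in range(NUM_EDGES)]
        for _ in range(NUM_MODELS)]
capacity = [2] * NUM_EDGES    # max concurrent models an edge can host

def select_participants(rates):
    """Subproblem 1: with schedule weights fixed, greedily pick the
    cheapest feasible edges for each model."""
    load = [0] * NUM_EDGES
    selection = []
    for m in range(NUM_MODELS):
        edges = sorted(range(NUM_EDGES), key=lambda e: rates[m] * cost[m][e])
        chosen = [e for e in edges if load[e] < capacity[e]][:PARTICIPANTS_PER_MODEL]
        for e in chosen:
            load[e] += 1
        selection.append(chosen)
    return selection

def schedule_learning(selection):
    """Subproblem 2: with participants fixed, rebalance each model's
    schedule weight inversely to its current per-round cost."""
    per_model = [sum(cost[m][e] for e in selection[m]) for m in range(NUM_MODELS)]
    total = sum(1.0 / c for c in per_model)
    return [(1.0 / c) / total for c in per_model]

def total_cost(selection, rates):
    return sum(rates[m] * cost[m][e]
               for m in range(NUM_MODELS) for e in selection[m])

# Alternate between the two subproblems until the cost stops improving.
rates = [1.0 / NUM_MODELS] * NUM_MODELS
best = float("inf")
for it in itertools.count():
    selection = select_participants(rates)
    rates = schedule_learning(selection)
    c = total_cost(selection, rates)
    if best - c < 1e-6:
        break
    best = c
print(f"converged after {it + 1} iteration(s), total cost = {best:.3f}")
```

The point of the sketch is the control structure: each subproblem is easy in isolation, and iterating between them drives down the joint objective even though the original coupled problem is hard.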

Electronic Supplementary Material

JCST-2301-13074-Highlights.pdf (2.7 MB)

Journal of Computer Science and Technology
Pages 754-772
Cite this article:
Wei X, Liu J, Wang Y. Joint Participant Selection and Learning Optimization for Federated Learning of Multiple Models in Edge Cloud. Journal of Computer Science and Technology, 2023, 38(4): 754-772. https://doi.org/10.1007/s11390-023-3074-4


Received: 21 March 2023
Accepted: 31 July 2023
Published: 06 December 2023
© Institute of Computing Technology, Chinese Academy of Sciences 2023