Federated learning has emerged as a distributed learning paradigm in which models are trained locally at each client and aggregated at a parameter server. Under system heterogeneity, stragglers cannot respond to the server in time and incur heavy communication costs. Although client grouping can mitigate the straggler problem, the stochastic selection strategy within each group neglects the impact of the group's data distribution. Moreover, current client grouping approaches subject clients to unfair participation, which biases performance across clients. To guarantee fair client participation and mitigate biased local performance, we propose a federated dynamic client selection method based on data representativity (FedSDR). FedSDR clusters clients into groups according to their local computational efficiency. To estimate the significance of each client's dataset, we design a novel data representativity evaluation scheme based on the local data distribution. The two most representative clients in each group are then selected to optimize the global model. Finally, the DYNAMIC-SELECT algorithm updates the local computational efficiency and data representativity states and regroups clients after each periodic average aggregation. Evaluations on real datasets show that FedSDR improves client participation by 27.4%, 37.9%, and 23.3% over FedAvg, TiFL, and FedSS, respectively, thereby accounting for fairness in federated learning. In addition, FedSDR reduces the variance of local test accuracy by 21.32%, 20.4%, and 6.90% compared with FedAvg, FedGS, and FedMS, respectively, balancing the performance bias of the global model across clients.
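To make the pipeline concrete, the following is a minimal Python sketch of the selection loop the abstract describes. It is not the authors' implementation: the representativity metric (total-variation closeness of a client's label distribution to the global one), the sort-and-slice grouping, and all names (`group_clients`, `select_top_k`, `dynamic_select`) are illustrative assumptions; only the overall flow (efficiency-based grouping, top-two selection per group, periodic regrouping) follows the abstract.

```python
import random

def representativity(label_counts, global_counts):
    """Hypothetical representativity score (the abstract does not give the
    exact metric): closeness of a client's label distribution to the global
    distribution, measured as 1 minus the total-variation distance."""
    n_c = sum(label_counts.values())
    n_g = sum(global_counts.values())
    tv = 0.5 * sum(abs(label_counts.get(lbl, 0) / n_c - cnt / n_g)
                   for lbl, cnt in global_counts.items())
    return 1.0 - tv

def group_clients(clients, num_groups):
    """Group clients with similar local computational efficiency
    (a sort-and-slice stand-in for the paper's clustering step)."""
    ordered = sorted(clients, key=lambda c: c["efficiency"])
    size = -(-len(ordered) // num_groups)  # ceiling division
    return [ordered[i:i + size] for i in range(0, len(ordered), size)]

def select_top_k(groups, k=2):
    """Select the k most representative clients in each group
    (FedSDR picks the two highest-scoring clients per group)."""
    chosen = []
    for group in groups:
        ranked = sorted(group, key=lambda c: c["score"], reverse=True)
        chosen.extend(ranked[:k])
    return chosen

def dynamic_select(clients, global_counts, rounds=3, num_groups=3):
    """DYNAMIC-SELECT sketch: after each periodic aggregation, refresh the
    efficiency and representativity states, then regroup and reselect."""
    for rnd in range(rounds):
        for c in clients:
            c["score"] = representativity(c["labels"], global_counts)
        groups = group_clients(clients, num_groups)
        participants = select_top_k(groups, k=2)
        print(f"round {rnd}: selected clients {[c['id'] for c in participants]}")
        # Placeholder for local training and server-side averaging; the
        # perturbation below merely emulates changing system conditions.
        for c in clients:
            c["efficiency"] *= random.uniform(0.8, 1.2)

if __name__ == "__main__":
    random.seed(0)
    labels = list(range(5))
    clients = [{"id": i,
                "efficiency": random.random(),
                "labels": {l: random.randint(0, 20) for l in labels}}
               for i in range(12)]
    global_counts = {l: sum(c["labels"][l] for c in clients) for l in labels}
    dynamic_select(clients, global_counts)
```

Ranking within efficiency-homogeneous groups is what separates the two concerns: grouping absorbs system heterogeneity (stragglers only compete with similarly slow peers), while the representativity score handles statistical heterogeneity within each group.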