Federated multi-task learning (FMTL) has emerged as a promising framework for learning multiple tasks simultaneously with client-aware personalized models. While most studies have focused on the non-independent and identically distributed (Non-IID) characteristics of client datasets, the issue of task heterogeneity has largely been overlooked. Handling task heterogeneity often requires complex models, which is impractical for federated learning in resource-constrained environments. In addition, heterogeneous tasks carry differing inductive biases, which interfere with one another during aggregation and can yield biased global models. To address these issues, we propose a hierarchical FMTL framework, referred to as
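The personalization split described above, where clients share some parameters globally while keeping task-specific parameters local so that heterogeneous tasks do not interfere through the shared aggregate, can be sketched as follows. This is a minimal illustration under assumed conventions, not the paper's actual algorithm: each client model is a plain dict of named parameter lists, and the function names (`aggregate_shared`, `personalize`) and the shared/head key split are hypothetical.

```python
def aggregate_shared(client_models, shared_keys):
    # Average only the shared-layer parameters across clients (FedAvg-style
    # unweighted mean); task-specific heads are deliberately excluded, which
    # is the usual way FMTL keeps heterogeneous tasks from biasing each other.
    n = len(client_models)
    return {
        key: [sum(vals) / n for vals in zip(*(m[key] for m in client_models))]
        for key in shared_keys
    }

def personalize(client_model, global_shared):
    # Overwrite a client's shared layers with the aggregated values while
    # leaving its personalized (task-specific) parameters untouched.
    updated = dict(client_model)
    updated.update(global_shared)
    return updated

# Illustrative round with two clients holding different tasks:
clients = [
    {"shared.w": [1.0, 3.0], "head.w": [0.5]},
    {"shared.w": [3.0, 5.0], "head.w": [9.5]},
]
global_shared = aggregate_shared(clients, ["shared.w"])   # {'shared.w': [2.0, 4.0]}
client0 = personalize(clients[0], global_shared)          # head.w stays [0.5]
```

Only `shared.w` is averaged across clients; each client's `head.w` survives the round unchanged, which is what keeps the personalization client-aware.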