Open Access | Just Accepted

Client to Server: Heterogeneous Distribution Knowledge Transfer for Federated Learning

Rui Zhao1, Peng Zhi1, Xiao Yang1, Zhihe Zhang2, Gang Liu1, Changyan Di1(✉), Qingguo Zhou1(✉)

1 School of Information Science and Engineering, Lanzhou University, Lanzhou 730000, China.

2 School of Fundamental Science and Engineering, Waseda University, Tokyo 169-8050, Japan. Email: zzhihe@nsl.cs.waseda.ac.jp.


Abstract

Federated learning (FL) is an emerging distributed machine learning paradigm that provides privacy guarantees for training robust models on distributed clients. The primary challenge in FL is data heterogeneity, which slows model convergence and degrades model performance. Knowledge distillation has recently proven effective in addressing this challenge. However, existing approaches neglect the statistical heterogeneity of local models and the uncertainty of the data distribution seen by the global model, so the ensemble knowledge cannot be fully exploited to guide local model learning. In this work, we propose an unsupervised knowledge distillation method that migrates local class-level knowledge to the server through a pseudo-data sampling scheme for fine-tuning the global model. Specifically, we equip each client with a conditional autoencoder and use them to maintain a dynamic generator on the server that ensembles the clients' class-level information. The proposed method produces an auxiliary dataset representing the global class-level distribution, uses it to regularize local models as an inductive knowledge bias, and employs unsupervised knowledge distillation to enhance the aggregated model's performance. Extensive experiments show that our proposal significantly outperforms current state-of-the-art FL algorithms and can be integrated as a flexible plugin into existing FL optimization algorithms to further improve model performance.
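
The pipeline the abstract outlines can be illustrated with a short sketch. The PyTorch fragment below is a minimal illustration under simplifying assumptions of our own, not the paper's specification: the names (CondDecoder, server_finetune, average_state_dicts), layer sizes, and the uniform FedAvg-style averaging of client decoders are all hypothetical, and the client and global models are assumed to be classifiers mapping a FEAT-dimensional feature to NUM_CLASSES logits. It shows the general shape of the method: client conditional decoders are ensembled into a server-side generator, class-conditional pseudo-data is sampled from it, and the client ensemble is distilled into the aggregated global model on that auxiliary set.

```python
# Minimal sketch of server-side generator ensembling + unsupervised KD.
# All names and hyperparameters are illustrative assumptions.
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

NUM_CLASSES, LATENT, FEAT = 10, 16, 32  # toy sizes, not the paper's

class CondDecoder(nn.Module):
    """Maps (latent z, one-hot class y) -> a pseudo feature vector."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(LATENT + NUM_CLASSES, 64), nn.ReLU(),
            nn.Linear(64, FEAT))

    def forward(self, z, y_onehot):
        return self.net(torch.cat([z, y_onehot], dim=1))

def average_state_dicts(models):
    """Uniform FedAvg-style parameter averaging (a stand-in for the
    paper's dynamic generator maintenance)."""
    avg = copy.deepcopy(models[0].state_dict())
    for k in avg:
        avg[k] = torch.stack([m.state_dict()[k] for m in models]).mean(0)
    return avg

def server_finetune(global_model, client_models, client_decoders,
                    rounds=100, batch=64, temp=2.0):
    # 1. Ensemble the client decoders into one server-side generator.
    generator = CondDecoder()
    generator.load_state_dict(average_state_dicts(client_decoders))
    # 2. Distill the client ensemble into the global model on pseudo-data.
    opt = torch.optim.SGD(global_model.parameters(), lr=0.01)
    for _ in range(rounds):
        y = torch.randint(0, NUM_CLASSES, (batch,))
        y_onehot = F.one_hot(y, NUM_CLASSES).float()
        z = torch.randn(batch, LATENT)
        with torch.no_grad():
            # Class-conditional pseudo-data from the ensembled generator.
            pseudo_x = generator(z, y_onehot)
            # Teacher = averaged soft predictions of the client models.
            teacher = torch.stack(
                [m(pseudo_x) for m in client_models]).mean(0)
        student = global_model(pseudo_x)
        # Unsupervised KD loss: KL between temperature-softened outputs;
        # no ground-truth labels for the pseudo-data are needed.
        loss = F.kl_div(F.log_softmax(student / temp, dim=1),
                        F.softmax(teacher / temp, dim=1),
                        reduction="batchmean") * temp ** 2
        opt.zero_grad()
        loss.backward()
        opt.step()
    return global_model
```

In the method the abstract describes, the server generator is maintained dynamically as client conditional autoencoders arrive, and the same auxiliary data also regularizes local training; the one-shot uniform averaging and server-only fine-tuning above are simplifications for readability.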
 

Tsinghua Science and Technology
Cite this article:
Zhao R, Zhi P, Yang X, et al. Client to Server: Heterogeneous Distribution Knowledge Transfer for Federated Learning. Tsinghua Science and Technology, 2025, https://doi.org/10.26599/TST.2025.9010047