Logit Calibration for Non-IID and Long-Tailed Data in Federated Learning
journal contribution
posted on 2024-11-17, 13:35, authored by Huan Wang, Lijuan Wang, Jun Shen
Federated learning (FL) enables collaborative training of deep models across distributed clients holding different data, without centrally aggregating raw data, thereby improving data privacy. Nevertheless, a central challenge in training classification models in a federated system is learning with non-IID data. Most existing work is dedicated to eliminating the heterogeneous influence of non-IID data in a federated system. However, in many real-world FL applications, the co-occurrence of data heterogeneity and long-tailed distribution is unavoidable: when the overall class distribution is long-tailed, local models easily become biased towards head classes, which severely harms global model performance. In this work, we also discover an intriguing fact: the classifier logit vector (i.e., pre-softmax output) introduces a heterogeneity drift during local training and global optimization, which harms both convergence and model performance. Motivated by this finding, we propose a novel logit calibration method to solve the joint problem of non-IID and long-tailed data in federated learning, called Federated Learning with Logit Calibration (FedLC). First, we present a method to mitigate local update drift by calculating the Wasserstein distance among adjacent client logits and then aggregating similar clients to regulate local training. Second, based on model ensembling, we propose a new distillation method with logit calibration and class weighting that exploits the diversity of local models trained on heterogeneous data, effectively alleviating the global drift problem under long-tailed distribution.
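The first step described above, grouping clients by the Wasserstein distance between their logits, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the greedy grouping strategy, the use of per-client mean logit vectors, and the `threshold` parameter are all assumptions made for the sketch.

```python
import numpy as np

def wasserstein_1d(u, v):
    """1-D Wasserstein-1 distance between two equal-length samples.

    For equal-sized empirical distributions this reduces to the mean
    absolute difference of the sorted samples.
    """
    u, v = np.sort(np.asarray(u, float)), np.sort(np.asarray(v, float))
    return float(np.mean(np.abs(u - v)))

def group_clients(client_logits, threshold):
    """Greedily group clients whose mean logit vectors lie within
    `threshold` Wasserstein distance of a group's representative.

    client_logits: list of (num_samples, num_classes) arrays, one per client.
    Returns a list of groups, each a list of client indices.
    """
    groups = []  # each entry: list of client indices
    reps = []    # representative mean-logit vector per group
    for idx, logits in enumerate(client_logits):
        mean_logit = np.mean(logits, axis=0)  # average logits over samples
        for g, rep in enumerate(reps):
            if wasserstein_1d(mean_logit, rep) < threshold:
                groups[g].append(idx)
                break
        else:
            groups.append([idx])
            reps.append(mean_logit)
    return groups
```

Clients whose logit distributions are close are placed in the same group, so aggregation within a group sees less heterogeneity than aggregating all clients at once.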
Finally, we evaluate FedLC in highly non-IID and long-tailed experimental settings. Comprehensive experiments on several benchmark datasets demonstrate that FedLC achieves superior performance compared with state-of-the-art FL methods, fully illustrating the effectiveness of the logit calibration strategy.
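The second step of the method, distillation with logit calibration and class weighting, can be illustrated with a generic sketch. The calibration here is the standard logit-adjustment scheme (subtracting the log class prior) and the class weights are plain inverse frequencies; the paper's exact formulas may differ, so treat every function and parameter below as an assumption.

```python
import numpy as np

def calibrate_logits(logits, class_counts, tau=1.0):
    """Shift logits by the log class prior so that tail classes receive a
    larger boost (generic logit adjustment, assumed for this sketch)."""
    prior = np.asarray(class_counts, float)
    prior = prior / prior.sum()
    return logits - tau * np.log(prior + 1e-12)

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def weighted_distill_loss(student_logits, teacher_logits, class_counts, T=2.0):
    """Temperature-scaled KL(teacher || student) with inverse-frequency
    class weights, so tail classes contribute more to the
    ensemble-distillation objective."""
    counts = np.asarray(class_counts, float)
    w = counts.sum() / (len(counts) * counts)  # inverse-frequency weights
    p = softmax(teacher_logits / T)            # teacher (ensemble) targets
    log_q = np.log(softmax(student_logits / T) + 1e-12)
    # per-class weighted KL terms, summed over classes, averaged over batch
    kl = (w * p * (np.log(p + 1e-12) - log_q)).sum(axis=-1)
    return float(kl.mean() * T * T)
```

The inverse-frequency weights counteract the head-class bias of a long-tailed distribution: a class seen ten times less often contributes roughly ten times more per matched term of the distillation loss.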
Funding
National Natural Science Foundation of China (61672415)
History
Journal title
Proceedings - 20th IEEE International Symposium on Parallel and Distributed Processing with Applications, 12th IEEE International Conference on Big Data and Cloud Computing, 12th IEEE International Conference on Sustainable Computing and Communications and 15th IEEE International Conference on Social Computing and Networking, ISPA/BDCloud/SocialCom/SustainCom 2022