Mutual Information Driven Federated Learning

Cited by: 26
Authors
Uddin, Md Palash [1 ]
Xiang, Yong [1 ]
Lu, Xuequan [1 ]
Yearwood, John [2 ]
Gao, Longxiang [1 ]
Affiliations
[1] Deakin Univ, Deakin Blockchain Innovat Lab, Sch Informat Technol, Geelong, Vic 3220, Australia
[2] Deakin Univ, Sch Informat Technol, Geelong, Vic 3220, Australia
Keywords
Data models; Training; Computational modeling; Servers; Mathematical model; Convergence; Analytical models; Distributed learning; federated learning; parallel optimization; data parallelism; information theory; mutual information; communication bottleneck; data heterogeneity; feature selection
DOI
10.1109/TPDS.2020.3040981
CLC Number
TP301 [Theory and Methods]
Discipline Code
081202
Abstract
Federated Learning (FL) is an emerging research field that yields a globally trained model from different local clients without violating data privacy. Existing FL techniques often ignore the effective distinction between local models and the aggregated global model when performing the client-side weight update, as well as the distinctions among local models during server-side aggregation. In this article, we propose a novel FL approach based on mutual information (MI). Specifically, on the client side, the weight update is reformulated by minimizing the MI between the local and aggregated models and employing a Negative Correlation Learning (NCL) strategy. On the server side, we select the most effective models for aggregation based on the MI between each individual local model and its previous aggregated model. We also theoretically prove the convergence of our algorithm. Experiments conducted on the MNIST, CIFAR-10, ImageNet, and clinical MIMIC-III datasets show that our method outperforms state-of-the-art techniques in terms of both communication and testing performance.
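To make the two MI-driven steps above concrete, the following is a minimal, self-contained Python (NumPy) sketch under stated assumptions: MI is estimated with a simple 2-D histogram over flattened weight vectors, the client-side MI gradient is approximated with an SPSA-style zeroth-order estimate, and the server keeps the k local models with the highest MI score. All names here (mi_score, local_update, select_for_aggregation) are hypothetical illustrations, not the authors' implementation; the paper's exact MI estimator, NCL term, and selection criterion may differ.

```python
# Hypothetical sketch of MI-driven FL steps; not the paper's implementation.
import numpy as np


def mi_score(w_a, w_b, bins=32):
    """Histogram-based mutual information between two flat weight vectors.
    A crude plug-in estimator; the paper may use a different one."""
    joint, _, _ = np.histogram2d(w_a, w_b, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)  # marginal of w_a
    py = pxy.sum(axis=0, keepdims=True)  # marginal of w_b
    nz = pxy > 0                         # avoid log(0)
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())


def local_update(w_local, task_grad, w_global, rng, lr=0.1, lam=0.01, eps=1e-3):
    """Client side: one SGD step on the task loss plus a penalty pushing the
    MI between the local and aggregated models down. The MI gradient is
    approximated with an SPSA-style two-point zeroth-order estimate.
    (The paper additionally employs Negative Correlation Learning, omitted
    here for brevity.)"""
    delta = rng.choice([-1.0, 1.0], size=w_local.shape)  # random +/-1 directions
    mi_plus = mi_score(w_local + eps * delta, w_global)
    mi_minus = mi_score(w_local - eps * delta, w_global)
    mi_grad = (mi_plus - mi_minus) / (2 * eps) * delta
    return w_local - lr * (task_grad + lam * mi_grad)


def select_for_aggregation(local_models, w_prev_global, k):
    """Server side: score each local model by its MI with the previous
    aggregated model, keep the top-k, and average them. Whether 'effective'
    means high or low MI follows the paper's criterion; high MI is assumed
    here purely for illustration."""
    scores = [mi_score(w, w_prev_global) for w in local_models]
    top = np.argsort(scores)[-k:]
    return np.mean([local_models[i] for i in top], axis=0)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    w_global = rng.normal(size=1000)
    # Five synthetic "clients", each a noisy copy of the global model.
    clients = [w_global + rng.normal(scale=0.1, size=1000) for _ in range(5)]
    clients = [local_update(w, rng.normal(size=1000), w_global, rng)
               for w in clients]
    new_global = select_for_aggregation(clients, w_global, k=3)
    print("MI(new global, old global) =", mi_score(new_global, w_global))
```

The zeroth-order MI gradient is used only because a histogram estimator is not differentiable; a differentiable MI bound (e.g., a variational estimator over model outputs) would be the natural choice in a real implementation.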
Pages: 1526 - 1538
Number of pages: 13
Related Papers
50 records in total
  • [1] Federated Learning Based on Mutual Information Clustering for Wireless Traffic Prediction
    Zhang, Jianwei
    Hu, Xinhua
    Cai, Zengyu
    Zhu, Liang
    Feng, Yuan
    ELECTRONICS, 2023, 12 (21)
  • [2] A mutual information based federated learning framework for edge computing networks
    Chen, Naiyue
    Li, Yinglong
    Liu, Xuejun
    Zhang, Zhenjiang
    COMPUTER COMMUNICATIONS, 2021, 176 : 23 - 30
  • [3] Model agnostic federated mutual learning
    Zhou W.
    Li Y.
    Chen S.
    Ding B.
    Journal of National University of Defense Technology, 2023, 45 (03) : 118 - 126
  • [4] Federated Learning with Privacy-Preserving Active Learning: A Min-Max Mutual Information Approach
    Alsulaimawi, Zahir
    Institute of Electrical and Electronics Engineers Inc.
  • [5] Federated Semantic Learning Driven by Information Bottleneck for Task-Oriented Communications
    Wei, Hao
    Ni, Wanli
    Xu, Wenjun
    Wang, Fengyu
    Niyato, Dusit
    Zhang, Ping
    IEEE COMMUNICATIONS LETTERS, 2023, 27 (10) : 2652 - 2656
  • [6] Minimum Gaussian Noise Variance of Federated Learning in the Presence of Mutual Information Based Differential Privacy
    He, Hua
    He, Zheng
    IEEE ACCESS, 2023, 11 : 111212 - 111225
  • [7] Decentralized Federated Learning via Mutual Knowledge Distillation
    Huang, Yue
    Kong, Lanju
    Li, Qingzhong
    Zhang, Baochen
    2023 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA AND EXPO, ICME, 2023 : 342 - 347
  • [8] Decentralized Federated Learning via Mutual Knowledge Transfer
    Li, Chengxi
    Li, Gang
    Varshney, Pramod K.
    IEEE INTERNET OF THINGS JOURNAL, 2022, 9 (02) : 1136 - 1147
  • [9] Federated Split Learning via Mutual Knowledge Distillation
    Luo, Linjun
    Zhang, Xinglin
    IEEE TRANSACTIONS ON NETWORK SCIENCE AND ENGINEERING, 2024, 11 (03) : 2729 - 2741
  • [10] Mutual Information Driven Equivariant Contrastive Learning for 3D Action Representation Learning
    Lin, Lilang
    Zhang, Jiahang
    Liu, Jiaying
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2024, 33 : 1883 - 1897