Mutual Information Driven Federated Learning

Cited by: 26
Authors
Uddin, Md Palash [1 ]
Xiang, Yong [1 ]
Lu, Xuequan [1 ]
Yearwood, John [2 ]
Gao, Longxiang [1 ]
Affiliations
[1] Deakin Univ, Deakin Blockchain Innovat Lab, Sch Informat Technol, Geelong, Vic 3220, Australia
[2] Deakin Univ, Sch Informat Technol, Geelong, Vic 3220, Australia
Keywords
Data models; Training; Computational modeling; Servers; Mathematical model; Convergence; Analytical models; Distributed learning; federated learning; parallel optimization; data parallelism; information theory; mutual information; communication bottleneck; data heterogeneity; feature selection
DOI
10.1109/TPDS.2020.3040981
CLC number
TP301 [Theory and Methods]
Discipline code
081202
Abstract
Federated Learning (FL) is an emerging research field that yields a global trained model from different local clients without violating data privacy. Existing FL techniques often ignore the effective distinction between local models and the aggregated global model during the client-side weight update, as well as the distinctions among local models during server-side aggregation. In this article, we propose a novel FL approach that resorts to mutual information (MI). Specifically, on the client side, the weight update is reformulated by minimizing the MI between the local and aggregated models and employing a Negative Correlation Learning (NCL) strategy. On the server side, we select the top effective models for aggregation based on the MI between each individual local model and its previous aggregated model. We also theoretically prove the convergence of our algorithm. Experiments conducted on MNIST, CIFAR-10, ImageNet, and the clinical MIMIC-III datasets show that our method outperforms state-of-the-art techniques in terms of both communication and testing performance.
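The server-side step described in the abstract (ranking local models by their MI with the previous aggregated model and keeping the top ones) can be sketched as follows. This is an illustrative reconstruction, not the paper's actual algorithm: the histogram-based MI estimator over flattened weight vectors, the `bins` setting, and the "higher MI = more consistent, so keep" criterion are all assumptions introduced here for illustration.

```python
import numpy as np

def histogram_mutual_information(x, y, bins=16):
    """Estimate MI (in nats) between two 1-D arrays via a joint histogram.

    A simple plug-in estimator: bin the two samples jointly, then apply
    I(X;Y) = sum p(x,y) * log(p(x,y) / (p(x) p(y))) over nonempty cells.
    """
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal over rows
    py = pxy.sum(axis=0, keepdims=True)   # marginal over columns
    nz = pxy > 0                          # avoid log(0) on empty cells
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

def select_clients_by_mi(local_weights, prev_global_weights, k):
    """Hypothetical server-side selection: rank local models by the MI
    between their flattened weights and the previous aggregated model's
    weights, and keep the indices of the top-k."""
    g = np.asarray(prev_global_weights).ravel()
    scores = [histogram_mutual_information(np.asarray(w).ravel(), g)
              for w in local_weights]
    ranked = np.argsort(scores)[::-1]     # descending MI
    return ranked[:k].tolist()
```

For example, a local model whose weights track the previous global model closely scores a much higher MI than an unrelated one, so it survives the selection. The actual paper estimates MI between models in a specific way this sketch does not attempt to reproduce.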
Pages: 1526-1538
Page count: 13
Related Papers
50 items in total
  • [41] Optimistic Active Learning using Mutual Information
    Guo, Yuhong
    Greiner, Russ
    20TH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2007, : 823 - 829
  • [42] Personalized Federated Learning With Server-Side Information
    Song, Jaehun
    Oh, Min-Hwan
    Kim, Hyung-Sin
    IEEE ACCESS, 2022, 10 : 120245 - 120255
  • [43] Adversarial Mutual Information Learning for Network Embedding
    He, Dongxiao
    Zhai, Lu
    Li, Zhigang
    Jin, Di
    Yang, Liang
    Huang, Yuxiao
    Yu, Philip S.
    PROCEEDINGS OF THE TWENTY-NINTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2020, : 3321 - 3327
  • [44] Fair Representation Learning: An Alternative to Mutual Information
    Liu, Ji
    Li, Zenan
    Yao, Yuan
    Xu, Feng
    Ma, Xiaoxing
    Xu, Miao
    Tong, Hanghang
    PROCEEDINGS OF THE 28TH ACM SIGKDD CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING, KDD 2022, 2022, : 1088 - 1097
  • [45] MILCS: A Mutual Information Learning Classifier System
    Smith, R. E.
    Jiang, Max Kun
    GECCO 2007: GENETIC AND EVOLUTIONARY COMPUTATION CONFERENCE, VOL 1 AND 2, 2007, : 1877 - 1877
  • [46] Towards Mutual Trust-Based Matching For Federated Learning Client Selection
    Wehbi, Osama
    Wahab, Omar Abdel
    Mourad, Azzam
    Otrok, Hadi
    Alkhzaimi, Hoda
    Guizani, Mohsen
    2023 INTERNATIONAL WIRELESS COMMUNICATIONS AND MOBILE COMPUTING, IWCMC, 2023, : 1112 - 1117
  • [47] A Multi-Shuffler Framework to Establish Mutual Confidence for Secure Federated Learning
    Zhou, Zan
    Xu, Changqiao
    Wang, Mingze
    Kuang, Xiaohui
    Zhuang, Yirong
    Yu, Shui
    IEEE TRANSACTIONS ON DEPENDABLE AND SECURE COMPUTING, 2023, 20 (05) : 4230 - 4244
  • [48] HFML: heterogeneous hierarchical federated mutual learning on non-IID data
    Li, Yang
    Li, Jie
    Li, Kan
    ANNALS OF OPERATIONS RESEARCH, 2023
  • [49] A Cluster-Driven Adaptive Training Approach for Federated Learning
    Jeong, Younghwan
    Kim, Taeyoon
    SENSORS, 2022, 22 (18)
  • [50] User-Driven Innovation as Mutual but Asymmetrical Learning
    Marie, Anne
    Christiansen, Ellen
    INTERNATIONAL JOURNAL OF TECHNOLOGY AND HUMAN INTERACTION, 2009, 5 (03) : 1 - 12