Communication-efficient federated learning via personalized filter pruning

Cited: 0
Authors
Min, Qi [1]
Luo, Fei [1]
Dong, Wenbo [1]
Gu, Chunhua [1]
Ding, Weichao [1]
Affiliations
[1] East China Univ Sci & Technol, Sch Informat Sci & Engn, 130 Meilong Rd, Shanghai 200237, Peoples R China
Keywords
Federated learning; Model compression; Filter pruning; Efficient training
DOI
10.1016/j.ins.2024.121030
Chinese Library Classification
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
With the popularity of mobile devices and the continuous growth of interactive data, FL (Federated Learning) has gradually become an effective means of addressing privacy leakage and data silos. However, due to the heterogeneity and imbalance of participants, FL faces many challenges, including model accuracy, security, heterogeneous devices and data, privacy preservation, and communication overhead and efficiency. To address the high communication overhead and low model accuracy in FL, we introduce model pruning into the FL framework and propose a personalized filter-pruning-based FL method named PF2Learning (Personalized Filter Pruning Federated Learning). The method performs personalized pruning on each local model so that every participant's model is pruned appropriately. Specifically, on the device side, we use a pruning strategy based on the norm geometric median that also accounts for the order dependence between adjacent layers and evaluates the contribution of the filters involved, optimizing the pruning of local models under a unified pruning rate. On the server side, we compute the unified pruning ratio from the contributions of participants' model filters and their training errors, keeping the participants' model structures consistent. To validate the effectiveness of the proposed method, we conducted federated image classification tasks on ResNet-18, VGG-11, DenseNet-121, and InceptionNet-V1 using the CIFAR-10, FEMNIST, and ImageNet datasets. The experimental results show that PF2Learning outperforms most FL pruning methods and achieves better model performance and accuracy.
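The device-side criterion can be illustrated with a small sketch. The snippet below is a minimal, hypothetical illustration (not the authors' implementation) of geometric-median-style filter scoring: filters whose weights lie close to the rest of the layer's filters are treated as redundant and pruned first. The function names, the NumPy-only setup, and the fixed pruning rate are assumptions for illustration; the adjacent-layer order dependence and the server-side unified pruning ratio described in the abstract are not modeled here.

```python
# Illustrative sketch of geometric-median-based filter scoring (assumed names,
# not the paper's code): filters with the smallest total distance to the other
# filters in the layer are considered redundant and are pruned first.
import numpy as np

def score_filters(conv_weights: np.ndarray) -> np.ndarray:
    """Score each filter by its summed Euclidean distance to all other filters.

    conv_weights: array of shape (num_filters, in_channels, k, k).
    Low scores indicate filters near the layer's geometric median, i.e. the
    most redundant candidates for pruning.
    """
    flat = conv_weights.reshape(conv_weights.shape[0], -1)
    # Pairwise Euclidean distances between flattened filters.
    dists = np.linalg.norm(flat[:, None, :] - flat[None, :, :], axis=-1)
    return dists.sum(axis=1)

def prune_mask(conv_weights: np.ndarray, prune_rate: float) -> np.ndarray:
    """Boolean mask that keeps the filters farthest from the geometric median."""
    scores = score_filters(conv_weights)
    num_prune = int(prune_rate * len(scores))
    keep = np.ones(len(scores), dtype=bool)
    if num_prune > 0:
        keep[np.argsort(scores)[:num_prune]] = False
    return keep

# Example: prune 30% of the filters of a random 64-filter 3x3 conv layer.
weights = np.random.randn(64, 32, 3, 3)
mask = prune_mask(weights, prune_rate=0.3)
print(mask.sum(), "filters kept out of", len(mask))
```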
Pages: 19
Related Papers
50 records in total
  • [1] FedADP: Communication-Efficient by Model Pruning for Federated Learning
    Liu, Haiyang
    Shi, Yuliang
    Su, Zhiyuan
    Zhang, Kun
    Wang, Xinjun
    Yan, Zhongmin
    Kong, Fanyu
    [J]. IEEE CONFERENCE ON GLOBAL COMMUNICATIONS, GLOBECOM, 2023, : 3093 - 3098
  • [2] Communication-Efficient and Personalized Federated Lottery Ticket Learning
    Seo, Sejin
    Ko, Seung-Woo
    Park, Jihong
    Kim, Seong-Lyun
    Bennis, Mehdi
[J]. SPAWC 2021: 2021 IEEE 22ND INTERNATIONAL WORKSHOP ON SIGNAL PROCESSING ADVANCES IN WIRELESS COMMUNICATIONS (IEEE SPAWC 2021), 2021, : 581 - 585
  • [3] Communication-efficient Federated Learning with Cooperative Filter Selection
    Yang, Zhao
    Sun, Qingshuang
    [J]. 2022 IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS (ISCAS 22), 2022, : 2172 - 2176
  • [4] Communication-Efficient Personalized Federated Learning With Privacy-Preserving
    Wang, Qian
    Chen, Siguang
    Wu, Meng
    [J]. IEEE TRANSACTIONS ON NETWORK AND SERVICE MANAGEMENT, 2024, 21 (02): : 2374 - 2388
  • [5] DisPFL: Towards Communication-Efficient Personalized Federated Learning via Decentralized Sparse Training
    Dai, Rong
    Shen, Li
    He, Fengxiang
    Tian, Xinmei
    Tao, Dacheng
[J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 162, 2022
  • [6] IoT Device Friendly and Communication-Efficient Federated Learning via Joint Model Pruning and Quantization
    Prakash, Pavana
    Ding, Jiahao
    Chen, Rui
    Qin, Xiaoqi
    Shu, Minglei
    Cui, Qimei
    Guo, Yuanxiong
    Pan, Miao
    [J]. IEEE INTERNET OF THINGS JOURNAL, 2022, 9 (15): : 13638 - 13650
  • [7] Communication-efficient federated learning via knowledge distillation
    Wu, Chuhan
    Wu, Fangzhao
    Lyu, Lingjuan
    Huang, Yongfeng
    Xie, Xing
    [J]. NATURE COMMUNICATIONS, 2022, 13 (01)
  • [8] Communication-efficient federated learning
    Chen, Mingzhe
    Shlezinger, Nir
    Poor, H. Vincent
    Eldar, Yonina C.
    Cui, Shuguang
    [J]. PROCEEDINGS OF THE NATIONAL ACADEMY OF SCIENCES OF THE UNITED STATES OF AMERICA, 2021, 118 (17)
  • [9] Communication-Efficient Federated Learning via Predictive Coding
    Yue, Kai
    Jin, Richeng
    Wong, Chau-Wai
    Dai, Huaiyu
    [J]. IEEE JOURNAL OF SELECTED TOPICS IN SIGNAL PROCESSING, 2022, 16 (03) : 369 - 380
  • [10] Communication-Efficient and Model-Heterogeneous Personalized Federated Learning via Clustered Knowledge Transfer
    Cho, Yae Jee
    Wang, Jianyu
    Chirvolu, Tarun
    Joshi, Gauri
    [J]. IEEE JOURNAL OF SELECTED TOPICS IN SIGNAL PROCESSING, 2023, 17 (01) : 234 - 247