Neuron Pruning-Based Federated Learning for Communication-Efficient Distributed Training

Cited by: 0
Authors
Guan, Jianfeng [1,2]
Wang, Pengcheng [1]
Yao, Su [3]
Zhang, Jing [1]
Affiliations
[1] Beijing Univ Posts & Telecommun, Sch Comp Sci, Natl Pilot Software Engn Sch, Beijing 100876, Peoples R China
[2] Beijing Univ Posts & Telecommun, Key Lab Networking & Switching Technol, Beijing 100876, Peoples R China
[3] Tsinghua Univ, Dept Comp Sci & Technol, Beijing 100084, Peoples R China
Funding
National Key Research and Development Program of China; National Natural Science Foundation of China;
Keywords
Edge Computing; Federated Learning; Privacy Preserving; Neuron Pruning; Internet of Things; Traffic Identification;
DOI
10.1007/978-981-97-0859-8_4
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405;
Abstract
Efficient and flexible cloud computing is widely used in distributed systems. However, in Internet of Things (IoT) environments with heterogeneous capabilities, the performance of cloud computing may decline due to limited communication resources. Because it is located closer to end devices, edge computing is used in place of cloud computing to provide timely and stable services. To enable distributed training with privacy preservation, Federated Learning (FL) has been combined with edge computing. However, as the number of clients grows, so does the volume of transmitted data, and reducing the communication overhead in FL remains a major challenge. Compressing the transmitted parameters is a principal way to reduce this overhead, but existing methods do not consider the possible internal relationships between neurons. In this paper, we propose Neuron Pruning-Based Federated Learning (NPBFL) for communication-efficient distributed training, a model pruning method that compresses the model parameters transmitted in FL. In contrast to previous methods, we use a dimensionality reduction method to derive the importance factor of neurons and exploit the correlation between them to carry out model pruning. Our analysis results show that NPBFL reduces communication overhead while maintaining classification accuracy.
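Note: the abstract only outlines the approach, so the following is a minimal, hypothetical sketch of the general idea it describes (scoring neurons with a dimensionality-reduction-based importance factor that reflects correlations between neurons, then pruning low-importance neurons before transmitting an update). It is not the authors' implementation; the function names `neuron_importance` and `prune_update`, the use of PCA via SVD, and the `keep_ratio` parameter are all assumptions made for illustration.

```python
# Hypothetical sketch of dimensionality-reduction-based neuron pruning for FL updates.
# Assumptions (not from the paper): PCA/SVD as the reduction method, row-wise pruning
# of one layer's update, and a fixed keep_ratio.
import numpy as np

def neuron_importance(activations: np.ndarray, n_components: int = 8) -> np.ndarray:
    """Score each neuron (column of `activations`) by how strongly it loads on the
    top principal components of the layer's activations, which capture cross-neuron
    correlation."""
    centered = activations - activations.mean(axis=0, keepdims=True)
    _, s, vt = np.linalg.svd(centered, full_matrices=False)  # vt rows = components over neurons
    k = min(n_components, len(s))
    # Weight each component's loadings by its singular value, then aggregate per neuron.
    return (s[:k, None] * np.abs(vt[:k])).sum(axis=0)

def prune_update(weight_update: np.ndarray, scores: np.ndarray, keep_ratio: float = 0.5):
    """Zero out rows (outgoing weights) of low-importance neurons so the client only
    needs to transmit the surviving rows plus their indices."""
    n_keep = max(1, int(keep_ratio * len(scores)))
    keep = np.argsort(scores)[-n_keep:]        # indices of the most important neurons
    mask = np.zeros(len(scores), dtype=bool)
    mask[keep] = True
    pruned = weight_update * mask[:, None]     # rows of pruned neurons become zero
    return pruned, keep

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    acts = rng.normal(size=(256, 64))          # batch of activations for a 64-neuron layer
    update = rng.normal(size=(64, 32))         # local weight update for that layer
    scores = neuron_importance(acts)
    pruned, kept = prune_update(update, scores, keep_ratio=0.25)
    payload = pruned[kept]                     # only surviving rows are sent to the server
    print(f"kept {len(kept)}/{len(scores)} neurons; payload {payload.size} vs {update.size} values")
```

In this sketch the communication saving comes from sending only the surviving rows and their indices; how importance is actually computed and how pruned parameters are aggregated at the server are design choices the paper itself specifies.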
Pages: 63 - 81
Number of pages: 19