Neuron Pruning-Based Federated Learning for Communication-Efficient Distributed Training

Cited: 0
Authors
Guan, Jianfeng [1 ,2 ]
Wang, Pengcheng [1 ]
Yao, Su [3 ]
Zhang, Jing [1 ]
Affiliations
[1] Beijing Univ Posts & Telecommun, Sch Comp Sci, Natl Pilot Software Engn Sch, Beijing 100876, Peoples R China
[2] Beijing Univ Posts & Telecommun, Key Lab Networking & Switching Technol, Beijing 100876, Peoples R China
[3] Tsinghua Univ, Dept Comp Sci & Technol, Beijing 100084, Peoples R China
Funding
National Natural Science Foundation of China; National Key Research and Development Program of China;
Keywords
Edge Computing; Federated Learning; Privacy Preserving; Neuron Pruning; Internet of Things; Traffic Identification;
DOI
10.1007/978-981-97-0859-8_4
CLC Classification Number
TP18 [Theory of Artificial Intelligence];
Subject Classification Code
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Efficient and flexible cloud computing is widely used in distributed systems. In Internet of Things (IoT) environments with heterogeneous device capabilities, however, its performance can degrade because communication resources are limited. Edge computing, located closer to end devices, is therefore used in place of cloud computing to provide timely and stable services. To support distributed training while preserving privacy, Federated Learning (FL) has been combined with edge computing. However, as the number of clients grows, so does the volume of transmitted data, and reducing the communication overhead in FL remains a major challenge. Compressing the transmitted parameters is a principal way to reduce this overhead, but existing methods do not consider possible internal relationships between neurons. In this paper, we propose Neuron Pruning-Based Federated Learning (NPBFL) for communication-efficient distributed training, a model pruning method that compresses the model parameters transmitted in FL. In contrast to previous methods, we use a dimensionality reduction method to derive an importance factor for each neuron and exploit the correlations between neurons to guide pruning. Our analysis results show that NPBFL reduces communication overhead while maintaining classification accuracy.
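The abstract's core idea — score neurons with a dimensionality-reduction-based importance factor, prune the least important ones, and transmit only the surviving parameters — can be sketched as follows. This is a minimal illustration, not the paper's actual algorithm: the function names are invented, and PCA projection energy is assumed here as a stand-in for whatever importance criterion NPBFL actually uses.

```python
import numpy as np

def neuron_importance(weights, n_components=2):
    """Score each output neuron (a row of the layer's weight matrix)
    by its energy in a low-dimensional PCA-style subspace. This is a
    stand-in for the dimensionality-reduction importance factor the
    abstract mentions; the paper's exact criterion may differ."""
    centered = weights - weights.mean(axis=0, keepdims=True)
    # Right singular vectors give the principal directions of the rows.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    proj = centered @ vt[:n_components].T   # (neurons, n_components)
    return (proj ** 2).sum(axis=1)          # projection energy per neuron

def prune_for_upload(weights, keep_ratio=0.5):
    """Client side: keep only the highest-importance neurons, so the
    client uploads a fraction of the full parameter matrix plus the
    indices of the kept rows."""
    scores = neuron_importance(weights)
    k = max(1, int(keep_ratio * weights.shape[0]))
    kept = np.sort(np.argsort(scores)[-k:])  # indices of kept neurons
    return kept, weights[kept]

def reconstruct(kept, rows, full_shape):
    """Server side: scatter the received rows back into a zero matrix
    of the original shape before federated aggregation."""
    full = np.zeros(full_shape)
    full[kept] = rows
    return full

rng = np.random.default_rng(0)
w = rng.normal(size=(8, 4))                  # toy 8-neuron layer
idx, sub = prune_for_upload(w, keep_ratio=0.5)
restored = reconstruct(idx, sub, w.shape)    # kept rows survive exactly
```

With `keep_ratio=0.5`, each client uploads half the rows of the weight matrix plus a short index vector, which is where the communication savings come from; exploiting inter-neuron correlation (as NPBFL does) would further refine which rows are kept.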
Pages: 63-81 (19 pages)
Related Papers
50 records in total
  • [31] Communication-Efficient Federated Learning Based on Secret Sharing and Compressed Sensing
    Chen, L.; Xiao, D.; Yu, Z.; Huang, H.; Li, M.
    Jisuanji Yanjiu yu Fazhan/Computer Research and Development, 2022, 59(11): 2395-2407
  • [32] Communication-Efficient Federated Learning via Knowledge Distillation
    Wu, Chuhan; Wu, Fangzhao; Lyu, Lingjuan; Huang, Yongfeng; Xie, Xing
    Nature Communications, 2022, 13(1)
  • [33] Federated Learning with Autotuned Communication-Efficient Secure Aggregation
    Bonawitz, Keith; Salehi, Fariborz; Konecny, Jakub; McMahan, Brendan; Gruteser, Marco
    Conference Record of the 2019 Fifty-Third Asilomar Conference on Signals, Systems & Computers, 2019: 1222-1226
  • [34] On the Design of Communication-Efficient Federated Learning for Health Monitoring
    Chu, Dong; Jaafar, Wael; Yanikomeroglu, Halim
    2022 IEEE Global Communications Conference (GLOBECOM 2022), 2022: 1128-1133
  • [35] Communication-Efficient Federated Learning for Massive MIMO Systems
    Mu, Yuchen; Garg, Navneet; Ratnarajah, Tharmalingam
    2022 IEEE Wireless Communications and Networking Conference (WCNC), 2022: 578-583
  • [36] ALS Algorithm for Robust and Communication-Efficient Federated Learning
    Hurley, Neil; Duriakova, Erika; Geraci, James; O'Reilly-Morgan, Diarmuid; Tragos, Elias; Smyth, Barry; Lawlor, Aonghus
    Proceedings of the 2024 4th Workshop on Machine Learning and Systems (EuroMLSys 2024), 2024: 56-64
  • [37] Communication-Efficient Federated Learning via Predictive Coding
    Yue, Kai; Jin, Richeng; Wong, Chau-Wai; Dai, Huaiyu
    IEEE Journal of Selected Topics in Signal Processing, 2022, 16(3): 369-380
  • [38] Communication-Efficient Wireless Traffic Prediction with Federated Learning
    Gao, Fuwei; Zhang, Chuanting; Qiao, Jingping; Li, Kaiqiang; Cao, Yi
    Mathematics, 2024, 12(16)
  • [39] Communication-Efficient Design for Quantized Decentralized Federated Learning
    Chen, Li; Liu, Wei; Chen, Yunfei; Wang, Weidong
    IEEE Transactions on Signal Processing, 2024, 72: 1175-1188
  • [40] FedHe: Heterogeneous Models and Communication-Efficient Federated Learning
    Chan, Yun Hin; Ngai, Edith C. H.
    2021 17th International Conference on Mobility, Sensing and Networking (MSN 2021), 2021: 207-214