Neuron Pruning-Based Federated Learning for Communication-Efficient Distributed Training

Cited by: 0
Authors
Guan, Jianfeng [1 ,2 ]
Wang, Pengcheng [1 ]
Yao, Su [3 ]
Zhang, Jing [1 ]
Affiliations
[1] Beijing Univ Posts & Telecommun, Sch Comp Sci, Natl Pilot Software Engn Sch, Beijing 100876, Peoples R China
[2] Beijing Univ Posts & Telecommun, Key Lab Networking & Switching Technol, Beijing 100876, Peoples R China
[3] Tsinghua Univ, Dept Comp Sci & Technol, Beijing 100084, Peoples R China
Funding
National Natural Science Foundation of China; National Key R&D Program of China;
Keywords
Edge Computing; Federated Learning; Privacy Preserving; Neuron Pruning; Internet of Things; Traffic Identification;
DOI
10.1007/978-981-97-0859-8_4
CLC Number
TP18 [Theory of Artificial Intelligence];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Efficient and flexible cloud computing is widely used in distributed systems. In Internet of Things (IoT) environments with heterogeneous device capabilities, however, the performance of cloud computing may degrade because communication resources are limited. Edge computing, located closer to end devices, can replace cloud computing to provide timely and stable services. To enable distributed training while preserving privacy, Federated Learning (FL) has been combined with edge computing. However, as the number of clients grows, so does the volume of transmitted data, and reducing the communication overhead in FL remains a major challenge. Compressing the transmitted parameters is a primary way to reduce this overhead, but existing methods do not consider possible internal relationships between neurons. In this paper, we propose Neuron Pruning-Based FL (NPBFL) for communication-efficient distributed training, a model pruning method that compresses the model parameters transmitted in FL. In contrast to previous methods, we use a dimensionality reduction method to derive an importance factor for each neuron and exploit the correlations between neurons to carry out model pruning. Our analysis results show that NPBFL can reduce communication overhead while maintaining classification accuracy.
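The pruning idea described in the abstract — score each neuron with a dimensionality-reduction-based importance factor, drop the least important ones, and transmit only the surviving rows plus their indices — can be sketched as below. This is a minimal illustration under assumptions, not the paper's algorithm: the PCA-energy score and the names `neuron_importance`, `prune_neurons`, and `keep_ratio` are hypothetical stand-ins for whatever criterion NPBFL actually uses.

```python
import numpy as np

def neuron_importance(weights):
    """Score each output neuron (a row of `weights`) by the energy of its
    projection onto the principal components of the weight matrix. This is
    only an illustrative proxy for an importance factor derived from
    dimensionality reduction; the paper's exact criterion may differ."""
    centered = weights - weights.mean(axis=0, keepdims=True)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    projections = centered @ vt.T          # rows in principal-component space
    return np.sum(projections ** 2, axis=1)

def prune_neurons(weights, keep_ratio=0.5):
    """Keep the `keep_ratio` fraction of most important neurons. Return the
    pruned rows and the kept indices, so the server can place the received
    rows back in their original positions when aggregating."""
    scores = neuron_importance(weights)
    k = max(1, int(keep_ratio * weights.shape[0]))
    kept = np.sort(np.argsort(scores)[::-1][:k])   # top-k, in original order
    return weights[kept], kept

# A client would prune a layer's update before upload, halving the traffic
# for that layer (plus a small index overhead):
rng = np.random.default_rng(0)
W = rng.normal(size=(64, 32))            # 64 neurons, 32 inputs each
W_pruned, idx = prune_neurons(W, 0.5)    # transmit W_pruned and idx only
```

In an FL round, each client would apply this to its local update before upload; the server scatters the received rows back by index and averages only the positions each client actually sent.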
Pages: 63-81
Page count: 19
Related Papers
50 records in total
  • [21] Communication-Efficient Secure Aggregation for Federated Learning
    Ergun, Irem
    Sami, Hasin Us
    Guler, Basak
    2022 IEEE GLOBAL COMMUNICATIONS CONFERENCE (GLOBECOM 2022), 2022, : 3881 - 3886
  • [22] FedBoost: Communication-Efficient Algorithms for Federated Learning
    Hamer, Jenny
    Mohri, Mehryar
    Suresh, Ananda Theertha
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 119, 2020, 119
  • [23] Ternary Compression for Communication-Efficient Federated Learning
    Xu, Jinjin
    Du, Wenli
    Jin, Yaochu
    He, Wangli
    Cheng, Ran
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2022, 33 (03) : 1162 - 1176
  • [24] IoT Device Friendly and Communication-Efficient Federated Learning via Joint Model Pruning and Quantization
    Prakash, Pavana
    Ding, Jiahao
    Chen, Rui
    Qin, Xiaoqi
    Shu, Minglei
    Cui, Qimei
    Guo, Yuanxiong
    Pan, Miao
    IEEE INTERNET OF THINGS JOURNAL, 2022, 9 (15): : 13638 - 13650
  • [25] Communication-Efficient Distributed Learning: An Overview
    Cao, Xuanyu
    Basar, Tamer
    Diggavi, Suhas
    Eldar, Yonina C.
    Letaief, Khaled B.
    Poor, H. Vincent
    Zhang, Junshan
    IEEE JOURNAL ON SELECTED AREAS IN COMMUNICATIONS, 2023, 41 (04) : 851 - 873
  • [26] Communication-efficient federated continual learning for distributed learning system with Non-IID data
    Zhang, Zhao
    Zhang, Yong
    Guo, Da
    Zhao, Shuang
    Zhu, Xiaolin
    SCIENCE CHINA-INFORMATION SCIENCES, 2023, 66 (02)
  • [27] Layer-Based Communication-Efficient Federated Learning with Privacy Preservation
    Lian, Zhuotao
    Wang, Weizheng
    Huang, Huakun
    Su, Chunhua
    IEICE TRANSACTIONS ON INFORMATION AND SYSTEMS, 2022, E105D (02) : 256 - 263
  • [29] Communication-efficient federated learning based on compressed sensing and ternary quantization
    Zheng, Jiali
    Tang, Jing
    APPLIED INTELLIGENCE, 2025, 55 (02)