BAFL: Federated Learning with Base Ablation for Cost Effective Communication

Cited by: 1
Authors
Kundalwal, Mayank Kumar [1 ]
Saraswat, Anurag [1 ]
Mishra, Ishan [1 ]
Mishra, Deepak [1 ]
Affiliations
[1] IIT Jodhpur, Jodhpur, Rajasthan, India
Source
2022 26TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR) | 2022
DOI
10.1109/ICPR56361.2022.9956684
CLC (Chinese Library Classification)
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Federated learning is a distributed machine learning setting in which clients train a global model on their local data and share their knowledge with the server in the form of trained model parameters, while preserving the privacy of the data. The server aggregates the clients' knowledge to create a generalized global model. Two major challenges in this process are data heterogeneity and high communication cost. We target the latter and propose a simple approach, BAFL (Federated Learning with Base Ablation), for cost-effective communication in federated learning. In contrast to the common practice of employing model compression techniques to reduce the total communication cost, we propose a fine-tuning approach that leverages the feature extraction ability of layers at different depths of deep neural networks. We use a model pretrained on general-purpose large-scale data as the global model. This provides better weight initialization and reduces the total communication cost required to obtain the generalized model. We achieve further cost reduction by focusing only on the layers responsible for semantic features (data-specific information): clients fine-tune only the top layers on their local data. The base layers are ablated when transferring the model, and clients communicate only the parameters of the remaining layers. This reduces the communication cost per round without compromising accuracy. We evaluate the proposed approach with VGG-16 and ResNet-50 on the WBC, FOOD-101, and CIFAR-10 datasets, and obtain up to two orders of magnitude reduction in total communication cost compared to conventional federated learning. We perform experiments in both IID and non-IID settings and observe consistent improvements.
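The base-ablation protocol described in the abstract can be sketched in a few lines: clients fine-tune only the head layers of a pretrained model and transmit just those parameters, while the frozen base layers never leave the client. The layer names, toy parameter vectors, and the mocked "local training" below are illustrative assumptions, not the paper's actual architecture or training code.

```python
# Sketch of federated averaging with base ablation: only head-layer
# parameters are communicated and aggregated; the pretrained base
# layers stay frozen on every client and are never transmitted.

def fedavg_heads(client_heads):
    """Average the head parameters received from all clients."""
    n = len(client_heads)
    return {
        name: [sum(vals) / n for vals in zip(*(h[name] for h in client_heads))]
        for name in client_heads[0]
    }

# Pretrained global model, split into a frozen base and a trainable head
# (hypothetical layer names and toy parameter vectors).
base = {"conv1": [0.1, 0.2], "conv2": [0.3, 0.4]}  # never transmitted
head = {"fc": [0.0, 0.0]}                          # exchanged each round

# Each client starts from the global head and fine-tunes it locally;
# local training is mocked here as adding a client-specific delta.
client_updates = []
for delta in (1.0, 3.0):
    client_updates.append({name: [w + delta for w in ws] for name, ws in head.items()})

# The server aggregates only the head parameters it received.
new_head = fedavg_heads(client_updates)
print(new_head)  # {'fc': [2.0, 2.0]}

# Per-round communication counts only the head parameters.
sent = sum(len(ws) for ws in new_head.values())
total = sent + sum(len(ws) for ws in base.values())
print(f"sent {sent} of {total} parameters per round")  # sent 2 of 6 parameters per round
```

Because the base is never serialized, the per-round payload shrinks by the ratio of base to total parameters, which is large for backbones such as VGG-16 or ResNet-50 where most weights sit in the early convolutional layers.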
Pages: 1922 - 1928
Page count: 7