BAFL: Federated Learning with Base Ablation for Cost Effective Communication

Cited by: 1
Authors
Kundalwal, Mayank Kumar [1 ]
Saraswat, Anurag [1 ]
Mishra, Ishan [1 ]
Mishra, Deepak [1 ]
Affiliations
[1] IIT Jodhpur, Jodhpur, Rajasthan, India
Keywords
DOI
10.1109/ICPR56361.2022.9956684
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Federated learning is a distributed machine learning setting in which clients train a global model on their local data and share their knowledge with the server in the form of the trained model, while maintaining the privacy of the data. The server aggregates clients' knowledge to create a generalized global model. Two major challenges faced in this process are data heterogeneity and high communication cost. We target the latter and propose a simple approach, BAFL (Federated Learning for Base Ablation), for cost-effective communication in federated learning. In contrast to the common practice of employing model compression techniques to reduce the total communication cost, we propose a fine-tuning approach that leverages the feature extraction ability of layers at different depths of deep neural networks. We use a model pretrained on general-purpose large-scale data as the global model. This provides better weight initialization and reduces the total communication cost required to obtain the generalized model. We achieve further cost reduction by focusing only on the layers responsible for semantic features (data-specific information). The clients fine-tune only the top layers on their local data. Base layers are ablated while transferring the model, and clients communicate only the parameters corresponding to the remaining layers. This reduces the communication cost per round without compromising accuracy. We evaluate the proposed approach using VGG-16 and ResNet-50 models on datasets including WBC, FOOD-101, and CIFAR-10, and obtain up to two orders of magnitude reduction in total communication cost compared to conventional federated learning. We perform experiments in both IID and Non-IID settings and observe consistent improvements.
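The core mechanism described in the abstract — clients fine-tune only the top layers and the base layers are ablated from the communicated payload — can be sketched as follows. This is a minimal toy illustration, not the authors' implementation; the parameter names, the `ablate_base`/`fedavg` helpers, and the layer-name prefixes are all illustrative assumptions.

```python
def ablate_base(state, top_layers):
    """Keep only the parameters of the fine-tuned top layers;
    base-layer parameters are dropped from the communication payload."""
    return {name: p for name, p in state.items()
            if any(name.startswith(t) for t in top_layers)}

def fedavg(updates):
    """Server-side averaging of the (top-layer-only) client updates."""
    n = len(updates)
    return {k: sum(u[k] for u in updates) / n for k in updates[0]}

# Toy "model" as a flat parameter dict: two base layers, two top layers.
global_state = {"conv1.w": 1.0, "conv2.w": 2.0, "fc1.w": 3.0, "fc2.w": 4.0}
top = ("fc1", "fc2")   # only these are fine-tuned and communicated

client_updates = []
for delta in (0.1, 0.3):                            # two clients, toy local training
    local = {k: v + delta for k, v in global_state.items()}
    client_updates.append(ablate_base(local, top))  # base layers never leave the client

aggregated = fedavg(client_updates)
global_state.update(aggregated)   # server merges top layers; base layers stay frozen
```

Since only the top-layer parameters cross the network each round, the per-round payload shrinks in proportion to the fraction of the model that is ablated, which is the source of the communication savings the abstract reports.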
Pages: 1922-1928
Page count: 7
Related papers
50 results total
  • [1] BAFL: A Blockchain-Based Asynchronous Federated Learning Framework
    Feng, Lei
    Zhao, Yiqi
    Guo, Shaoyong
    Qiu, Xuesong
    Li, Wenjing
    Yu, Peng
    IEEE TRANSACTIONS ON COMPUTERS, 2022, 71 (05) : 1092 - 1103
  • [2] Cost-Effective Federated Learning Design
    Luo, Bing
    Li, Xiang
    Wang, Shiqiang
    Huang, Jianwei
Tassiulas, Leandros
    IEEE CONFERENCE ON COMPUTER COMMUNICATIONS (IEEE INFOCOM 2021), 2021,
  • [3] BAFL: An Efficient Blockchain-Based Asynchronous Federated Learning Framework
    Xu, Chenhao
    Qu, Youyang
    Eklund, Peter W.
    Xiang, Yong
    Gao, Longxiang
    26TH IEEE SYMPOSIUM ON COMPUTERS AND COMMUNICATIONS (IEEE ISCC 2021), 2021,
  • [4] Communication Cost Reduction with Partial Structure in Federated Learning
    Kang, Dongseok
    Ahn, Chang Wook
    ELECTRONICS, 2021, 10 (17)
  • [5] Grey Wolf Optimizer for Reducing Communication Cost of Federated Learning
    Abasi, Ammar Kamal
    Aloqaily, Moayad
    Guizani, Mohsen
    2022 IEEE GLOBAL COMMUNICATIONS CONFERENCE (GLOBECOM 2022), 2022, : 1049 - 1054
  • [6] Cost-Effective Federated Learning in Mobile Edge Networks
    Luo, Bing
    Li, Xiang
    Wang, Shiqiang
    Huang, Jianwei
    Tassiulas, Leandros
    IEEE JOURNAL ON SELECTED AREAS IN COMMUNICATIONS, 2021, 39 (12) : 3606 - 3621
  • [7] Integrated Sensing, Communication, and Computing for Cost-effective Multimodal Federated Perception
    Chen, Ning
    Cheng, Zhipeng
    Fan, Xu wei
    Liu, Zhang
    Huang, Bangzhen
    Zhao, Yifeng
    Huang, Lianfen
    Du, Xiaojiang
    Guizani, Mohsen
    ACM TRANSACTIONS ON MULTIMEDIA COMPUTING COMMUNICATIONS AND APPLICATIONS, 2024, 20 (08)
  • [8] SmartIdx: Reducing Communication Cost in Federated Learning by Exploiting the CNNs Structures
    Wu, Donglei
    Zou, Xiangyu
    Zhang, Shuyu
    Jin, Haoyu
    Xia, Wen
    Fang, Binxing
THIRTY-SIXTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FOURTH CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE / THE TWELFTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2022, : 4254 - 4262
  • [9] ProgFed: Effective, Communication, and Computation Efficient Federated Learning by Progressive Training
    Wang, Hui-Po
    Stich, Sebastian U.
    He, Yang
    Fritz, Mario
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 162, 2022,
  • [10] Sparse Communication for Federated Learning
    Thonglek, Kundjanasith
    Takahashi, Keichi
    Ichikawa, Kohei
    Nakasan, Chawanat
    Leelaprute, Pattara
    Iida, Hajimu
    6TH IEEE INTERNATIONAL CONFERENCE ON FOG AND EDGE COMPUTING (ICFEC 2022), 2022, : 1 - 8