A Federated Learning Approach to Minimize Communication Rounds Using Noise Rectification

Cited by: 0
Authors
Mishra, Rahul [1 ]
Gupta, Hari Prabhat [2 ]
Affiliations
[1] DA IICT Gandhinagar, Gandhinagar, India
[2] IIT BHU Varanasi, Varanasi, Uttar Pradesh, India
Keywords
Communication rounds; federated learning;
DOI
10.1109/WCNC57260.2024.10570893
CLC number
TP3 [computing technology; computer technology]
Discipline code
0812
Abstract
Federated learning is a distributed training framework that preserves data privacy and reduces communication overhead while training a shared model among multiple participants. Noise in local datasets and in communication channels degrades performance and increases the number of communication rounds required for convergence. This paper proposes a federated learning approach that rectifies noise in both local datasets and communication channels to reduce communication rounds. We employ two filters: one to rectify noise in the dataset and the other to rectify noise in the communication channel. We also consider two types of participants: one applies a filter only to its dataset, while the other applies filters to both its dataset and its communication channel. We first derive an expression to estimate the number of communication rounds needed for convergence. Next, we determine the number of participants of each type. Finally, we perform experiments on existing datasets to verify the effectiveness of the proposed approach and compare it with state-of-the-art techniques.
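The two-filter scheme described in the abstract can be sketched as a federated averaging loop in which every participant filters its local dataset and a subset (the second participant type) additionally filters the update it transmits. This is a minimal illustrative sketch, not the paper's actual method: the filter choices (median/MAD outlier clipping for dataset noise, magnitude clipping for channel noise), the synthetic linear-regression task, the 5/5 split between participant types, and all parameter values are assumptions made here for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def dataset_filter(y, k=3.0):
    # Hypothetical dataset filter: clip label outliers to median +/- k * MAD.
    med = np.median(y)
    mad = np.median(np.abs(y - med)) + 1e-12
    return np.clip(y, med - k * mad, med + k * mad)

def channel_filter(update, clip=1.0):
    # Hypothetical channel filter: clip spikes introduced by a noisy link.
    return np.clip(update, -clip, clip)

def local_sgd(w, X, y, lr=0.1, steps=20):
    # Plain full-batch gradient descent on a local least-squares objective.
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

# Synthetic setup: 10 participants share the same true linear model.
d, n = 3, 200
w_true = np.array([1.0, -2.0, 0.5])
participants = []
for _ in range(10):
    X = rng.normal(size=(n, d))
    y = X @ w_true + 0.5 * rng.normal(size=n)  # dataset noise
    participants.append((X, y))

w_global = np.zeros(d)
for _ in range(30):  # communication rounds
    updates = []
    for i, (X, y) in enumerate(participants):
        y_filtered = dataset_filter(y)       # both types filter the dataset
        w_local = local_sgd(w_global.copy(), X, y_filtered)
        update = w_local - w_global
        update += 0.05 * rng.normal(size=d)  # additive channel noise
        if i < 5:                            # type-II: also filter the channel
            update = channel_filter(update)
        updates.append(update)
    w_global = w_global + np.mean(updates, axis=0)
```

Under these assumptions, the aggregated model converges near `w_true` despite both noise sources; varying the 5/5 split between participant types changes how much residual channel noise reaches the global model per round.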
Pages: 6