ByRDiE: Byzantine-Resilient Distributed Coordinate Descent for Decentralized Learning

Cited by: 70
Authors
Yang, Zhixiong [1 ]
Bajwa, Waheed U. [1 ]
Affiliation
[1] Rutgers Univ New Brunswick, Dept Elect & Comp Engn, 94 Brett Rd, Piscataway, NJ 08854 USA
Funding
U.S. National Science Foundation
Keywords
Training; Optimization; Machine learning; Distributed algorithms; Training data; Information processing; Machine learning algorithms; Byzantine failure; consensus; coordinate descent; decentralized learning; distributed optimization; empirical risk minimization; machine learning; CONSENSUS; OPTIMIZATION; ALGORITHM; ADMM;
DOI
10.1109/TSIPN.2019.2928176
Chinese Library Classification
TM [Electrical Engineering]; TN [Electronics & Communication Technology]
Discipline Classification Code
0808; 0809
Abstract
Distributed machine learning algorithms enable learning of models from datasets that are distributed over a network without gathering the data at a centralized location. While efficient distributed algorithms have been developed under the assumption of faultless networks, failures that can render these algorithms nonfunctional occur frequently in the real world. This paper focuses on the problem of Byzantine failures, which are the hardest to safeguard against in distributed algorithms. While Byzantine fault tolerance has a rich history, existing work does not translate into efficient and practical algorithms for high-dimensional learning in fully distributed (also known as decentralized) settings. In this paper, an algorithm termed Byzantine-resilient distributed coordinate descent is developed and analyzed that enables distributed learning in the presence of Byzantine failures. Theoretical analysis (convex settings) and numerical experiments (convex and nonconvex settings) highlight its usefulness for high-dimensional distributed learning in the presence of Byzantine failures.
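The core idea described in the abstract, coordinate descent made Byzantine-resilient by screening neighbor values before combining them, can be sketched as follows. This is a simplified illustration, not the paper's exact algorithm: the function name `byrdie_coordinate_update`, the parameter names, and the specific trimmed-mean screening rule (discard the `b` largest and `b` smallest neighbor values for each coordinate, where `b` bounds the number of Byzantine neighbors) are assumptions made for this sketch.

```python
import numpy as np

def byrdie_coordinate_update(own_value, neighbor_values, b, grad, step):
    """One screened coordinate-descent step at a single node.

    own_value       -- this node's current value of the coordinate
    neighbor_values -- the same coordinate as reported by neighbors
                       (possibly including Byzantine nodes)
    b               -- assumed upper bound on Byzantine neighbors
    grad            -- partial derivative of the local loss w.r.t.
                       this coordinate
    step            -- step size
    """
    vals = np.sort(np.asarray(neighbor_values, dtype=float))
    if len(vals) <= 2 * b:
        raise ValueError("need more than 2*b neighbor values to screen")
    # Screening: drop the b largest and b smallest reported values,
    # so any single Byzantine value cannot dominate the average.
    trimmed = vals[b:len(vals) - b]
    # Combine the surviving neighbor values with the node's own value,
    # then take a gradient step on the local empirical risk.
    consensus = np.append(trimmed, own_value).mean()
    return consensus - step * grad
```

For example, with neighbor values `[1.0, 2.0, 3.0, 100.0, -100.0]` and `b = 1`, the two extreme values (`100.0` and `-100.0`, which a Byzantine node might inject) are screened out before averaging, so the update stays near the honest values.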
Pages: 611-627 (17 pages)
Related Papers
50 records in total; first 10 shown
  • [1] BYRDIE: A BYZANTINE-RESILIENT DISTRIBUTED LEARNING ALGORITHM
    Yang, Zhixiong
    Bajwa, Waheed U.
    2018 IEEE DATA SCIENCE WORKSHOP (DSW), 2018, : 21 - 25
  • [2] BRIDGE: Byzantine-Resilient Decentralized Gradient Descent
    Fang, Cheng
    Yang, Zhixiong
    Bajwa, Waheed U.
    IEEE TRANSACTIONS ON SIGNAL AND INFORMATION PROCESSING OVER NETWORKS, 2022, 8 : 610 - 626
  • [3] Byzantine-Resilient Decentralized Stochastic Gradient Descent
    Guo, Shangwei
    Zhang, Tianwei
    Yu, Han
    Xie, Xiaofei
    Ma, Lei
    Xiang, Tao
    Liu, Yang
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 2022, 32 (06) : 4096 - 4106
  • [4] BYZANTINE-RESILIENT DECENTRALIZED COLLABORATIVE LEARNING
    Xu, Jian
    Huang, Shao-Lun
    2022 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2022, : 5253 - 5257
  • [5] Byzantine-resilient decentralized network learning
    Yang, Yaohong
    Wang, Lei
    JOURNAL OF THE KOREAN STATISTICAL SOCIETY, 2024, 53 (02) : 349 - 380
  • [6] Data Encoding for Byzantine-Resilient Distributed Gradient Descent
    Data, Deepesh
    Song, Linqi
    Diggavi, Suhas
    2018 56TH ANNUAL ALLERTON CONFERENCE ON COMMUNICATION, CONTROL, AND COMPUTING (ALLERTON), 2018, : 863 - 870
  • [7] Byzantine-Resilient Stochastic Gradient Descent for Distributed Learning: A Lipschitz-Inspired Coordinate-wise Median Approach
    Yang, Haibo
    Zhang, Xin
    Fang, Minghong
    Liu, Jia
    2019 IEEE 58TH CONFERENCE ON DECISION AND CONTROL (CDC), 2019, : 5832 - 5837
  • [8] Byzantine-resilient distributed learning under constraints
    Ding, Dongsheng
    Wei, Xiaohan
    Yu, Hao
    Jovanovic, Mihailo R.
    2021 AMERICAN CONTROL CONFERENCE (ACC), 2021, : 2260 - 2265
  • [9] BYZANTINE-RESILIENT DECENTRALIZED RESOURCE ALLOCATION
    Wang, Runhua
    Liu, Yaohua
    Ling, Qing
    2022 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2022, : 5293 - 5297
  • [10] BYZANTINE-RESILIENT DECENTRALIZED TD LEARNING WITH LINEAR FUNCTION APPROXIMATION
    Wu, Zhaoxian
    Shen, Han
    Chen, Tianyi
    Ling, Qing
    2021 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP 2021), 2021, : 5040 - 5044