Communication Efficient Distributed Learning with Feature Partitioned Data

Cited: 0
Authors
Zhang, Bingwen [1 ]
Geng, Jun [2 ]
Xu, Weiyu [3 ]
Lai, Lifeng [4 ]
Affiliations
[1] Worcester Polytech Inst, Dept Elect & Comp Engn, Worcester, MA 01609 USA
[2] Harbin Inst Tech, Sch Elect & Info Engn, Harbin, Peoples R China
[3] Univ Iowa, Dept Elect & Comp Engn, Iowa City, IA 52242 USA
[4] Univ Calif Davis, Dept Elect & Comp Engn, Davis, CA 95616 USA
Funding
National Natural Science Foundation of China; US National Science Foundation;
Keywords
Distributed learning; Feature partitioned data; Communication efficiency; Inexact update; Regression;
DOI
Not available
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812;
Abstract
One major bottleneck in the design of large-scale distributed machine learning algorithms is the communication cost. In this paper, we propose and analyze a distributed learning scheme that reduces the amount of communication in distributed learning problems under the feature-partition scenario. The motivating observation behind our scheme is that, in existing schemes for the feature-partition scenario, a large amount of data must be exchanged to calculate gradients. In our proposed scheme, instead of calculating the exact gradient at every iteration, we calculate the exact gradient only sporadically. We provide precise conditions that determine when to perform the exact update, characterize the convergence rate, and derive bounds on the total number of iterations and on the number of communication iterations. We further test our algorithm on real data sets and show that the proposed scheme can substantially reduce the amount of data transferred between distributed nodes.
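Under a feature (column-wise) partition, each node m holds a block X_m of the data matrix, and the exact gradient depends on the aggregate prediction Xw = sum_m X_m w_m, so every exact gradient evaluation costs a round of data exchange. The sketch below illustrates the general pattern the abstract describes, for a least-squares objective: nodes keep descending against a stale residual and only re-synchronize (the "exact update") when a trigger fires. This is a minimal single-process simulation; the drift-based trigger, the function name sporadic_gd, and all parameter values are illustrative assumptions, since the paper's precise exact-update condition is not stated in the abstract.

```python
# Minimal sketch: feature-partitioned gradient descent with sporadic
# exact updates, simulated in one process. Hypothetical names/condition;
# not the paper's exact algorithm.
import numpy as np

def sporadic_gd(X_parts, y, lr=0.05, drift_tol=0.1, iters=500):
    """X_parts: list of (n, d_m) column blocks, one per 'node'."""
    n = y.shape[0]
    w_parts = [np.zeros(X.shape[1]) for X in X_parts]
    # Exact update: exchange partial predictions to form Xw (communication).
    z = sum(X @ w for X, w in zip(X_parts, w_parts))
    residual = z - y
    comm_rounds, drift = 1, 0.0
    for _ in range(iters):
        for m, X in enumerate(X_parts):
            g = X.T @ residual / n      # gradient w.r.t. node m's block,
            step = lr * g               # using the (possibly stale) residual
            w_parts[m] -= step
            # Each node can bound how stale `residual` has become using only
            # its local block; aggregating this scalar is cheap vs. n-vectors.
            drift += np.linalg.norm(X @ step)
        if drift > drift_tol * np.linalg.norm(residual):
            # Trigger fired: perform the exact update (one communication round).
            z = sum(X @ w for X, w in zip(X_parts, w_parts))
            residual, drift = z - y, 0.0
            comm_rounds += 1
    return w_parts, comm_rounds

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 6))
    y = X @ rng.normal(size=6)
    X_parts = [X[:, :3], X[:, 3:]]          # two "nodes", three features each
    w_parts, rounds = sporadic_gd(X_parts, y)
    print(rounds)  # typically far fewer exchanges than one per iteration
```

The point of the sketch is the ratio it exposes: the inner loop runs every iteration with no data exchange, while the expensive aggregation of partial predictions happens only when the staleness estimate exceeds the tolerance.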
Pages: 6