Communication Efficient Distributed Learning with Feature Partitioned Data

Cited by: 0
Authors
Zhang, Bingwen [1 ]
Geng, Jun [2 ]
Xu, Weiyu [3 ]
Lai, Lifeng [4 ]
Affiliations
[1] Worcester Polytech Inst, Dept Elect & Comp Engn, Worcester, MA 01609 USA
[2] Harbin Inst Tech, Sch Elect & Info Engn, Harbin, Peoples R China
[3] Univ Iowa, Dept Elect & Comp Engn, Iowa City, IA 52242 USA
[4] Univ Calif Davis, Dept Elect & Comp Engn, Davis, CA 95616 USA
Funding
National Natural Science Foundation of China; U.S. National Science Foundation
Keywords
Distributed learning; Feature partitioned data; Communication efficiency; Inexact update; Regression
DOI: Not available
CLC number: TP [Automation and computer technology]
Discipline code: 0812
Abstract
One major bottleneck in the design of large-scale distributed machine learning algorithms is the communication cost. In this paper, we propose and analyze a distributed learning scheme that reduces the amount of communication in distributed learning problems under the feature-partition scenario. The observation motivating our scheme is that existing schemes for the feature-partition scenario require a large amount of data exchange to calculate gradients. In our proposed scheme, instead of calculating the exact gradient at every iteration, we calculate the exact gradient only sporadically. We provide precise conditions that determine when to perform an exact update, and we characterize the convergence rate along with bounds on the total number of iterations and the number of communication iterations. We further test our algorithm on real data sets and show that the proposed scheme substantially reduces the amount of data transferred between distributed nodes.
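The mechanism the abstract outlines, local descent steps on stale gradient information with only sporadic exact synchronization, can be illustrated with a short sketch. Below is a minimal single-machine simulation assuming a least-squares objective, with each node holding one column block of the design matrix; the drift-based trigger (stale_tol) is an illustrative stand-in for the paper's precise condition, and all identifiers (Worker, train, stale_tol) are our own, not the authors' code.

```python
import numpy as np

class Worker:
    """Holds one column block X_k of the feature-partitioned design matrix."""
    def __init__(self, X_k):
        self.X_k = X_k
        self.w_k = np.zeros(X_k.shape[1])

    def partial_prediction(self):
        # The n-dimensional partial product X_k @ w_k is what nodes must
        # exchange to form an exact gradient; it dominates communication.
        return self.X_k @ self.w_k


def train(X_blocks, y, step=1e-3, stale_tol=0.1, iters=200):
    """Gradient descent on 0.5 * ||X w - y||^2 with sporadic exact updates."""
    workers = [Worker(X_k) for X_k in X_blocks]
    # One exact update: every node contributes its partial prediction.
    residual = sum(w.partial_prediction() for w in workers) - y
    comm_rounds, drift = 1, 0.0
    for _ in range(iters):
        for w in workers:
            # Inexact update: reuse the stale residual, no communication.
            grad_k = w.X_k.T @ residual
            w.w_k -= step * grad_k
            drift += step * np.linalg.norm(grad_k)
        # Illustrative trigger: resynchronize once the accumulated parameter
        # movement is large relative to the residual the nodes last shared.
        if drift > stale_tol * np.linalg.norm(residual):
            residual = sum(w.partial_prediction() for w in workers) - y
            comm_rounds, drift = comm_rounds + 1, 0.0
    return workers, comm_rounds


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 20))
    y = X @ rng.standard_normal(20)
    workers, rounds = train(np.hsplit(X, 4), y)  # 4 nodes, 5 features each
    print("exact communication rounds:", rounds)
```

In this sketch, inexact iterations cost no communication at all; only the sporadic resynchronizations exchange the n-dimensional partial predictions, which is the saving the abstract quantifies.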
Pages: 6