Distributed Learning of Fully Connected Neural Networks using Independent Subnet Training

Cited by: 11
Authors:
Yuan, Binhang [1 ]
Wolfe, Cameron R. [1 ]
Dun, Chen [1 ]
Tang, Yuxin [1 ]
Kyrillidis, Anastasios [1 ]
Jermaine, Chris [1 ]
Affiliations:
[1] Rice Univ, Houston, TX 77251 USA
Source:
PROCEEDINGS OF THE VLDB ENDOWMENT | 2022, Vol. 15, No. 8
Keywords:
ALGORITHMS
DOI: 10.14778/3529337.3529343
Chinese Library Classification (CLC): TP [Automation Technology, Computer Technology]
Discipline code: 0812
Abstract:
Distributed machine learning (ML) can bring more computational resources to bear than single-machine learning, thus enabling reductions in training time. Distributed learning partitions models and data over many machines, allowing model and dataset sizes beyond the available compute power and memory of a single machine. In practice, though, distributed ML is challenging when distribution is mandatory rather than chosen by the practitioner. In such scenarios, data may be unavoidably separated among workers due to limited per-worker memory capacity or data privacy concerns. There, existing distributed methods either fail outright, because transfer costs across workers dominate, or do not apply at all. We propose a new approach to distributed fully connected neural network learning, called independent subnet training (IST), to handle these cases. In IST, the original network is decomposed into a set of narrow subnetworks of the same depth. These subnetworks are trained locally before parameters are exchanged to produce new subnets, and the training cycle repeats. Such a naturally "model parallel" approach limits memory usage, since each device stores only a portion of the network parameters. Additionally, workers need not share data (subnet training is local and independent), and communication volume and frequency are reduced by decomposing the original network into independent subnets. These properties let IST cope with distributed data, slow interconnects, or limited device memory, making it a suitable approach for cases of mandatory distribution. We show experimentally that IST achieves much lower training times than common distributed learning approaches.
Pages: 1581-1590
Number of pages: 10
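
Illustrative sketch:
The abstract describes IST's training cycle: partition the hidden neurons of the fully connected network into disjoint groups, let each worker train the resulting narrow subnet locally, then exchange parameters, repartition, and repeat. Below is a minimal sketch of that cycle for a one-hidden-layer MLP, using NumPy and simulated in-process workers; the layer sizes, learning rate, number of local steps, and synthetic data shards are illustrative assumptions, not settings from the paper.

import numpy as np

# Minimal sketch of an IST-style training loop for a one-hidden-layer MLP.
# Workers are simulated in-process; all sizes, the learning rate, the number
# of local steps, and the synthetic data are illustrative assumptions.

rng = np.random.default_rng(0)
d_in, d_hidden, d_out, n_workers = 8, 16, 1, 4

# Parameters of the full ("global") network.
W1 = rng.normal(0.0, 0.1, (d_in, d_hidden))
b1 = np.zeros(d_hidden)
W2 = rng.normal(0.0, 0.1, (d_hidden, d_out))

def local_sgd(W1_s, b1_s, W2_s, X, y, lr=0.01, steps=20):
    # Plain SGD on one narrow subnet (ReLU hidden layer, MSE loss).
    for _ in range(steps):
        h = np.maximum(X @ W1_s + b1_s, 0.0)      # forward pass
        err = h @ W2_s - y                        # dLoss/dPrediction for MSE
        gW2 = h.T @ err / len(X)
        dh = (err @ W2_s.T) * (h > 0)             # backprop through ReLU
        gW1 = X.T @ dh / len(X)
        gb1 = dh.mean(axis=0)
        W1_s -= lr * gW1; b1_s -= lr * gb1; W2_s -= lr * gW2
    return W1_s, b1_s, W2_s

# One local data shard per worker; under mandatory distribution each worker
# only ever sees its own shard.
X_shards = [rng.normal(size=(64, d_in)) for _ in range(n_workers)]
y_shards = [X.sum(axis=1, keepdims=True) for X in X_shards]

for sync_round in range(10):
    # Randomly partition the hidden neurons into disjoint groups, one group
    # (i.e., one narrow subnet of the same depth) per worker.
    groups = np.array_split(rng.permutation(d_hidden), n_workers)
    for w, idx in enumerate(groups):
        # Each worker trains only the weights touching its hidden neurons.
        W1_s, b1_s, W2_s = local_sgd(W1[:, idx].copy(), b1[idx].copy(),
                                     W2[idx, :].copy(),
                                     X_shards[w], y_shards[w])
        # "Exchange": write the updated subnet back into the full model
        # before the next repartition.
        W1[:, idx], b1[idx], W2[idx, :] = W1_s, b1_s, W2_s

# Sanity check of the reassembled network on one shard.
h = np.maximum(X_shards[0] @ W1 + b1, 0.0)
print("MSE on shard 0:", float(np.mean((h @ W2 - y_shards[0]) ** 2)))

Because the hidden-neuron groups are disjoint, each device holds and updates only its slices of W1, b1, and W2, which bounds per-device memory, and nothing crosses workers between synchronization rounds, which is what reduces communication volume and frequency.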