Convergence time analysis of Asynchronous Distributed Artificial Neural Networks

Cited: 0
Authors
Tosi, Mauro D. L. [1]
Venugopal, Vinu Ellampallil [1]
Theobald, Martin [1]
Institution
[1] Univ Luxembourg, Esch-sur-Alzette, Luxembourg
DOI
10.1145/3493700.3493758
CLC number
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Artificial Neural Networks (ANNs) have drawn attention from academia and industry for their ability to represent and solve complex problems. Researchers are studying how to distribute their computation in order to reduce training time. However, the most common approaches in this direction are synchronous, leaving computational resources under-utilized. Asynchronous training does not have this drawback but is affected by stale gradient updates, which have not yet been extensively researched. Considering this, we experimentally investigate how stale gradients affect the convergence time and loss value of an ANN. In particular, we analyze an asynchronous distributed implementation of a Word2Vec model, in which the impact of staleness is negligible and can be ignored given the computational speedup achieved by allowing it.
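The stale-gradient effect the abstract describes can be illustrated with a minimal sketch (an assumption for illustration, not the authors' Word2Vec implementation): each SGD update applies a gradient computed from a parameter snapshot that is `staleness` steps old, mimicking a delayed asynchronous worker.

```python
# Minimal sketch of stale-gradient SGD on a toy convex loss.
# `sgd_with_staleness` and its parameters are hypothetical names
# introduced here for illustration only.

def sgd_with_staleness(grad_fn, w0, lr=0.1, steps=100, staleness=0):
    """SGD on a scalar parameter where each gradient is computed from a
    parameter snapshot taken `staleness` iterations earlier."""
    history = [float(w0)]  # history[t] = parameter value after t updates
    for t in range(steps):
        stale_w = history[max(0, t - staleness)]  # delayed snapshot
        history.append(history[-1] - lr * grad_fn(stale_w))
    return history[-1]

# Toy convex loss L(w) = w^2 / 2, whose gradient is simply w.
grad = lambda w: w

w_sync  = sgd_with_staleness(grad, 1.0, staleness=0)  # fresh gradients
w_stale = sgd_with_staleness(grad, 1.0, staleness=5)  # 5-step-stale gradients
# Both runs approach the optimum w* = 0; moderate staleness slows
# convergence (and can induce oscillation) but does not prevent it.
```

On this toy problem the stale run still converges, mirroring the paper's observation that moderate staleness can be negligible relative to the speedup of asynchronous execution.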
Pages: 314-315
Page count: 2