Time complexity analysis of neural networks on message passing multicomputer systems

Cited by: 0
Authors
Tan, RS [1 ]
Narasimhan, VL [1 ]
Affiliation
[1] Def Sci & Technol Org, Div Informat Technol, Informat Management Grp, Salisbury, SA 5108, Australia
Keywords
time complexity analysis; multicomputer systems; neural networks; message passing
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
One of the main disadvantages of Artificial Neural Networks (ANNs) is the long training time they require, which is due to the large number of interconnections between neurons needed for many real-world problems. ANNs are amenable to parallel implementation because of their inherent parallelism. In this paper, we propose time-cost complexity models for the training-parallelism and algorithmic-parallelism approaches to implementing backpropagation on a multilayer perceptron ANN architecture. The models are compared with empirical results obtained from a network of T-800 transputers and are found to be accurate enough to predict the performance of the algorithms for different problem sizes and for different message passing multicomputer systems. From these models we can then theoretically determine the size of the multicomputer required for a given problem size so as to obtain the best performance.
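The record does not reproduce the paper's actual time-cost models, so the sketch below is only an illustrative, hypothetical model of the training-parallelism (pattern-partitioning) approach the abstract describes: per-epoch time on a P-node message-passing machine is split into a computation term that shrinks with P and a communication term that grows with P, and the predicted optimum machine size is the P that minimises their sum. All names and parameter values (epoch_time, best_machine_size, t_flop, t_startup, t_word) are assumptions for illustration, not figures or formulas from the paper.

```python
# Hypothetical sketch (not the paper's model): a generic time-cost estimate
# for data-parallel backpropagation training on a P-node message-passing machine.

def epoch_time(P, n_patterns, n_weights,
               t_flop=1e-7,      # assumed cost per weight-level operation
               t_startup=1e-4,   # assumed per-message startup latency
               t_word=1e-6):     # assumed per-word transfer time
    """Estimated wall-clock time for one training epoch on P processors."""
    # Computation: training patterns are partitioned across the P processors;
    # each pattern costs a constant number of operations per weight.
    comp = (n_patterns / P) * n_weights * t_flop
    # Communication: weight updates are exchanged once per epoch, modelled here
    # as (P - 1) messages each carrying roughly n_weights / P words.
    comm = (P - 1) * (t_startup + (n_weights / P) * t_word)
    return comp + comm

def best_machine_size(n_patterns, n_weights, max_P=256):
    """Processor count that minimises the predicted epoch time for a problem size."""
    return min(range(1, max_P + 1),
               key=lambda P: epoch_time(P, n_patterns, n_weights))

if __name__ == "__main__":
    # Example problem size: 10,000 training patterns, 50,000 weights.
    P_opt = best_machine_size(10_000, 50_000)
    print(P_opt, epoch_time(P_opt, 10_000, 50_000))
```

The design choice mirrors the abstract's claim: because the computation term decreases with P while the communication term increases, a model of this form can be minimised analytically or numerically to predict the multicomputer size that gives the best performance for a given problem size.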
Pages: 137-144
Page count: 8