Parallel Implementation of Feedforward Neural Networks on GPUs

Cited by: 0
Authors
Gurgel, Saskya T. A. [1 ]
Formiga, Andrei de A. [1 ]
Affiliations
[1] Univ Fed Paraiba, Ctr Informat, BR-58059900 Joao Pessoa, Paraiba, Brazil
Keywords
neural networks; parallel; GPUs
DOI
10.1109/BRACIS.2013.32
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Neural networks are often seen as a natural model of parallel computation, especially when contrasted with more traditional sequential models like the Turing Machine. The parallelism of neural networks has become more important in recent years through the confluence of two tendencies in the evolution of computer and information technologies: first, parallel computing devices are now ubiquitous, instead of being relegated to a niche market; and second, the amount of data available to analyze and learn from in machine learning applications has increased at a rapid pace. Graphics Processing Units (GPUs) provide great computational power in standard desktop computers, being composed of many simple execution units. In this paper a technique is presented for the parallel implementation of neural networks on GPUs. The technique is explained in relation to the difficulties imposed by the execution model of GPUs. Experimental results indicate that the proposed implementation techniques can easily attain a performance gain of more than one order of magnitude, and are scalable with the processing power of the GPU used.
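Note: the abstract does not reproduce the paper's code, but the standard way to expose the layer-level parallelism it describes is to map each neuron of a layer to one GPU thread, so all weighted sums of that layer are computed concurrently. The following is a minimal CUDA sketch of that general idea only, not the authors' implementation; the kernel name, row-major weight layout, sigmoid activation, and example sizes are illustrative assumptions.

// Minimal sketch: one CUDA thread per output neuron of a feedforward layer.
// Assumptions (not from the paper): row-major weights, sigmoid activation.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void forward_layer(const float *input,   // previous-layer activations, length n_in
                              const float *weights, // row-major n_out x n_in weight matrix
                              const float *bias,    // biases, length n_out
                              float *output,        // this layer's activations, length n_out
                              int n_in, int n_out)
{
    int neuron = blockIdx.x * blockDim.x + threadIdx.x;
    if (neuron >= n_out) return;

    // Each thread accumulates the weighted sum for its own neuron.
    float sum = bias[neuron];
    for (int i = 0; i < n_in; ++i)
        sum += weights[neuron * n_in + i] * input[i];

    output[neuron] = 1.0f / (1.0f + expf(-sum)); // sigmoid activation
}

int main()
{
    const int n_in = 4, n_out = 3;
    float h_in[n_in] = {0.5f, -1.0f, 0.25f, 2.0f};
    float h_w[n_out * n_in] = {  // small arbitrary weights for illustration
        0.1f,  0.2f, -0.3f,  0.4f,
       -0.5f,  0.6f,  0.7f, -0.8f,
        0.9f, -0.1f,  0.2f,  0.3f};
    float h_b[n_out] = {0.0f, 0.1f, -0.1f};
    float h_out[n_out];

    float *d_in, *d_w, *d_b, *d_out;
    cudaMalloc(&d_in,  n_in * sizeof(float));
    cudaMalloc(&d_w,   n_out * n_in * sizeof(float));
    cudaMalloc(&d_b,   n_out * sizeof(float));
    cudaMalloc(&d_out, n_out * sizeof(float));
    cudaMemcpy(d_in, h_in, n_in * sizeof(float), cudaMemcpyHostToDevice);
    cudaMemcpy(d_w,  h_w,  n_out * n_in * sizeof(float), cudaMemcpyHostToDevice);
    cudaMemcpy(d_b,  h_b,  n_out * sizeof(float), cudaMemcpyHostToDevice);

    // Launch with enough threads to cover all output neurons.
    int threads = 128;
    int blocks  = (n_out + threads - 1) / threads;
    forward_layer<<<blocks, threads>>>(d_in, d_w, d_b, d_out, n_in, n_out);
    cudaMemcpy(h_out, d_out, n_out * sizeof(float), cudaMemcpyDeviceToHost);

    for (int j = 0; j < n_out; ++j)
        printf("neuron %d: %f\n", j, h_out[j]);

    cudaFree(d_in); cudaFree(d_w); cudaFree(d_b); cudaFree(d_out);
    return 0;
}

A production implementation would typically batch many input vectors and use shared memory or matrix-product routines, but the one-thread-per-neuron mapping above already illustrates the per-layer parallelism the abstract refers to.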
Pages: 143-149
Number of pages: 7