Training Neural Networks is ∃R-complete

Cited: 0
Authors
Abrahamsen, Mikkel [1 ]
Kleist, Linda [2 ]
Miltzow, Tillmann [3 ]
Affiliations
[1] Univ Copenhagen, Copenhagen, Denmark
[2] Tech Univ Carolo Wilhelmina Braunschweig, Braunschweig, Germany
[3] Univ Utrecht, Utrecht, Netherlands
Keywords
NODE;
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Given a neural network, training data, and a threshold, finding weights for the neural network such that the total error is below the threshold is known to be NP-hard. We determine the algorithmic complexity of this fundamental problem precisely, by showing that it is ∃R-complete. This means that the problem is equivalent, up to polynomial-time reductions, to deciding whether a system of polynomial equations and inequalities with integer coefficients and real unknowns has a solution. If, as widely expected, ∃R is strictly larger than NP, our work implies that the problem of training neural networks is not even in NP. Neural networks are usually trained using some variation of backpropagation. The result of this paper offers an explanation of why techniques commonly used to solve large instances of NP-complete problems, such as SAT solvers, IP solvers, local search, and dynamic programming, seem not to be of use for this task.
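To make the connection to the existential theory of the reals concrete, here is a minimal sketch (not from the paper; the data, network, and threshold are illustrative assumptions): for a single linear neuron y = w·x + b with squared error, the training question "is there a weight setting with total error at most a threshold?" is literally a polynomial inequality in the real unknowns w and b, i.e., an ∃R-style sentence. The sketch only verifies one candidate witness; deciding whether *any* witness exists is the hard part the paper classifies.

```python
# Toy illustration (hypothetical instance): training a one-neuron linear
# network y = w*x + b reduces to asking whether a polynomial inequality
# in the real unknowns (w, b) has a solution.

data = [(0.0, 1.0), (1.0, 3.0), (2.0, 5.0)]  # points lying on y = 2x + 1

def total_error(w, b):
    # Sum of squared errors: a degree-2 polynomial in w and b with
    # coefficients determined by the training data.
    return sum((w * x + b - y) ** 2 for x, y in data)

threshold = 1e-9
# The ∃R-style sentence is: "there exist reals w, b with
# total_error(w, b) <= threshold". Below we merely check one witness.
w, b = 2.0, 1.0
print(total_error(w, b) <= threshold)
```

Note that checking a given witness is easy; what the ∃R-completeness result says is that deciding existence for general networks is as hard as deciding arbitrary polynomial systems over the reals, which is why NP-style solvers do not directly apply.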
Pages: 14