PAC-learning gains of Turing machines over circuits and neural networks

Cited by: 1
Authors
Pinon, Brieuc [1 ,2 ]
Jungers, Raphael [1 ]
Delvenne, Jean-Charles [1 ]
Affiliations
[1] UCLouvain, ICTEAM, INMA, Louvain-la-Neuve, Belgium
[2] Ave Georges Lemaitre 4-6, L4.05.01, B-1348 Louvain-la-Neuve, Belgium
Funding
European Research Council
Keywords
Kolmogorov complexity; Minimum description length; PAC-learning; Computational complexity; Deep learning; Program induction; FORMAL THEORY; COMPLEXITY; BOUNDS; SIZE;
DOI
10.1016/j.physd.2022.133585
CLC classification
O29 [Applied Mathematics]
Discipline code
070104
Abstract
A caveat to many applications of the current Deep Learning approach is the need for large-scale data. One improvement suggested by Kolmogorov Complexity results is to apply the minimum description length (MDL) principle with computationally universal models. We study the potential gains in sample efficiency that this approach can bring in principle. We use polynomial-time Turing machines to represent computationally universal models and Boolean circuits to represent Artificial Neural Networks (ANNs) acting on finite-precision digits. Our analysis unravels direct links between our question and Computational Complexity results. We provide lower and upper bounds on the potential gains in sample efficiency of the MDL principle applied with Turing machines instead of ANNs. Our bounds depend on the bit-size of the input of the Boolean function to be learned. Furthermore, we highlight close relationships between classical open problems in Circuit Complexity and the tightness of these bounds. (c) 2022 Elsevier B.V. All rights reserved.
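The two-part MDL principle the abstract refers to can be illustrated with a minimal sketch (not from the paper; the bit costs and exception-coding scheme below are illustrative assumptions): each candidate hypothesis is charged its own description length in bits plus the bits needed to encode the samples it mislabels, and the hypothesis with the smallest total code wins.

```python
import math

def description_length(program_bits: int, errors: int, n_samples: int) -> float:
    """Two-part MDL code: bits for the model itself, plus bits to list
    the indices of the samples it mislabels (a crude exceptions code)."""
    data_bits = errors * math.log2(n_samples) if errors else 0.0
    return program_bits + data_bits

def mdl_select(candidates, data):
    """Return the (bits, hypothesis) pair with the smallest two-part code."""
    n = len(data)
    best, best_len = None, float("inf")
    for bits, h in candidates:
        errors = sum(1 for x, y in data if h(x) != y)
        total = description_length(bits, errors, n)
        if total < best_len:
            best, best_len = (bits, h), total
    return best, best_len

# Toy target: parity of 3-bit inputs (all 8 samples).
data = [((a, b, c), a ^ b ^ c) for a in (0, 1) for b in (0, 1) for c in (0, 1)]

# Hypothetical bit costs: a cheap constant predictor vs. a longer exact program.
candidates = [
    (40, lambda x: 0),                    # constant-0: 40 bits, mislabels 4 of 8 samples
    (50, lambda x: x[0] ^ x[1] ^ x[2]),   # exact parity: 50 bits, zero errors
]
(best_bits, _), total = mdl_select(candidates, data)
# constant-0 costs 40 + 4*log2(8) = 52 bits; exact parity costs 50 + 0 = 50 bits,
# so MDL prefers the exact program here.
```

In the paper's setting, the hypothesis class determines which descriptions are short: polynomial-time Turing machines can exploit any computable regularity in the data, whereas circuit-based (ANN-like) representations may need many more bits for the same function, which is the source of the sample-efficiency gap being bounded.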
Pages: 15