PAC-learning gains of Turing machines over circuits and neural networks

Cited: 1
Authors
Pinon, Brieuc [1 ,2 ]
Jungers, Raphael [1 ]
Delvenne, Jean-Charles [1 ]
Affiliations
[1] UCLouvain, ICTEAM, INMA, Louvain la Neuve, Belgium
[2] Ave Georges Lemaitre 4 6 L4 05 01, B-1348 Louvain la Neuve, Belgium
Funding
European Research Council
Keywords
Kolmogorov complexity; Minimum description length; PAC-learning; Computational complexity; Deep learning; Program induction; FORMAL THEORY; COMPLEXITY; BOUNDS; SIZE;
DOI
10.1016/j.physd.2022.133585
CLC number
O29 [Applied Mathematics]
Subject classification code
070104
Abstract
A caveat to many applications of the current Deep Learning approach is the need for large-scale data. One improvement suggested by Kolmogorov Complexity results is to apply the minimum description length (MDL) principle with computationally universal models. We study the potential gains in sample efficiency that this approach can bring in principle. We use polynomial-time Turing machines to represent computationally universal models and Boolean circuits to represent Artificial Neural Networks (ANNs) acting on finite-precision digits. Our analysis unravels direct links between our question and Computational Complexity results. We provide lower and upper bounds on the potential gains in sample efficiency of MDL applied with Turing machines instead of ANNs. Our bounds depend on the bit-size of the input of the Boolean function to be learned. Furthermore, we highlight close relationships between classical open problems in Circuit Complexity and the tightness of these bounds. (c) 2022 Elsevier B.V. All rights reserved.
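The two-part MDL principle referenced in the abstract selects the hypothesis minimizing L(h) + L(D | h): the bits needed to describe the hypothesis plus the bits needed to describe the data given the hypothesis. The sketch below is a minimal toy illustration of that selection rule, not the paper's construction; the hypothesis description lengths, the error-list code (one index of `input_bits` bits per misclassified example), and the toy dataset are all assumptions chosen for illustration.

```python
# Toy two-part MDL model selection: score each hypothesis by
# L(h) + L(D | h) and pick the minimizer.

def mdl_score(hypothesis_bits, predictions, labels, input_bits):
    # L(h): assumed description length of the hypothesis, in bits.
    model_cost = hypothesis_bits
    # L(D | h): a simple error-list code -- encode the index of each
    # misclassified example, at input_bits bits per error.
    errors = sum(p != y for p, y in zip(predictions, labels))
    data_cost = errors * input_bits
    return model_cost + data_cost

# Toy data: XOR on 2-bit inputs, each input observed 4 times.
X = [(a, b) for _ in range(4) for a in (0, 1) for b in (0, 1)]
y = [a ^ b for a, b in X]

# Two toy hypotheses with assumed description lengths (in bits):
# a trivial constant predictor and an exact "xor" program.
candidates = [
    ("constant-0", 1, [0] * len(X)),
    ("xor", 8, [a ^ b for a, b in X]),
]

best = min(candidates, key=lambda c: mdl_score(c[1], c[2], y, input_bits=2))
print(best[0])  # prints "xor"
```

With few samples the cheap constant hypothesis can win, but as more data accumulates its per-error cost grows while the exact program's cost stays fixed, so MDL switches to the correct hypothesis; this trade-off is the informal intuition behind the sample-efficiency question the paper studies.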
Pages: 15