The Entropy Economy and the Kolmogorov Learning Cycle: Leveraging the intersection of Machine Learning and Algorithmic Information Theory to jointly optimize energy and learning

Cited by: 0
Authors
Evans, Scott C. [1 ]
Shah, Tapan [1 ]
Huang, Hao [1 ]
Ekanayake, Sachini Piyoni [1 ]
Affiliations
[1] GE Vernova Adv Res Ctr, Niskayuna, NY 12309 USA
Keywords
Kolmogorov complexity; Kolmogorov structure function; Machine learning; Entropy Economy; Algorithmic Information Theory
DOI
10.1016/j.physd.2024.134051
Chinese Library Classification
O29 [Applied Mathematics]
Discipline Code
070104
Abstract
We augment the Kolmogorov Structure Function with energy cost and derive the concept of "Additive AI," in which machine learning models are created by traversing the Kolmogorov Structure Function from low model complexity to high while seeking models that achieve the Kolmogorov Minimum Sufficient Statistic at the least energy cost. In this way, the intersection of Algorithmic Information Theory (AIT) with Machine Learning (ML) can enable optimization of the "Entropy Economy," in which the precious resource of entropy flow is managed to jointly optimize computation, energy, and learning. In this paper we lay out the Kolmogorov Learning Cycle as a framework for this joint optimization and demonstrate the energy-efficient machine learning algorithm Least Energy Usage Network (LEAN) as an example of how restraining complexity can reduce the energy cost of learning while maintaining performance. We motivate further directions for how AI models can be optimally learned, and discuss additional opportunities to optimize where and when AI and machine learning models are created so as to maximize learning while minimizing energy (and hence carbon) costs through the intersection of AIT and ML.
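The selection rule sketched in the abstract can be illustrated with a toy example. In the sketch below, each candidate model is a made-up tuple of model complexity (bits), residual description length of the data given the model (bits), and energy cost; none of these numbers or names come from the paper. The code scans candidates in order of increasing complexity, treats the minimum two-part code length as a proxy for reaching the Kolmogorov Minimum Sufficient Statistic, and among models achieving it prefers the one with the lowest energy cost.

```python
# Hypothetical candidates along the structure-function traversal:
# (model_bits, data_given_model_bits, energy_cost_joules).
# Totals: 510, 250, 200, 200, 260 -> the two-part code length bottoms
# out at 200 bits, then rises again as models overfit.
candidates = [
    (10, 500, 1.0),
    (50, 200, 2.5),
    (120, 80, 4.0),   # reaches the minimum total of 200 bits
    (130, 70, 3.5),   # same 200-bit total, but cheaper in energy
    (200, 60, 9.0),   # past the sufficient point: 260 bits
]

def least_energy_sufficient(models):
    """Among models achieving the minimum two-part code length
    (a proxy for the Minimum Sufficient Statistic), return the
    one with the lowest energy cost."""
    best_total = min(m + d for m, d, _ in models)
    sufficient = [c for c in models if c[0] + c[1] == best_total]
    return min(sufficient, key=lambda c: c[2])

print(least_energy_sufficient(candidates))  # -> (130, 70, 3.5)
```

The point of the toy is that sufficiency alone does not pick a unique model: both 200-bit candidates compress the data equally well, and the energy term breaks the tie, which is the joint optimization the abstract describes.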
Pages: 11
Related Papers
50 records in total
  • [1] Bridging Algorithmic Information Theory and Machine Learning: A new approach to kernel learning
    Hamzi, Boumediene
    Hutter, Marcus
    Owhadi, Houman
    PHYSICA D: NONLINEAR PHENOMENA, 2024, 464
  • [2] Leveraging Theory for Enhanced Machine Learning
    Audus, Debra J.
    McDannald, Austin
    DeCost, Brian
    ACS MACRO LETTERS, 2022: 1117-1122
  • [4] Energy use prediction with information theory and machine learning technique
    Tong, Y. W.
    Yang, W. Y.
    Zhan, D. L.
    2019 3RD INTERNATIONAL CONFERENCE ON ENERGY AND ENVIRONMENTAL SCIENCE, 2019, 291
  • [5] Information geometry and information theory in machine learning
    Ikeda, Kazushi
    Iwata, Kazunori
    NEURAL INFORMATION PROCESSING, PART II, 2008, 4985: 295+
  • [6] Machine Learning and Graph Theory to Optimize Drinking Water
    Amali, Said
    EL Faddouli, Nour-eddine
    Boutoulout, Ali
    PROCEEDINGS OF THE FIRST INTERNATIONAL CONFERENCE ON INTELLIGENT COMPUTING IN DATA SCIENCES (ICDS2017), 2018, 127: 310-319
  • [7] Machine learning and social theory: Collective machine behaviour in algorithmic trading
    Borch, Christian
    EUROPEAN JOURNAL OF SOCIAL THEORY, 2022, 25 (04): 503-520
  • [8] An information theoretic approach to reducing algorithmic bias for machine learning
    Kim, Jin-Young
    Cho, Sung-Bae
    NEUROCOMPUTING, 2022, 500: 26-38
  • [9] Information Theory and Its Relation to Machine Learning
    Hu, Bao-Gang
    PROCEEDINGS OF THE 2015 CHINESE INTELLIGENT AUTOMATION CONFERENCE: INTELLIGENT INFORMATION PROCESSING, 2015, 336: 1-11
  • [10] Learning Entropy: On Shannon vs. Machine-Learning-Based Information in Time Series
    Bukovsky, Ivo
    Budik, Ondrej
    DATABASE AND EXPERT SYSTEMS APPLICATIONS, DEXA 2022 WORKSHOPS, 2022, 1633: 402-415