The Entropy Economy and the Kolmogorov Learning Cycle: Leveraging the intersection of Machine Learning and Algorithmic Information Theory to jointly optimize energy and learning

Cited by: 0
Authors
Evans, Scott C. [1 ]
Shah, Tapan [1 ]
Huang, Hao [1 ]
Ekanayake, Sachini Piyoni [1 ]
Affiliations
[1] GE Vernova Adv Res Ctr, Niskayuna, NY 12309 USA
Keywords
Kolmogorov complexity; Kolmogorov structure function; Machine learning; Entropy Economy; Algorithmic Information Theory;
DOI
10.1016/j.physd.2024.134051
CLC Classification
O29 [Applied Mathematics]
Subject Classification
070104
Abstract
We augment the Kolmogorov Structure Function with energy cost and develop the concept of "Additive AI," in which machine learning models are built by traversing the Kolmogorov Structure Function from low model complexity to high while seeking models that achieve the Kolmogorov Minimum Sufficient Statistic at the least energy cost. In this way, the intersection of Algorithmic Information Theory (AIT) and Machine Learning (ML) can enable optimization of the "Entropy Economy," in which the precious resource of entropy flow is managed to jointly optimize computation, energy, and learning. In this paper we lay out the Kolmogorov Learning Cycle as a framework for this joint optimization, and we demonstrate the energy-efficient machine learning algorithm Least Energy Usage Network (LEAN) as an example of how constraining complexity can reduce the energy cost of learning while maintaining performance. We motivate further directions for how AI models can be optimally learned, and we discuss additional opportunities, at the intersection of AIT and ML, to optimize where and when AI and machine learning models are created so as to maximize learning while minimizing energy (and hence carbon) costs.
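The traversal the abstract describes can be sketched as a two-part-code sweep: score each candidate model by its complexity plus its data-to-model code length, treat the models whose total stops improving as sufficient statistics, and pick the sufficient model with the least energy cost. This is a minimal illustration only, not the paper's LEAN algorithm; the candidate models and their complexity, fit, and energy numbers are hypothetical.

```python
def two_part_length(model):
    # Two-part description length: model complexity (bits) plus the
    # data-to-model code length (bits) for describing the data given the model.
    return model["complexity"] + model["fit_bits"]

def least_energy_sufficient(models, tolerance=1.0):
    # models: hypothetical candidates ordered from low to high complexity.
    # "Sufficient" models are those whose total description length is within
    # `tolerance` bits of the best achievable (a proxy for reaching the
    # Kolmogorov Minimum Sufficient Statistic).
    best_len = min(two_part_length(m) for m in models)
    sufficient = [m for m in models if two_part_length(m) <= best_len + tolerance]
    # Among sufficient models, choose the one with the least energy cost.
    return min(sufficient, key=lambda m: m["energy_joules"])

candidates = [
    {"name": "tiny",   "complexity": 10,  "fit_bits": 120, "energy_joules": 1.0},
    {"name": "small",  "complexity": 30,  "fit_bits": 60,  "energy_joules": 2.5},
    {"name": "medium", "complexity": 60,  "fit_bits": 31,  "energy_joules": 6.0},
    {"name": "large",  "complexity": 120, "fit_bits": 30,  "energy_joules": 20.0},
]

print(least_energy_sufficient(candidates)["name"])
```

With these made-up numbers, "small" and "medium" both reach (within tolerance) the minimal two-part description length, and the sweep returns "small" because it is the cheaper of the two in energy: paying more model complexity than sufficiency requires buys no compression and only costs energy.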
Pages: 11
Related Papers (50 items)
  • [21] Augmenting machine learning of energy landscapes with local structural information
    Honrao, Shreyas J.
    Xie, Stephen R.
    Hennig, Richard G.
    [J]. JOURNAL OF APPLIED PHYSICS, 2020, 128 (08)
  • [22] Machine learning methods for leveraging baseline covariate information to improve the efficiency of clinical trials
    Zhang, Zhiwei
    Ma, Shujie
    [J]. STATISTICS IN MEDICINE, 2019, 38 (10) : 1703 - 1714
  • [23] Probability density and information entropy of machine learning derived intracranial pressure predictions
    Abdul-Rahman, Anmar
    Morgan, William
    Vukmirovic, Aleksandar
    Yu, Dao-Yi
    [J]. PLOS ONE, 2024, 19 (07):
  • [24] Using Machine Learning to Estimate Pedestrian and Bicyclist Count of Intersection by Bluetooth Low Energy
    Gong, Yaobang
    Abdel-Aty, Mohamed
    [J]. JOURNAL OF TRANSPORTATION ENGINEERING PART A-SYSTEMS, 2022, 148 (01)
  • [25] Predictive Modeling of Stock Indexes Using Machine Learning and Information Theory
    Li Xingzhou
    Ren Hong
    Zhong Yujun
    [J]. ICEME 2019: 2019 10TH INTERNATIONAL CONFERENCE ON E-BUSINESS, MANAGEMENT AND ECONOMICS, 2019, : 175 - 179
  • [26] A Novel Paradigm of Melanoma Diagnosis Using Machine Learning and Information Theory
    Giri, Kailash Chandra
    Patel, Mayank
    Sinhal, Amit
    Gautam, Diwakar
    [J]. PROCEEDINGS OF THE 2019 INTERNATIONAL CONFERENCE ON ADVANCES IN COMPUTING & COMMUNICATION ENGINEERING (ICACCE-2019), 2019,
  • [27] Semantic Information G Theory and Logical Bayesian Inference for Machine Learning
    Lu, Chenguang
    [J]. INFORMATION, 2019, 10 (08)
  • [28] Machine learning of carbon vacancy formation energy in high-entropy carbides
    Zhao, Xi
    Yu, Sen
    Zheng, Jiming
    Reece, Michael J.
    Zhang, Rui-Zhi
    [J]. JOURNAL OF THE EUROPEAN CERAMIC SOCIETY, 2023, 43 (04) : 1315 - 1321
  • [29] Energy-entropy competition and the effectiveness of stochastic gradient descent in machine learning
    Zhang, Yao
    Saxe, Andrew M.
    Advani, Madhu S.
    Lee, Alpha A.
    [J]. MOLECULAR PHYSICS, 2018, 116 (21-22) : 3214 - 3223
  • [30] Multi-Instance Learning using Information Entropy Theory for Image Retrieval
    Li Jun-yi
    Li Jian-hua
    Yan Shui-cheng
    [J]. 2014 IEEE 17TH INTERNATIONAL CONFERENCE ON COMPUTATIONAL SCIENCE AND ENGINEERING (CSE), 2014, : 1727 - 1733