Mutual information, neural networks and the renormalization group

Cited by: 1
Authors
Maciej Koch-Janusz
Zohar Ringel
Affiliations
[1] ETH Zurich, Institute for Theoretical Physics
[2] Hebrew University of Jerusalem, Racah Institute of Physics
Source
Nature Physics | 2018 / Volume 14
Abstract
Physical systems differing in their microscopic details often display strikingly similar behaviour when probed at macroscopic scales. Those universal properties, largely determining their physical characteristics, are revealed by the powerful renormalization group (RG) procedure, which systematically retains ‘slow’ degrees of freedom and integrates out the rest. However, the important degrees of freedom may be difficult to identify. Here we demonstrate a machine-learning algorithm capable of identifying the relevant degrees of freedom and executing RG steps iteratively without any prior knowledge about the system. We introduce an artificial neural network based on a model-independent, information-theoretic characterization of a real-space RG procedure, which performs this task. We apply the algorithm to classical statistical physics problems in one and two dimensions. We demonstrate RG flow and extract the Ising critical exponent. Our results demonstrate that machine-learning techniques can extract abstract physical concepts and consequently become an integral part of theory- and model-building.
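The abstract's central idea — identify the coarse-grained degrees of freedom that retain the most information about the rest of the system — can be illustrated with a toy sketch. The snippet below is *not* the paper's algorithm (which trains a neural network to maximize this mutual information); it merely scores two hand-picked coarse-graining rules for a 1D Ising chain by a plug-in mutual-information estimate between the coarse variable and nearby "environment" spins. All names and sizes here (`sample_ising_chain`, the block and environment widths) are illustrative assumptions, not taken from the paper.

```python
import math
import random
from collections import Counter

def sample_ising_chain(n, beta, sweeps, rng):
    """Metropolis sampling of a periodic 1D Ising chain at inverse temperature beta."""
    s = [rng.choice((-1, 1)) for _ in range(n)]
    for _ in range(sweeps * n):
        i = rng.randrange(n)
        d_e = 2 * s[i] * (s[(i - 1) % n] + s[(i + 1) % n])  # energy cost of flipping spin i
        if d_e <= 0 or rng.random() < math.exp(-beta * d_e):
            s[i] = -s[i]
    return s

def mutual_information(pairs):
    """Empirical (plug-in) mutual information, in nats, of a list of (h, e) samples."""
    n = len(pairs)
    p_he = Counter(pairs)
    p_h = Counter(h for h, _ in pairs)
    p_e = Counter(e for _, e in pairs)
    return sum((c / n) * math.log(c * n / (p_h[h] * p_e[e]))
               for (h, e), c in p_he.items())

rng = random.Random(0)
n_spins, block = 32, 4
samples = [sample_ising_chain(n_spins, beta=1.0, sweeps=10, rng=rng)
           for _ in range(500)]

def score(rule):
    """MI between rule(visible block) and two environment spins just past the block."""
    pairs = [(rule(s[:block]), tuple(s[block:block + 2])) for s in samples]
    return mutual_information(pairs)

majority = lambda v: 1 if sum(v) >= 0 else -1  # keeps the block's slow magnetization mode
first_spin = lambda v: v[0]                    # keeps a single raw spin

print("MI(majority):", score(majority))
print("MI(first spin):", score(first_spin))
```

One would expect the majority rule to retain more information about the environment, since it tracks the block magnetization — the slow mode that survives coarse-graining. The point of the paper is that its network discovers such rules automatically, without them being hand-picked as above.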
Pages: 578–582
Page count: 4
Related papers
50 records in total
  • [1] Mutual information, neural networks and the renormalization group
    Koch-Janusz, Maciej
    Ringel, Zohar
    [J]. NATURE PHYSICS, 2018, 14 (06) : 578 - 582
  • [2] Towards quantifying information flows: Relative entropy in deep neural networks and the renormalization group
    Erdmenger, Johanna
    Grosvenor, Kevin T.
    Jefferson, Ro
    [J]. SCIPOST PHYSICS, 2022, 12 (01):
  • [3] Mutual Information Maximization in Graph Neural Networks
    Di, Xinhan
    Yu, Pengqian
    Bu, Rui
    Sun, Mingchao
    [J]. 2020 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2020,
  • [4] A multivariate extension of mutual information for growing neural networks
    Ball, Kenneth R.
    Grant, Christopher
    Mundy, William R.
    Shafer, Timothy J.
    [J]. NEURAL NETWORKS, 2017, 95 : 29 - 43
  • [5] Mutual Information-based RBM Neural Networks
    Peng, Kang-Hao
    Zhang, Heng
    [J]. 2016 23RD INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2016, : 2458 - 2463
  • [6] Entropy and mutual information in models of deep neural networks
    Gabrie, Marylou
    Manoel, Andre
    Luneau, Clement
    Barbier, Jean
    Macris, Nicolas
    Krzakala, Florent
    Zdeborova, Lenka
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31
  • [7] Entropy and mutual information in models of deep neural networks
    Gabrie, Marylou
    Manoel, Andre
    Luneau, Clement
    Barbier, Jean
    Macris, Nicolas
    Krzakala, Florent
    Zdeborova, Lenka
    [J]. JOURNAL OF STATISTICAL MECHANICS-THEORY AND EXPERIMENT, 2019, 2019 (12):
  • [8] Lyapunov exponents and mutual information of chaotic neural networks
    Mizutani, S
    Sano, T
    Uchiyama, T
    Sonehara, N
    [J]. NEURAL NETWORKS FOR SIGNAL PROCESSING VI, 1996, : 200 - 209
  • [9] Improved Neural Networks Based on Mutual Information via Information Geometry
    Wang, Meng
    Xiao, Chuang-Bai
    Ning, Zhen-Hu
    Yu, Jing
    Zhang, Ya-Hao
    Pang, Jin
    [J]. ALGORITHMS, 2019, 12 (05)
  • [10] GroupIM: A Mutual Information Maximization Framework for Neural Group Recommendation
    Sankar, Aravind
    Wu, Yanhong
    Wu, Yuhang
    Zhang, Wei
    Yang, Hao
    Sundaram, Hari
    [J]. PROCEEDINGS OF THE 43RD INTERNATIONAL ACM SIGIR CONFERENCE ON RESEARCH AND DEVELOPMENT IN INFORMATION RETRIEVAL (SIGIR '20), 2020, : 1279 - 1288