Learning Generative Models for Active Inference Using Tensor Networks

Cited by: 1
Authors
Wauthier, Samuel T. [1]
Vanhecke, Bram [2,3]
Verbelen, Tim [1]
Dhoedt, Bart [1]
Affiliations
[1] Ghent Univ IMEC, IDLab, Dept Informat Technol, Technol Pk Zwijnaarde 126, B-9052 Ghent, Belgium
[2] Univ Vienna, Fac Phys, Boltzmanngasse 5, A-1090 Vienna, Austria
[3] Univ Vienna, Fac Math Quantum Opt Quantum Nanophys & Quantum I, Boltzmanngasse 5, A-1090 Vienna, Austria
Source
ACTIVE INFERENCE, IWAI 2022 | 2023, Vol. 1721
Keywords
Active inference; Tensor networks; Generative modeling
DOI
10.1007/978-3-031-28719-0_20
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Active inference provides a general framework for behavior and learning in autonomous agents. It states that an agent will attempt to minimize its variational free energy, defined in terms of beliefs over observations, internal states, and policies. Traditionally, every aspect of a discrete active inference model must be specified by hand, i.e., by manually defining the hidden state space structure as well as the required distributions, such as likelihood and transition probabilities. Recently, efforts have been made to learn state space representations automatically from observations using deep neural networks. In this paper, we present a novel approach to learning state spaces using quantum-physics-inspired tensor networks. The ability of tensor networks to represent the probabilistic nature of quantum states, as well as to reduce large state spaces, makes them a natural candidate for active inference. We show how tensor networks can be used as a generative model for sequential data. Furthermore, we show how one can obtain beliefs from such a generative model and how an active inference agent can use these beliefs to compute the expected free energy. Finally, we demonstrate our method on the classic T-maze environment.
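The abstract's core technical steps (using a tensor network as a generative model of observation sequences, conditioning it on past observations to obtain beliefs over future ones, and feeding those beliefs into the expected free energy) can be illustrated with a small sketch. The code below is not the authors' implementation: it assumes a matrix product state (MPS) used as a Born machine over discrete observation sequences, with made-up dimensions T, d, D and illustrative function names (amplitude, transfer, normalization, belief_next), and uses only NumPy.

import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (hypothetical, for illustration): sequence length T,
# number of discrete observations d, bond dimension D.
T, d, D = 4, 3, 5

# Random MPS cores A[t] of shape (left bond, observation, right bond);
# the boundary bonds have size 1.
cores = [rng.normal(size=(1 if t == 0 else D, d, 1 if t == T - 1 else D))
         for t in range(T)]

def amplitude(seq):
    """psi(o_1, ..., o_T): product of the matrices selected by each observation."""
    m = cores[0][:, seq[0], :]
    for t in range(1, T):
        m = m @ cores[t][:, seq[t], :]
    return m[0, 0]

def transfer(t):
    """Doubled (bra-ket) core with the observation index summed out.

    Used to marginalize over unobserved time steps under the Born rule
    p(o_1, ..., o_T) = psi(o)^2 / Z.
    """
    A = cores[t]
    E = np.einsum('aib,cid->acbd', A, A)
    return E.reshape(A.shape[0] ** 2, A.shape[2] ** 2)

def normalization():
    """Z = sum of psi(o)^2 over all observation sequences."""
    v = np.ones(1)
    for t in range(T):
        v = v @ transfer(t)
    return v.item()

def belief_next(prefix):
    """Belief p(o_{t+1} | o_1, ..., o_t): condition on the observed prefix,
    marginalize over the remaining future steps, then normalize."""
    t = len(prefix)
    # Left part: contract the MPS with the observations seen so far.
    l = np.ones((1, 1))
    for s, o in enumerate(prefix):
        l = l @ cores[s][:, o, :]
    # Right part: sum over all future observations in the doubled space.
    r = np.ones(1)
    for s in range(T - 1, t, -1):
        r = transfer(s) @ r
    Dr = cores[t].shape[2]
    R = r.reshape(Dr, Dr)
    # Born rule for each candidate next observation.
    p = np.array([((l @ cores[t][:, o, :]) @ R @ (l @ cores[t][:, o, :]).T).item()
                  for o in range(d)])
    return p / p.sum()

# Example: belief over the third observation after observing the prefix (0, 2).
print(belief_next([0, 2]))

Given such conditional beliefs over future observations under each candidate policy, an agent could in principle score policies by expected free energy, as the abstract describes; learning the MPS cores themselves (e.g., by maximizing the likelihood of observed sequences) is outside the scope of this sketch.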
Pages: 285-297
Number of pages: 13