Entropy and Mutual Information can Improve Fitness Evaluation in Coevolution of Neural Networks

Cited by: 0
Authors: Hoverstad, Boye Annfelt [1]; Moe, Haaken A. [1]; Shi, Min [1]
Affiliation: [1] Norwegian Univ Sci & Technol NTNU, Dept Comp & Informat Sci, Complex Adapt Organically Inspired Syst Grp, Trondheim, Norway
Keywords: COMPLEXITY
DOI: 10.1109/CEC.2009.4983349
Chinese Library Classification: TP18 [Artificial Intelligence Theory]
Discipline Codes: 081104; 0812; 0835; 1405
Abstract
Accurate fitness estimates are notoriously difficult to attain in cooperative coevolution, as it is often unclear how to reward the individual parts given an evaluation of the evolved system as a whole. This is particularly true for cooperative approaches to neuroevolution, where neurons or neuronal groups are highly interdependent. In this paper we investigate this problem in the context of evolving neural networks for unstable control problems. We use measures from information theory and neuroscience to reward neurons in a neural network based on their degree of participation in the behavior of the network as a whole. In particular, we actively seek networks with high complexity and little redundancy, and argue that this can lead to efficient evolution of robust controllers. Preliminary results support this claim, and indicate that measures from information theory may provide meaningful information about the role of each neuron in a network.
Pages: 3199-3206 (8 pages)
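The abstract describes rewarding individual neurons according to information-theoretic measures of their participation in the behavior of the whole network. As a hedged illustration only (the paper's actual estimators, binning scheme, and signals are not given in this record; the variable names and the histogram-based estimation below are assumptions), Shannon entropy and the mutual information between a neuron's activity and the network output can be estimated from discretized activation traces:

```python
import numpy as np

def entropy(x, bins=8):
    """Shannon entropy (bits) of a 1-D signal after uniform binning."""
    counts = np.histogram(x, bins=bins)[0]
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log2(p))

def mutual_information(x, y, bins=8):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), estimated from a joint histogram."""
    joint = np.histogram2d(x, y, bins=bins)[0]
    p = joint[joint > 0] / joint.sum()
    h_joint = -np.sum(p * np.log2(p))
    return entropy(x, bins) + entropy(y, bins) - h_joint

# Toy example: a neuron strongly coupled to the network output carries far
# more information about it than an independent neuron, so it would earn a
# larger share of the fitness reward under such a scheme.
rng = np.random.default_rng(0)
output = rng.normal(size=1000)
neuron_a = output + 0.1 * rng.normal(size=1000)  # participates in behavior
neuron_b = rng.normal(size=1000)                 # independent of behavior
assert mutual_information(neuron_a, output) > mutual_information(neuron_b, output)
```

Histogram estimators of this kind are biased upward for small samples, which is one reason such measures are usually compared relatively (neuron against neuron) rather than interpreted as absolute bit counts.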