Hybrid self-attention NEAT: a novel evolutionary self-attention approach to improve the NEAT algorithm in high dimensional inputs

Cited: 2
Authors
Khamesian, Saman [1]
Malek, Hamed [1]
Affiliations
[1] Shahid Beheshti University (SBU), Tehran, Iran
Keywords
Neuroevolution; Neural architecture search; NEAT; Atari games; NEURAL-NETWORKS; NEUROEVOLUTION;
DOI
10.1007/s12530-023-09510-3
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
This article presents "Hybrid Self-Attention NEAT", a method that improves the original NeuroEvolution of Augmenting Topologies (NEAT) algorithm on high-dimensional inputs. Although NEAT has achieved significant results on a range of challenging tasks, it fails to produce well-tuned networks when the input representation is high-dimensional. To overcome this limitation, we use self-attention as an indirect encoding that selects the most important parts of the input, and we tune the hyper-parameters of the self-attention module with the CMA-ES evolutionary algorithm. We also introduce a method called Seesaw that evolves the NEAT and CMA-ES populations simultaneously. In addition to NEAT's evolutionary operators for updating weights, we apply a combination method to obtain better-fitting weights. We evaluated the model on a variety of Atari games. The results show that, compared with state-of-the-art evolutionary algorithms, Hybrid Self-Attention NEAT removes the restriction of the original NEAT and achieves comparable scores from raw pixel input while using far fewer parameters (roughly 300 times fewer than HyperNEAT).
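The abstract describes the pipeline only at a high level, so the sketch below illustrates one plausible reading of the self-attention front end: the frame is split into patches, a single-head attention score ranks them, and only the coordinates of the top-K patches are passed on as a low-dimensional input to the NEAT-evolved controller. This is a minimal illustration and not the authors' code; the patch size, stride, K, and projection width d are assumed values, and extract_patches / attention_select are hypothetical names. In the paper, the attention parameters (Wq and Wk here) are the quantities tuned by CMA-ES.

import numpy as np

def extract_patches(frame, patch=8, stride=4):
    # Slice a grayscale frame (H, W) into flattened patches and record each patch centre.
    h, w = frame.shape
    patches, centers = [], []
    for y in range(0, h - patch + 1, stride):
        for x in range(0, w - patch + 1, stride):
            patches.append(frame[y:y + patch, x:x + patch].ravel())
            centers.append((y + patch // 2, x + patch // 2))
    return np.asarray(patches, dtype=np.float32), np.asarray(centers, dtype=np.float32)

def attention_select(patches, centers, Wq, Wk, top_k=10):
    # Single-head self-attention over patches; the column sums of the attention
    # matrix measure how much attention each patch receives from all the others.
    q = patches @ Wq                                   # queries, shape (N, d)
    k = patches @ Wk                                   # keys,    shape (N, d)
    scores = q @ k.T / np.sqrt(Wq.shape[1])            # scaled dot-product scores
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)      # row-wise softmax
    importance = weights.sum(axis=0)                   # attention received per patch
    idx = np.argsort(importance)[-top_k:]
    return centers[idx].ravel()                        # 2*top_k values fed to NEAT

# Hypothetical usage with a preprocessed 84x84 frame (stand-in for an Atari observation):
rng = np.random.default_rng(0)
frame = rng.random((84, 84)).astype(np.float32)
patches, centers = extract_patches(frame)
d = 16
Wq = rng.normal(size=(patches.shape[1], d)).astype(np.float32)
Wk = rng.normal(size=(patches.shape[1], d)).astype(np.float32)
observation = attention_select(patches, centers, Wq, Wk, top_k=10)

Under this reading, Wq and Wk would be flattened into a single parameter vector and optimized with CMA-ES (for example via the cma package), while NEAT evolves the topology and weights of the controller that consumes the selected patch coordinates; the Seesaw scheme mentioned in the abstract would alternate evolution between the two populations.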
Pages: 489-503
Page count: 15
Related papers
50 entries in total
  • [21] Self-Attention Generative Adversarial Networks
    Zhang, Han
    Goodfellow, Ian
    Metaxas, Dimitris
    Odena, Augustus
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 97, 2019, 97
  • [22] Research of Self-Attention in Image Segmentation
    Cao, Fude
    Zheng, Chunguang
    Huang, Limin
    Wang, Aihua
    Zhang, Jiong
    Zhou, Feng
    Ju, Haoxue
    Guo, Haitao
    Du, Yuxia
    JOURNAL OF INFORMATION TECHNOLOGY RESEARCH, 2022, 15 (01)
  • [23] Rethinking the Self-Attention in Vision Transformers
    Kim, Kyungmin
    Wu, Bichen
    Dai, Xiaoliang
    Zhang, Peizhao
    Yan, Zhicheng
    Vajda, Peter
    Kim, Seon
    2021 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION WORKSHOPS, CVPRW 2021, 2021, : 3065 - 3069
  • [24] Relative molecule self-attention transformer
    Maziarka, Łukasz
    Majchrowski, Dawid
    Danel, Tomasz
    Gaiński, Piotr
    Tabor, Jacek
    Podolak, Igor
    Morkisz, Paweł
    Jastrzębski, Stanisław
    Journal of Cheminformatics, 16
  • [25] Self-Attention ConvLSTM for Spatiotemporal Prediction
    Lin, Zhihui
    Li, Maomao
    Zheng, Zhuobin
    Cheng, Yangyang
    Yuan, Chun
    THIRTY-FOURTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THE THIRTY-SECOND INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE AND THE TENTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2020, 34 : 11531 - 11538
  • [26] Self-attention Hypergraph Pooling Network
    Zhao Y.-F.
    Jin F.-S.
    Li R.-H.
    Qin H.-C.
    Cui P.
    Wang G.-R.
    Ruan Jian Xue Bao/Journal of Software, 2023, 34 (10):
  • [27] Pyramid Self-attention for Semantic Segmentation
    Qi, Jiyang
    Wang, Xinggang
    Hu, Yao
    Tang, Xu
    Liu, Wenyu
    PATTERN RECOGNITION AND COMPUTER VISION, PT I, 2021, 13019 : 480 - 492
  • [28] Self-Attention Technology in Image Segmentation
    Cao, Fude
    Lu, Xueyun
    INTERNATIONAL CONFERENCE ON INTELLIGENT TRAFFIC SYSTEMS AND SMART CITY (ITSSC 2021), 2022, 12165
  • [29] Self-Attention Based Video Summarization
    Li Y.
    Wang J.
    Jisuanji Fuzhu Sheji Yu Tuxingxue Xuebao/Journal of Computer-Aided Design and Computer Graphics, 2020, 32 (04): 652 - 659
  • [30] How Does Selective Mechanism Improve Self-Attention Networks?
    Geng, Xinwei
    Wang, Longyue
    Wang, Xing
    Qin, Bing
    Liu, Ting
    Tu, Zhaopeng
    58TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2020), 2020, : 2986 - 2995