Hybrid self-attention NEAT: a novel evolutionary self-attention approach to improve the NEAT algorithm in high dimensional inputs

Cited by: 2
Authors
Khamesian, Saman [1 ]
Malek, Hamed [1 ]
Affiliation
[1] Shahid Beheshti University (SBU), Tehran, Iran
Keywords
Neuroevolution; Neural architecture search; NEAT; Atari games; Neural networks
DOI
10.1007/s12530-023-09510-3
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
This article presents a "Hybrid Self-Attention NEAT" method that improves the original NeuroEvolution of Augmenting Topologies (NEAT) algorithm on high-dimensional inputs. Although NEAT has achieved significant results on a variety of challenging tasks, it cannot construct a well-tuned network when the input representation is high-dimensional. We overcome this limitation by using self-attention as an indirect encoding method that selects the most important parts of the input. The hyper-parameters of the self-attention module are tuned with the CMA-ES evolutionary algorithm, and an innovative method called Seesaw is introduced to evolve the NEAT and CMA-ES populations simultaneously. In addition to NEAT's evolutionary operators for updating weights, a combination method is used to obtain better-fitting weights. We evaluated the model on a variety of Atari games. The results show that, compared with state-of-the-art evolutionary algorithms, Hybrid Self-Attention NEAT removes the restriction of the original NEAT and achieves comparable scores from raw pixel input while using a much smaller number of parameters (roughly 300x fewer than, e.g., HyperNEAT).
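The abstract outlines the core mechanism: a self-attention module acts as an indirect encoding that reduces a raw frame to a handful of important patches, its parameters are tuned by CMA-ES, and the reduced input feeds a NEAT-evolved controller. The sketch below is a minimal, illustrative reconstruction of that pipeline in Python; the patch size, attention dimension, top-k value, and all function and class names are assumptions for illustration, not the authors' published implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def extract_patches(frame, patch=7, stride=4):
    """Slice a grayscale frame into flattened patches and record their centres.
    Patch size and stride are illustrative choices, not the paper's values."""
    H, W = frame.shape
    patches, centres = [], []
    for y in range(0, H - patch + 1, stride):
        for x in range(0, W - patch + 1, stride):
            patches.append(frame[y:y + patch, x:x + patch].ravel())
            centres.append((y + patch / 2.0, x + patch / 2.0))
    return np.asarray(patches, dtype=np.float32), np.asarray(centres, dtype=np.float32)

class SelfAttentionSelector:
    """Single-head self-attention used as an indirect encoding: it scores all
    patches and keeps only the top-k most attended ones, so the NEAT network
    sees a small, fixed-size input instead of raw pixels. W_q and W_k are
    packed into one flat vector so CMA-ES can optimise them."""

    def __init__(self, patch_dim, d_k=4, top_k=10):
        self.patch_dim, self.d_k, self.top_k = patch_dim, d_k, top_k

    @property
    def num_params(self):
        return 2 * self.patch_dim * self.d_k  # size of the CMA-ES search space

    def set_params(self, theta):
        q, k = np.split(np.asarray(theta, dtype=np.float32), 2)
        self.W_q = q.reshape(self.patch_dim, self.d_k)
        self.W_k = k.reshape(self.patch_dim, self.d_k)

    def select(self, patches, centres):
        Q, K = patches @ self.W_q, patches @ self.W_k
        scores = softmax(Q @ K.T / np.sqrt(self.d_k), axis=-1)  # scaled dot-product attention
        importance = scores.sum(axis=0)           # total attention each patch receives
        top = np.argsort(importance)[-self.top_k:]
        return centres[top].ravel()               # 2 * top_k values -> NEAT controller input

# Illustrative use on a dummy 84x84 frame: CMA-ES would propose `theta`,
# while NEAT evolves the controller that consumes the selected coordinates.
frame = np.random.rand(84, 84).astype(np.float32)
patches, centres = extract_patches(frame)
selector = SelfAttentionSelector(patch_dim=patches.shape[1])
selector.set_params(np.random.randn(selector.num_params))
neat_input = selector.select(patches, centres)    # shape: (2 * top_k,)
```

In the full method described in the abstract, a CMA-ES loop would repeatedly propose `theta` vectors for this module while NEAT evolves the downstream controller, with the paper's Seesaw schedule alternating between the two populations; both loops are omitted here for brevity.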
Pages: 489-503
Page count: 15
Related papers
50 records in total
  • [1] Hybrid self-attention NEAT: a novel evolutionary self-attention approach to improve the NEAT algorithm in high dimensional inputs
    Saman Khamesian
    Hamed Malek
    [J]. Evolving Systems, 2024, 15 : 489 - 503
  • [2] Improve Image Captioning by Self-attention
    Li, Zhenru
    Li, Yaoyi
    Lu, Hongtao
    [J]. NEURAL INFORMATION PROCESSING, ICONIP 2019, PT V, 2019, 1143 : 91 - 98
  • [3] SHYNESS AND SELF-ATTENTION
    CROZIER, WR
    [J]. BULLETIN OF THE BRITISH PSYCHOLOGICAL SOCIETY, 1983, 36 (FEB): : A5 - A5
  • [4] Attention and self-attention in random forests
    Utkin, Lev V.
    Konstantinov, Andrei V.
    Kirpichenko, Stanislav R.
    [J]. PROGRESS IN ARTIFICIAL INTELLIGENCE, 2023, 12 (03) : 257 - 273
  • [5] Attention and self-attention in random forests
    Lev V. Utkin
    Andrei V. Konstantinov
    Stanislav R. Kirpichenko
    [J]. Progress in Artificial Intelligence, 2023, 12 : 257 - 273
  • [6] Object Tracking Algorithm with Sparse Self-Attention
    Wang, Jindong
    Zhang, Jinglei
    Wen, Biao
    [J]. Computer Engineering and Applications, 2023, 59 (22) : 174 - 181
  • [7] A novel self-attention deep subspace clustering
    Chen, Zhengfan
    Ding, Shifei
    Hou, Haiwei
    [J]. INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2021, 12 (08) : 2377 - 2387
  • [8] A novel self-attention deep subspace clustering
    Zhengfan Chen
    Shifei Ding
    Haiwei Hou
    [J]. International Journal of Machine Learning and Cybernetics, 2021, 12 : 2377 - 2387
  • [9] Self-Attention for Cyberbullying Detection
    Pradhan, Ankit
    Yatam, Venu Madhav
    Bera, Padmalochan
    [J]. 2020 INTERNATIONAL CONFERENCE ON CYBER SITUATIONAL AWARENESS, DATA ANALYTICS AND ASSESSMENT (CYBER SA 2020), 2020,
  • [10] On the Integration of Self-Attention and Convolution
    Pan, Xuran
    Ge, Chunjiang
    Lu, Rui
    Song, Shiji
    Chen, Guanfu
    Huang, Zeyi
    Huang, Gao
    [J]. 2022 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR 2022), 2022, : 805 - 815