Efficient Self-learning Evolutionary Neural Architecture Search

Cited by: 4
Authors
Qiu, Zhengzhong [1 ]
Bi, Wei [2 ]
Xu, Dong [3 ,4 ]
Guo, Hua [2 ]
Ge, Hongwei [5 ]
Liang, Yanchun [6 ]
Lee, Heow Pueh [7 ]
Wu, Chunguo [1 ]
Affiliations
[1] Jilin Univ, Minist Educ, Coll Comp Sci & Technol, Key Lab Symbol Computat & Knowledge Engn, Changchun 130012, Jilin, Peoples R China
[2] YGSOFT INC, Zhuhai 519085, Guangdong, Peoples R China
[3] Univ Missouri, Dept Elect Engn & Comp Sci, Columbia, MO 65211 USA
[4] Univ Missouri, Christopher S Bond Life Sci Ctr, Columbia, MO 65211 USA
[5] Dalian Univ Technol, Coll Comp Sci & Technol, Dalian 116081, Liaoning, Peoples R China
[6] Zhuhai Coll Sci & Technol, Sch Comp Sci, Zhuhai 519041, Guangdong, Peoples R China
[7] Natl Univ Singapore, Dept Mech Engn, 9 Engn Dr 1, Singapore 117575, Singapore
Funding
National Natural Science Foundation of China;
Keywords
Evolutionary algorithm; Neural architecture search; Probability distribution; Model size control; NETWORKS;
DOI
10.1016/j.asoc.2023.110671
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Evolutionary algorithms have recently become a major approach to neural architecture search. However, the fixed probability distribution used by traditional evolutionary algorithms cannot control the size of individual architectures, which may lead to structural complexity and redundancy, and it cannot learn from empirical information gathered during the search to guide the subsequent search more effectively and efficiently. Moreover, evaluating the performance of every searched architecture demands substantial computing resources and time. To overcome these challenges, we present the Efficient Self-learning Evolutionary Neural Architecture Search (ESE-NAS) method. First, we propose an Adaptive Learning Strategy for Mutation Sampling, composed of a Model Size Control module and a Credit Assignment method for Mutation Candidates, which guides the search by learning from the model size information and evaluation results of the architectures and adjusting the probability distributions for evolution sampling accordingly. In addition, we develop a neural architecture performance predictor to further improve the efficiency of NAS. Experiments on the CIFAR-10 and CIFAR-100 datasets show that ESE-NAS significantly advances the first hitting time of the optimal architectures and reaches a performance level competitive with classic manually designed and NAS models while maintaining structural simplicity and efficiency. © 2023 Published by Elsevier B.V.
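As an illustration of the adaptive sampling idea described in the abstract, the following minimal Python sketch keeps a credit score for each mutation candidate, converts the credits into a sampling distribution, and rewards mutations that improve fitness while penalising model-size growth. The class name, the exponential-moving-average credit update, and the size penalty are illustrative assumptions chosen for exposition; they are not the authors' implementation.

```python
import random
import math

# Hypothetical illustration (not the ESE-NAS implementation): mutation
# candidates are sampled from a distribution that is adapted online from
# observed fitness gains, with a penalty for model-size growth.

MUTATIONS = ["add_layer", "remove_layer", "widen_layer", "change_op"]

class AdaptiveMutationSampler:
    def __init__(self, mutations, temperature=1.0):
        self.mutations = list(mutations)
        self.credit = {m: 0.0 for m in mutations}   # running credit per candidate
        self.temperature = temperature

    def probabilities(self):
        # Softmax over credits -> sampling distribution (uniform when credits are equal).
        logits = [self.credit[m] / self.temperature for m in self.mutations]
        mx = max(logits)
        exps = [math.exp(l - mx) for l in logits]
        z = sum(exps)
        return [e / z for e in exps]

    def sample(self):
        return random.choices(self.mutations, weights=self.probabilities(), k=1)[0]

    def update(self, mutation, fitness_gain, size_growth, size_weight=0.5, lr=0.1):
        # Credit assignment: reward fitness improvement, penalise size growth,
        # and smooth the credit with an exponential moving average.
        reward = fitness_gain - size_weight * size_growth
        self.credit[mutation] = (1 - lr) * self.credit[mutation] + lr * reward


if __name__ == "__main__":
    sampler = AdaptiveMutationSampler(MUTATIONS)
    # Toy loop: pretend "widen_layer" tends to help accuracy but adds parameters.
    for _ in range(200):
        m = sampler.sample()
        gain = random.gauss(0.5 if m == "widen_layer" else 0.0, 0.1)
        growth = 1.0 if m in ("add_layer", "widen_layer") else 0.0
        sampler.update(m, gain, growth)
    print({m: round(p, 3) for m, p in zip(MUTATIONS, sampler.probabilities())})
```

In ESE-NAS itself, the credit assignment and model-size control are further combined with a learned performance predictor that replaces full training for many candidates; the sketch above only conveys how evaluation feedback can reshape the mutation-sampling distribution.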
Pages: 13