Evolutionary Recurrent Neural Architecture Search

Cited by: 2
Authors
Tian, Shuo [1 ]
Hu, Kai [1 ]
Guo, Shasha [1 ]
Li, Shiming [1 ]
Wang, Lei [1 ]
Xu, Weixia [1 ]
Affiliations
[1] Natl Univ Def Technol, Coll Comp Sci & Technol, Changsha 410000, Peoples R China
Keywords
Computer architecture; Sociology; Statistics; Microprocessors; Manuals; Training; Computational modeling; Deep learning; evolution algorithm; neural architecture search (NAS); parameter sharing;
DOI
10.1109/LES.2020.3005753
Chinese Library Classification
TP3 [Computing Technology, Computer Technology];
Discipline Code
0812 ;
Abstract
Deep learning has driven remarkable progress on various tasks, but the effort spent hand-crafting neural networks has motivated so-called neural architecture search (NAS) to discover them automatically. The recent aging evolution (AE) search algorithm discards the oldest model in the population and finds image classifiers that surpass manual designs; however, it converges slowly. A nonaging evolution (NAE) algorithm instead discards the worst architecture in the population to accelerate the search, but it achieves lower performance than AE. To address this issue, in this letter we propose an optimized evolution algorithm for recurrent NAS (EvoRNAS) that, with probability epsilon, removes either the oldest or the worst model in the population, balancing performance against search time. In addition, because evaluating candidate models is costly in both AE and NAE, a parameter-sharing mechanism is introduced in our approach. Furthermore, we train the shared parameters only once rather than over many epochs as in ENAS, which makes evaluating candidate models faster. On Penn Treebank, we first explore different values of epsilon in EvoRNAS and find the value best suited to the learning task, which also outperforms AE and NAE. Second, the optimal cell found by EvoRNAS achieves state-of-the-art performance within only 0.6 GPU hours, 20x and 40x faster than ENAS and DARTS, respectively. Moreover, the learned architecture transfers to WikiText-2 with strong performance compared with ENAS and DARTS.
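The abstract's core idea — with probability epsilon evict the oldest model (AE-style), otherwise evict the worst (NAE-style) — can be sketched as a single evolution step. This is a minimal illustration reconstructed from the abstract alone, not the authors' code; the tuple layout and the function names `mutate` and `evaluate` are assumptions.

```python
import random

def evolve_step(population, mutate, evaluate, epsilon=0.5, rng=random):
    """One EvoRNAS-style step (sketch inferred from the abstract).

    Each population entry is (birth_step, fitness, model). A parent is
    sampled, mutated into a child, and the child is added; then with
    probability epsilon the oldest entry is removed (aging evolution),
    otherwise the lowest-fitness entry is removed (nonaging evolution).
    """
    parent = rng.choice(population)
    child_model = mutate(parent[2])
    next_step = max(birth for birth, _, _ in population) + 1
    population.append((next_step, evaluate(child_model), child_model))
    if rng.random() < epsilon:
        # Aging evolution: evict the oldest (smallest birth stamp).
        victim = min(population, key=lambda e: e[0])
    else:
        # Nonaging evolution: evict the worst (lowest fitness).
        victim = min(population, key=lambda e: e[1])
    population.remove(victim)
    return population
```

Setting `epsilon=1.0` recovers pure AE behavior and `epsilon=0.0` recovers pure NAE; intermediate values interpolate between the two eviction rules, which is the trade-off the letter tunes on Penn Treebank.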
Pages: 110 - 113
Number of pages: 4
Related Papers
50 records
  • [41] Surrogate-assisted evolutionary neural architecture search with network embedding
    Fan, Liang
    Wang, Handing
    [J]. COMPLEX & INTELLIGENT SYSTEMS, 2023, 9 (03) : 3313 - 3331
  • [42] Knowledge transfer evolutionary search for lightweight neural architecture with dynamic inference
    Qian, Xiaoxue
    Liu, Fang
    Jiao, Licheng
    Zhang, Xiangrong
    Huang, Xinyan
    Li, Shuo
    Chen, Puhua
    Liu, Xu
    [J]. PATTERN RECOGNITION, 2023, 143
  • [43] EG-NAS: Neural Architecture Search with Fast Evolutionary Exploration
    Cai, Zicheng
    Chen, Lei
    Liu, Peng
    Ling, Tongtao
    Lai, Yutao
    [J]. THIRTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 38 NO 10, 2024, : 11159 - 11167
  • [44] Multi-population evolutionary neural architecture search with stacked generalization
    Song, Changwei
    Ma, Yongjie
    Xu, Yang
    Chen, Hong
    [J]. NEUROCOMPUTING, 2024, 587
  • [45] Evolutionary neural architecture search based on evaluation correction and functional units
    Shang, Ronghua
    Zhu, Songling
    Ren, Jinhong
    Liu, Hangcheng
    Jiao, Licheng
    [J]. KNOWLEDGE-BASED SYSTEMS, 2022, 251
  • [46] Knowledge reconstruction assisted evolutionary algorithm for neural network architecture search
    An, Yang
    Zhang, Changsheng
    Zheng, Xuanyu
    [J]. KNOWLEDGE-BASED SYSTEMS, 2023, 264
  • [47] Fast Evolutionary Neural Architecture Search by Contrastive Predictor with Linear Regions
    Peng, Yameng
    Song, Andy
    Ciesielski, Vic
    Fayek, Haytham M.
    Chang, Xiaojun
    [J]. PROCEEDINGS OF THE 2023 GENETIC AND EVOLUTIONARY COMPUTATION CONFERENCE, GECCO 2023, 2023, : 1257 - 1266
  • [48] Evolutionary Neural Architecture Search for Automatic Esophageal Lesion Identification and Segmentation
    Zhou, Yao
    Yuan, Xianglei
    Zhang, Xiaozhi
    Liu, Wei
    Wu, Yu
    Yen, Gary G.
    Hu, Bing
    Yi, Zhang
    [J]. IEEE TRANSACTIONS ON ARTIFICIAL INTELLIGENCE, 2022, 3 (03) : 436 - 450
  • [49] Genetic-GNN: Evolutionary architecture search for Graph Neural Networks
    Shi, Min
    Tang, Yufei
    Zhu, Xingquan
    Huang, Yu
    Wilson, David
    Zhuang, Yuan
    Liu, Jianxun
    [J]. KNOWLEDGE-BASED SYSTEMS, 2022, 247
  • [50] Two-Stage Evolutionary Neural Architecture Search for Transfer Learning
    Wen, Yu-Wei
    Peng, Sheng-Hsuan
    Ting, Chuan-Kang
    [J]. IEEE TRANSACTIONS ON EVOLUTIONARY COMPUTATION, 2021, 25 (05) : 928 - 940