Evolutionary approximation and neural architecture search

Cited by: 0
Authors
Michal Pinos
Vojtech Mrazek
Lukas Sekanina
Affiliations
[1] Brno University of Technology, Faculty of Information Technology
Keywords
Approximate computing; Convolutional neural network; Cartesian genetic programming; Neuroevolution; Energy efficiency
DOI: Not available
Abstract
Automated neural architecture search (NAS) methods are now employed to routinely deliver high-quality neural network architectures for various challenging data sets and reduce the designer’s effort. The NAS methods utilizing multi-objective evolutionary algorithms are especially useful when the objective is not only to minimize the network error but also to reduce the number of parameters (weights) or power consumption of the inference phase. We propose a multi-objective NAS method based on Cartesian genetic programming for evolving convolutional neural networks (CNN). The method allows approximate operations to be used in CNNs to reduce the power consumption of a target hardware implementation. During the NAS process, a suitable CNN architecture is evolved together with selecting approximate multipliers to deliver the best trade-offs between accuracy, network size, and power consumption. The most suitable 8 × N-bit approximate multipliers are automatically selected from a library of approximate multipliers. Evolved CNNs are compared with CNNs developed by other NAS methods on the CIFAR-10 and SVHN benchmark problems.
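To make the described search loop concrete, the following is a minimal sketch of a multi-objective evolutionary search in which each candidate pairs an architecture genotype with an approximate multiplier drawn from a library, and survivors are chosen by Pareto non-dominance over (error, network size, power). All names, the multiplier library values, and the surrogate fitness are illustrative assumptions for this sketch, not the authors' implementation; a real run would decode and train the evolved CNN on CIFAR-10 or SVHN and estimate power from the selected 8 × N-bit multiplier.

```python
# Sketch of a multi-objective NAS loop with approximate-multiplier selection.
# Hypothetical names throughout; fitness values are placeholders, not measurements.
import random
from dataclasses import dataclass, field

# Hypothetical library of 8xN-bit approximate multipliers:
# id -> (relative power, mean relative error). Values are placeholders.
MULTIPLIER_LIBRARY = {
    "mul8x8_exact": (1.00, 0.000),
    "mul8x6_apx":   (0.62, 0.004),
    "mul8x4_apx":   (0.35, 0.020),
}

@dataclass
class Candidate:
    # Integer genotype standing in for a CGP-style architecture encoding.
    genotype: list = field(default_factory=lambda: [random.randint(0, 7) for _ in range(20)])
    multiplier: str = field(default_factory=lambda: random.choice(list(MULTIPLIER_LIBRARY)))

def toy_evaluate(c: Candidate):
    """Placeholder objectives (error, #parameters, power); a real evaluation
    would train the decoded CNN and measure its test accuracy."""
    power, mul_err = MULTIPLIER_LIBRARY[c.multiplier]
    params = sum(c.genotype) * 1000                       # proxy for network size
    error = 0.1 + mul_err + 1.0 / (1 + sum(c.genotype))   # proxy for test error
    return (error, params, power)

def dominates(a, b):
    """Pareto dominance: a is no worse in every objective and better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def mutate(c: Candidate) -> Candidate:
    child = Candidate(genotype=c.genotype[:], multiplier=c.multiplier)
    child.genotype[random.randrange(len(child.genotype))] = random.randint(0, 7)
    if random.random() < 0.2:                             # occasionally swap the multiplier
        child.multiplier = random.choice(list(MULTIPLIER_LIBRARY))
    return child

def evolve(pop_size=20, generations=30):
    population = [Candidate() for _ in range(pop_size)]
    for _ in range(generations):
        offspring = [mutate(random.choice(population)) for _ in range(pop_size)]
        merged = population + offspring
        scored = [(toy_evaluate(c), c) for c in merged]
        # Keep the non-dominated front first, then fill up with the remainder.
        front = [c for f, c in scored if not any(dominates(g, f) for g, _ in scored)]
        population = (front + [c for _, c in scored if c not in front])[:pop_size]
    return [(toy_evaluate(c), c.multiplier) for c in population]

if __name__ == "__main__":
    for objectives, mul in evolve():
        print(objectives, mul)
```

Printing the final population shows the accuracy / size / power trade-off surface that the multi-objective selection maintains, with different approximate multipliers surviving at different points of the front.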
Pages: 351–374
Number of pages: 23