Evolutionary approximation and neural architecture search

Cited by: 0
Authors
Michal Pinos
Vojtech Mrazek
Lukas Sekanina
Affiliations
[1] Brno University of Technology, Faculty of Information Technology
Keywords
Approximate computing; Convolutional neural network; Cartesian genetic programming; Neuroevolution; Energy efficiency;
DOI
Not available
Abstract
Automated neural architecture search (NAS) methods are now employed to routinely deliver high-quality neural network architectures for various challenging data sets and reduce the designer’s effort. The NAS methods utilizing multi-objective evolutionary algorithms are especially useful when the objective is not only to minimize the network error but also to reduce the number of parameters (weights) or the power consumption of the inference phase. We propose a multi-objective NAS method based on Cartesian genetic programming for evolving convolutional neural networks (CNNs). The method allows approximate operations to be used in CNNs to reduce the power consumption of a target hardware implementation. During the NAS process, a suitable CNN architecture is evolved together with selecting approximate multipliers to deliver the best trade-offs between accuracy, network size, and power consumption. The most suitable 8 × N-bit approximate multipliers are automatically selected from a library of approximate multipliers. Evolved CNNs are compared with CNNs developed by other NAS methods on the CIFAR-10 and SVHN benchmark problems.
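The abstract describes a multi-objective search that keeps candidate CNNs forming the best trade-offs between accuracy, network size, and power consumption. A minimal sketch of the Pareto-dominance test underlying that kind of selection is shown below; this is an illustration of the general technique, not the authors' implementation, and the objective values are made up for the example.

```python
# Sketch of Pareto-dominance selection for three minimized objectives
# (test error, parameter count, inference power). Hypothetical numbers;
# not taken from the paper.

def dominates(a, b):
    """True if candidate a Pareto-dominates candidate b (all objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(candidates):
    """Return the non-dominated subset of candidates."""
    return [c for c in candidates
            if not any(dominates(other, c) for other in candidates if other is not c)]

# Hypothetical evolved CNNs: (test error, number of weights, power in mW)
population = [
    (0.08, 1.2e6, 450.0),   # accurate but large and power-hungry
    (0.10, 0.6e6, 300.0),   # balanced trade-off
    (0.15, 0.3e6, 120.0),   # small and low-power, less accurate
    (0.16, 0.7e6, 310.0),   # dominated by the second candidate on all three objectives
]
front = pareto_front(population)
# Only the non-dominated candidates survive to form the trade-off front.
```

In a full evolutionary NAS loop, a selection step like this (typically NSGA-II-style, with crowding distance as a tie-breaker) is applied each generation to decide which architecture/multiplier combinations are carried forward.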
Pages: 351–374 (23 pages)
Related papers (50 in total; items [41]–[50] shown)
  • [41] Booysen, Reinhard; Bosman, Anna Sergeevna. Multi-objective Evolutionary Neural Architecture Search for Recurrent Neural Networks. Neural Processing Letters, 2024, 56(04).
  • [42] Qian, Xiaoxue; Liu, Fang; Jiao, Licheng; Zhang, Xiangrong; Huang, Xinyan; Li, Shuo; Chen, Puhua; Liu, Xu. Knowledge transfer evolutionary search for lightweight neural architecture with dynamic inference. Pattern Recognition, 2023, 143.
  • [43] Fan, Liang; Wang, Handing. Surrogate-assisted evolutionary neural architecture search with network embedding. Complex & Intelligent Systems, 2023, 9(03): 3313–3331.
  • [44] Cai, Zicheng; Chen, Lei; Liu, Peng; Ling, Tongtao; Lai, Yutao. EG-NAS: Neural Architecture Search with Fast Evolutionary Exploration. Thirty-Eighth AAAI Conference on Artificial Intelligence, Vol. 38, No. 10, 2024: 11159–11167.
  • [45] Peng, Yameng; Song, Andy; Ciesielski, Vic; Fayek, Haytham M.; Chang, Xiaojun. Fast Evolutionary Neural Architecture Search by Contrastive Predictor with Linear Regions. Proceedings of the 2023 Genetic and Evolutionary Computation Conference (GECCO 2023), 2023: 1257–1266.
  • [46] An, Yang; Zhang, Changsheng; Zheng, Xuanyu. Knowledge reconstruction assisted evolutionary algorithm for neural network architecture search. Knowledge-Based Systems, 2023, 264.
  • [47] Zhou, Y.; Yuan, X.; Zhang, X.; Liu, W.; Wu, Y.; Yen, G. G.; Hu, B.; Yi, Z. Evolutionary Neural Architecture Search for Automatic Esophageal Lesion Identification and Segmentation. IEEE Transactions on Artificial Intelligence, 2022, 3(03): 436–450.
  • [48] Shi, Min; Tang, Yufei; Zhu, Xingquan; Huang, Yu; Wilson, David; Zhuang, Yuan; Liu, Jianxun. Genetic-GNN: Evolutionary architecture search for Graph Neural Networks. Knowledge-Based Systems, 2022, 247.
  • [49] Song, Changwei; Ma, Yongjie; Xu, Yang; Chen, Hong. Multi-population evolutionary neural architecture search with stacked generalization. Neurocomputing, 2024, 587.
  • [50] Shang, Ronghua; Zhu, Songling; Ren, Jinhong; Liu, Hangcheng; Jiao, Licheng. Evolutionary neural architecture search based on evaluation correction and functional units. Knowledge-Based Systems, 2022, 251.