Hybrid Architecture-Based Evolutionary Robust Neural Architecture Search

Times Cited: 0
Authors
Yang, Shangshang [1 ]
Sun, Xiangkun [1 ]
Xu, Ke [1 ]
Liu, Yuanchao [2 ,3 ]
Tian, Ye [4 ]
Zhang, Xingyi [4 ]
Affiliations
[1] Anhui Univ, Sch Artificial Intelligence, Key Lab Intelligent Comp & Signal Proc, Minist Educ, Hefei 230601, Peoples R China
[2] Northeastern Univ, State Key Lab Synthet Automat Proc Ind, Shenyang 110819, Peoples R China
[3] Northeastern Univ, Coll Informat Sci & Engn, Shenyang 110819, Peoples R China
[4] Anhui Univ, Sch Comp Sci & Technol, Informat Mat & Intelligent Sensing Lab Anhui Prov, Hefei 230601, Peoples R China
Funding
National Key Research and Development Program of China;
Keywords
Computer architecture; Robustness; Search problems; Training; Space exploration; Network architecture; Transformers; Network robustness; hybrid architectures; evolutionary neural architecture search; surrogate-assisted; ALGORITHM;
DOI
10.1109/TETCI.2024.3400867
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Discipline Classification Code
081104; 0812; 0835; 1405;
Abstract
The robustness of neural networks in image classification is important for resisting adversarial attacks. Although many researchers have proposed to enhance network robustness by inventing network training paradigms or designing network architectures, existing approaches are mainly based on a single type of network, e.g., convolutional neural networks (CNNs) or vision Transformers (ViTs). Considering the recently revealed fact that CNNs and ViTs can effectively defend against adversarial attacks transferred from each other, this paper aims to enhance network robustness by designing robust hybrid-architecture networks that contain different types of networks. To this end, we propose a hybrid architecture-based evolutionary neural architecture search approach for robust architecture design, termed HA-ENAS. Specifically, to combine or aggregate different types of networks within the same network framework, a multi-stage block-wise hybrid architecture network is first devised as the supernet, where three types of blocks (convolution blocks, Transformer blocks, and multi-layer perceptron blocks) are designed as candidates for each block position, thereby establishing a hybrid architecture-based search space for HA-ENAS. The robust hybrid architecture search is then formulated as an optimization problem that maximizes both the clean and adversarial accuracy of architectures, and an efficient multi-objective evolutionary algorithm is employed to solve it, where a supernet-based retraining evaluation and a surrogate model are used to mitigate the influence of coupled weights and reduce the overall search cost. Experimental results show that the hybrid architectures found by the proposed HA-ENAS outperform state-of-the-art single-type architectures in terms of both clean accuracy and adversarial accuracy under a variety of common attacks.
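The abstract describes a block-wise hybrid search space and a bi-objective (clean accuracy, adversarial accuracy) evolutionary search. The sketch below is a minimal illustration of that idea only, not the authors' implementation: the stage layout (NUM_STAGES, BLOCKS_PER_STAGE), the mutation operator, and the synthetic evaluate() function are assumptions standing in for the paper's supernet-based retraining evaluation and surrogate model.

```python
# Minimal sketch (assumed, not HA-ENAS source code): a block-wise hybrid architecture
# encoding over {conv, transformer, mlp} and a simple bi-objective evolutionary loop
# that keeps the Pareto front of (clean accuracy, adversarial accuracy).
import random

BLOCK_TYPES = ["conv", "transformer", "mlp"]   # the three candidate block types
NUM_STAGES = 4                                 # assumed number of stages
BLOCKS_PER_STAGE = 3                           # assumed blocks per stage


def random_architecture():
    """Encode a hybrid architecture as one block-type choice per block position."""
    return [random.choice(BLOCK_TYPES) for _ in range(NUM_STAGES * BLOCKS_PER_STAGE)]


def evaluate(arch):
    """Placeholder for the supernet-based retraining evaluation / surrogate model.

    Returns (clean_accuracy, adversarial_accuracy); here both are synthetic values
    that merely reward mixing block types, standing in for real measurements.
    """
    diversity = len(set(arch)) / len(BLOCK_TYPES)
    clean_acc = 0.80 + 0.10 * diversity + random.uniform(-0.02, 0.02)
    adv_acc = 0.40 + 0.15 * diversity + random.uniform(-0.02, 0.02)
    return clean_acc, adv_acc


def mutate(arch, rate=0.1):
    """Re-sample each block choice with probability `rate`."""
    return [random.choice(BLOCK_TYPES) if random.random() < rate else g for g in arch]


def dominates(a, b):
    """Pareto dominance when maximizing both objectives."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))


def pareto_front(population, fitness):
    """Keep individuals not dominated by any other individual."""
    return [population[i] for i, f_i in enumerate(fitness)
            if not any(dominates(f_j, f_i) for j, f_j in enumerate(fitness) if j != i)]


def evolve(pop_size=20, generations=10):
    population = [random_architecture() for _ in range(pop_size)]
    for _ in range(generations):
        fitness = [evaluate(a) for a in population]
        parents = pareto_front(population, fitness)
        offspring = [mutate(random.choice(parents)) for _ in range(pop_size - len(parents))]
        population = parents + offspring
    return pareto_front(population, [evaluate(a) for a in population])


if __name__ == "__main__":
    for arch in evolve():
        print(arch)
```

In an actual run of the method, evaluate() would be replaced by inheriting weights from the trained hybrid supernet, briefly retraining the candidate to decouple shared weights, and querying a surrogate model to avoid evaluating every offspring from scratch.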
Pages: 2919-2934
Number of Pages: 16