Neural Architecture Search as Sparse Supernet

Cited by: 0
Authors
Wu, Yan [1 ]
Liu, Aoming [1 ]
Huang, Zhiwu [1 ]
Zhang, Siwei [1 ]
Van Gool, Luc [1 ,2 ]
Affiliations
[1] Swiss Fed Inst Technol, Comp Vis Lab, Zurich, Switzerland
[2] Katholieke Univ Leuven, VISICS, Leuven, Belgium
Keywords
SHRINKAGE
DOI
Not available
CLC Classification Code
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
This paper aims at enlarging the problem of Neural Architecture Search (NAS) from Single-Path and Multi-Path Search to automated Mixed-Path Search. In particular, we model the NAS problem as a sparse supernet using a new continuous architecture representation with a mixture of sparsity constraints. The sparse supernet enables us to automatically achieve sparsely-mixed paths upon a compact set of nodes. To optimize the proposed sparse supernet, we exploit a hierarchical accelerated proximal gradient algorithm within a bi-level optimization framework. Extensive experiments on Convolutional Neural Network and Recurrent Neural Network search demonstrate that the proposed method is capable of searching for compact, general and powerful neural architectures.
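The abstract describes optimizing a sparsity-constrained continuous architecture representation with a proximal gradient algorithm inside a bi-level loop. Below is a minimal sketch of that general pattern, assuming a toy quadratic objective in plain NumPy; names such as soft_threshold and toy_loss are illustrative and not taken from the paper, and the hierarchical accelerated (momentum-based) variant is reduced here to a plain proximal step.

```python
# Minimal sketch (assumed, not the authors' code): sparsity-inducing
# proximal gradient updates on continuous architecture weights inside a
# simplified bi-level loop over a toy quadratic objective.
import numpy as np

def soft_threshold(x, lam):
    """Proximal operator of the L1 norm: shrinks entries toward zero."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def toy_loss(alpha, w, data):
    """Stand-in for the supernet loss, quadratic in alpha * w."""
    return 0.5 * np.sum((data @ (alpha * w)) ** 2)

def grad_alpha(alpha, w, data):
    return (data.T @ (data @ (alpha * w))) * w

def grad_w(alpha, w, data):
    return (data.T @ (data @ (alpha * w))) * alpha

rng = np.random.default_rng(0)
train, val = rng.normal(size=(32, 8)), rng.normal(size=(32, 8))
alpha = rng.normal(size=8)   # continuous architecture (path) weights
w = rng.normal(size=8)       # supernet operation weights
lr_w, lr_a, lam = 0.01, 0.05, 0.02

for _ in range(200):
    # Lower level: gradient step on the supernet weights (training data).
    w -= lr_w * grad_w(alpha, w, train)
    # Upper level: gradient step on the architecture weights (validation
    # data), then a proximal soft-thresholding step that sparsifies them.
    alpha = soft_threshold(alpha - lr_a * grad_alpha(alpha, w, val), lr_a * lam)

print("validation loss:", toy_loss(alpha, w, val))
print("paths kept (nonzero architecture weights):", np.count_nonzero(alpha))
```

The soft-thresholding step is what drives individual path weights exactly to zero, which is how a sparse mixture of paths can emerge from the dense supernet rather than being fixed in advance.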
Pages: 10379 - 10387
Number of pages: 9
Related Papers
50 records in total
  • [1] Multi-Objective Evolutionary Neural Architecture Search with Weight-Sharing Supernet
    Liang, Junchao
    Zhu, Ke
    Li, Yuan
    Li, Yun
    Gong, Yuejiao
    [J]. APPLIED SCIENCES-BASEL, 2024, 14 (14)
  • [2] One-Shot Neural Architecture Search by Dynamically Pruning Supernet in Hierarchical Order
    Zhang, Jianwei
    Li, Dong
    Wang, Lituan
    Zhang, Lei
    [J]. INTERNATIONAL JOURNAL OF NEURAL SYSTEMS, 2021, 31 (07)
  • [3] Pi-NAS: Improving Neural Architecture Search by Reducing Supernet Training Consistency Shift
    Peng, Jiefeng
    Zhang, Jiqi
    Li, Changlin
    Wang, Guangrun
    Liang, Xiaodan
    Lin, Liang
    [J]. 2021 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2021), 2021, : 12334 - 12344
  • [4] DASS: Differentiable Architecture Search for Sparse Neural Networks
    Mousavi, Hamid
    Loni, Mohammad
    Alibeigi, Mina
    Daneshtalab, Masoud
    [J]. ACM TRANSACTIONS ON EMBEDDED COMPUTING SYSTEMS, 2023, 22 (05)
  • [5] Neural Architecture Search for Sparse DenseNets with Dynamic Compression
    O'Neill, Damien
    Xue, Bing
    Zhang, Mengjie
    [J]. GECCO'20: PROCEEDINGS OF THE 2020 GENETIC AND EVOLUTIONARY COMPUTATION CONFERENCE, 2020, : 386 - 394
  • [6] An Effective One-Shot Neural Architecture Search Method with Supernet Fine-Tuning for Image Classification
    Yuan, Gonglin
    Xue, Bing
    Zhang, Mengjie
    [J]. PROCEEDINGS OF THE 2023 GENETIC AND EVOLUTIONARY COMPUTATION CONFERENCE, GECCO 2023, 2023, : 615 - 623
  • [7] Accelerating one-shot neural architecture search via constructing a sparse search space
    Huang, Hongtao
    Chang, Xiaojun
    Yao, Lina
    [J]. KNOWLEDGE-BASED SYSTEMS, 2024, 305
  • [8] ISTA-NAS: Efficient and Consistent Neural Architecture Search by Sparse Coding
    Yang, Yibo
    Li, Hongyang
    You, Shan
    Wang, Fei
    Qian, Chen
    Lin, Zhouchen
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 33, NEURIPS 2020, 2020, 33
  • [9] You Only Search Once: Single Shot Neural Architecture Search via Direct Sparse Optimization
    Zhang, Xinbang
    Huang, Zehao
    Wang, Naiyan
    Xiang, Shiming
    Pan, Chunhong
    [J]. IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2021, 43 (09) : 2891 - 2904
  • [10] SPARSE CNN ARCHITECTURE SEARCH (SCAS)
    Yeshwanth, V
    Deshwal, Ankur
    Krishnadasan, Sundeep
    Lee, Seungwon
    Song, Joonho
    [J]. 2020 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA AND EXPO (ICME), 2020