Decoupled differentiable graph neural architecture search

Cited by: 0
Authors
Chen, Jiamin [1 ]
Gao, Jianliang [1 ]
Wu, Zhenpeng [1 ]
Al-Sabri, Raeed [1 ]
Oloulade, Babatounde Moctard [1 ]
Affiliations
[1] Cent South Univ, Sch Comp Sci & Engn, Changsha 410083, Hunan, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Graph neural architecture search; Decoupled differentiable optimization; Supernet pruning; Graph neural network;
DOI
10.1016/j.ins.2024.120700
Chinese Library Classification
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
Differentiable graph neural architecture search (GNAS) designs graph neural networks (GNNs) automatically and efficiently, achieving excellent performance across different graph data distributions. Given a GNN search space containing multiple candidate GNN component operations, a differentiable GNAS method builds a mixed supernet in which learnable architecture parameters weight the candidate operations. Once the mixed supernet is optimized, it is pruned according to the best architecture parameters to identify the optimal GNN architecture in the search space efficiently. However, the multiplicative relationship between the architecture parameters and the candidate operations introduces a coupled optimization bias into the weight optimization of the supernet's candidate operations, which degrades differentiable GNAS performance. To address this coupled optimization bias, we propose Decoupled Differentiable Graph Neural Architecture Search (D2GNAS). It uses the Gumbel distribution as a bridge to decouple the weight optimization of the supernet's candidate operations from that of the architecture parameters, constructing a decoupled differentiable GNN architecture sampler. The sampler selects promising GNN architectures by treating the architecture parameters as sampling probabilities, and it is in turn optimized through validation gradients derived from the sampled architectures. Meanwhile, D2GNAS builds a single-path supernet with a pruning strategy that progressively compresses the supernet to further improve search efficiency. We conduct extensive experiments on multiple benchmark graphs. The experimental findings demonstrate that D2GNAS outperforms all established baselines, both manually designed GNNs and GNAS methods, in terms of performance.
Additionally, D2GNAS has lower time complexity than previous differentiable GNAS methods, achieving an average 5x efficiency improvement on the fair GNN search space. Code is available at https://github.com/AutoMachine0/D2GNAS.
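The decoupled sampling idea described in the abstract can be illustrated with a minimal sketch (not the authors' implementation; the function name and NumPy setup are our own): each layer's architecture parameters are treated as logits, and the Gumbel-max trick draws one candidate operation per layer. This yields a single-path architecture whose operation weights can be trained independently of the architecture parameters, avoiding the multiplicative coupling of a mixed supernet.

```python
import numpy as np

def gumbel_sample_architecture(arch_params, rng):
    """Sample one operation index per layer via the Gumbel-max trick.

    arch_params: list of 1-D arrays of unnormalized log-probabilities
    (architecture parameters), one array per GNN layer.
    Returns a single-path architecture as a list of operation indices.
    """
    sampled = []
    for logits in arch_params:
        # Adding i.i.d. Gumbel(0, 1) noise and taking argmax draws an
        # index with probability softmax(logits) -- the Gumbel-max trick.
        gumbel_noise = -np.log(-np.log(rng.uniform(size=logits.shape)))
        sampled.append(int(np.argmax(logits + gumbel_noise)))
    return sampled

# Illustration: a 3-layer search space with 4 candidate operations per layer.
rng = np.random.default_rng(0)
arch_params = [np.zeros(4) for _ in range(3)]  # uniform before training
path = gumbel_sample_architecture(arch_params, rng)
```

In the full method, validation gradients from the sampled architectures would update `arch_params`, gradually concentrating the sampling probability on promising operations; this sketch shows only the sampling step.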
Pages: 18
Related Papers
50 records
  • [1] AutoMaster: Differentiable Graph Neural Network Architecture Search for Collaborative Filtering Recommendation
    Mu, Caihong
    Yu, Haikun
    Zhang, Keyang
    Tian, Qiang
    Liu, Yi
    [J]. WEB ENGINEERING, ICWE 2024, 2024, 14629 : 82 - 98
  • [2] Graph Differentiable Architecture Search with Structure Learning
    Qin, Yijian
    Wang, Xin
    Zhang, Zeyang
    Zhu, Wenwu
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [3] Decoupled Graph Neural Architecture Search with Variable Propagation Operation and Appropriate Depth
    Gao, Jianliang
    He, Changlong
    Chen, Jiamin
    Li, Qiutong
    Wang, Yili
    [J]. 35TH INTERNATIONAL CONFERENCE ON SCIENTIFIC AND STATISTICAL DATABASE MANAGEMENT, SSDBM 2023, 2023
  • [4] An architecture entropy regularizer for differentiable neural architecture search
    Jing, Kun
    Chen, Luoyu
    Xu, Jungang
    [J]. NEURAL NETWORKS, 2023, 158 : 111 - 120
  • [5] Graph Neural Architecture Search
    Gao, Yang
    Yang, Hong
    Zhang, Peng
    Zhou, Chuan
    Hu, Yue
    [J]. PROCEEDINGS OF THE TWENTY-NINTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2020, : 1403 - 1409
  • [6] DiffMG: Differentiable Meta Graph Search for Heterogeneous Graph Neural Networks
    Ding, Yuhui
    Yao, Quanming
    Zhao, Huan
    Zhang, Tong
    [J]. KDD '21: PROCEEDINGS OF THE 27TH ACM SIGKDD CONFERENCE ON KNOWLEDGE DISCOVERY & DATA MINING, 2021, : 279 - 288
  • [7] Neural Graph Embedding for Neural Architecture Search
    Li, Wei
    Gong, Shaogang
    Zhu, Xiatian
    [J]. THIRTY-FOURTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THE THIRTY-SECOND INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE AND THE TENTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2020, 34 : 4707 - 4714
  • [8] DASS: Differentiable Architecture Search for Sparse Neural Networks
    Mousavi, Hamid
    Loni, Mohammad
    Alibeigi, Mina
    Daneshtalab, Masoud
    [J]. ACM TRANSACTIONS ON EMBEDDED COMPUTING SYSTEMS, 2023, 22 (05)
  • [9] Differentiable neural architecture search with channel performance measurement
    Pan, Jie
    Zheng, Xue-Chi
    Zou, Xiao-Yu
    [J]. Kongzhi yu Juece/Control and Decision, 2024, 39 (07): : 2151 - 2160
  • [10] Understanding the wiring evolution in differentiable neural architecture search
    Xie, Sirui
    Hu, Shoukang
    Wang, Xinjiang
    Liu, Chunxiao
    Shi, Jianping
    Liu, Xunying
    Lin, Dahua
    [J]. 24TH INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS (AISTATS), 2021, 130