Auto-GNN: Neural architecture search of graph neural networks

Cited by: 15
Authors
Zhou, Kaixiong [1 ]
Huang, Xiao [2 ]
Song, Qingquan [3 ]
Chen, Rui [4 ]
Hu, Xia [1 ]
Affiliations
[1] Rice Univ, Dept Comp Sci, DATA Lab, Houston, TX 77005 USA
[2] Hong Kong Polytech Univ, Dept Comp, Kowloon, Hong Kong, Peoples R China
[3] LinkedIn, Sunnyvale, CA USA
[4] Samsung Res Amer, Silicon Valley, CA USA
Source
FRONTIERS IN BIG DATA | 2022, Vol. 5
Keywords
graph neural networks; automated machine learning; neural architecture search; deep and scalable graph analysis; reinforcement learning;
DOI
10.3389/fdata.2022.1029307
CLC classification code
TP [automation and computer technology];
Subject classification code
0812;
Abstract
Graph neural networks (GNNs) have been widely used in various graph analysis tasks. Because graph characteristics vary significantly across real-world systems, the architecture parameters must be tuned carefully for each specific scenario to identify a suitable GNN. Neural architecture search (NAS) has shown its potential in discovering effective architectures for learning tasks in image and language modeling. However, existing NAS algorithms cannot be applied efficiently to the GNN search problem for two reasons. First, the large-step exploration of a traditional controller fails to capture the sensitive performance variations caused by slight architecture modifications in GNNs. Second, the search space is composed of heterogeneous GNNs, which prevents the direct adoption of parameter sharing among them to accelerate the search process. To tackle these challenges, we propose an automated graph neural network (AGNN) framework that aims to find the optimal GNN architecture efficiently. Specifically, a reinforced conservative controller is designed to explore the architecture space with small steps. To accelerate validation, a novel constrained parameter-sharing strategy is presented to regularize weight transfer among GNNs, which avoids training from scratch and saves computation time. Experimental results on benchmark datasets demonstrate that the architecture identified by AGNN achieves the best performance and search efficiency compared with existing human-designed models and traditional search methods.
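For orientation, the sketch below illustrates, under stated assumptions, the two ideas the abstract highlights: small-step (conservative) exploration that modifies one architecture component at a time, and constrained parameter sharing that transfers weights only when the change keeps tensor shapes compatible. All names here (SEARCH_SPACE, evaluate_architecture, can_share_weights, the shape-compatibility rule) are illustrative assumptions rather than the authors' implementation; a real AGNN run trains and validates actual GNNs and drives exploration with a reinforcement-learning controller instead of the greedy loop shown.

```python
import random

# Hypothetical GNN architecture search space (illustrative only; the paper's
# actual space covers per-layer components such as aggregators, activations,
# attention heads, and hidden dimensions).
SEARCH_SPACE = {
    "hidden_dim": [16, 32, 64, 128],
    "aggregator": ["sum", "mean", "max"],
    "activation": ["relu", "tanh", "elu"],
    "attn_heads": [1, 2, 4, 8],
}


def random_architecture():
    """Sample an initial architecture uniformly from the search space."""
    return {key: random.choice(values) for key, values in SEARCH_SPACE.items()}


def conservative_mutation(arch):
    """Small-step exploration: change exactly one component, so the controller
    can attribute the resulting performance variation to that single change."""
    child = dict(arch)
    key = random.choice(list(SEARCH_SPACE))
    child[key] = random.choice([v for v in SEARCH_SPACE[key] if v != arch[key]])
    return child, key


def can_share_weights(changed_key):
    """Constrained parameter sharing (assumed rule): inherit parent weights only
    when the modified component does not change tensor shapes."""
    return changed_key not in ("hidden_dim", "attn_heads")


def evaluate_architecture(arch, inherited_weights=None):
    """Placeholder for training/validating a GNN with this architecture on the
    target graph dataset; here a random score stands in for validation accuracy."""
    score = random.random()
    weights = {"trained_for": dict(arch)}  # stand-in for learned parameters
    return score, weights


def agnn_search(iterations=20, seed=0):
    """Greedy small-step search loop: mutate, optionally share weights, evaluate."""
    random.seed(seed)
    best_arch = random_architecture()
    best_score, best_weights = evaluate_architecture(best_arch)
    for _ in range(iterations):
        child, changed_key = conservative_mutation(best_arch)
        shared = best_weights if can_share_weights(changed_key) else None
        score, weights = evaluate_architecture(child, shared)
        if score > best_score:  # keep the child only if it improves the score
            best_arch, best_score, best_weights = child, score, weights
    return best_arch, best_score


if __name__ == "__main__":
    arch, score = agnn_search()
    print("best architecture:", arch, "validation score: %.3f" % score)
```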
Pages: 12
Related papers
50 records in total
  • [31] Gomes Weil, Victor Alexandre; Florindo, Joao Batista. Neural Architecture Search Applied to Hybrid Morphological Neural Networks. INTELLIGENT SYSTEMS, PT II, 2022, 13654: 631-645.
  • [32] Zhu, Baozhou; Al-Ars, Zaid; Hofstee, H. Peter. NASB: Neural Architecture Search for Binary Convolutional Neural Networks. 2020 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2020.
  • [33] Zhang, Zeyang; Zhang, Ziwei; Wang, Xin; Qin, Yijian; Qin, Zhou; Zhu, Wenwu. Dynamic Heterogeneous Graph Attention Neural Architecture Search. THIRTY-SEVENTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 37 NO 9, 2023: 11307-11315.
  • [34] Qin, Yijian; Wang, Xin; Cui, Peng; Zhu, Wenwu. GQNAS: Graph Q Network for Neural Architecture Search. 2021 21ST IEEE INTERNATIONAL CONFERENCE ON DATA MINING (ICDM 2021), 2021: 1288-1293.
  • [35] Cai, Jie; Wang, Xin; Guan, Chaoyu; Tang, Yateng; Xu, Jin; Zhong, Bin; Zhu, Wenwu. Multimodal Continual Graph Learning with Neural Architecture Search. PROCEEDINGS OF THE ACM WEB CONFERENCE 2022 (WWW'22), 2022: 1292-1300.
  • [36] Guan, Chaoyu; Wang, Xin; Chen, Hong; Zhang, Ziwei; Zhu, Wenwu. Large-Scale Graph Neural Architecture Search. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 162, 2022.
  • [37] Qin, Yijian; Wang, Xin; Zhang, Ziwei; Xie, Pengtao; Zhu, Wenwu. Graph Neural Architecture Search Under Distribution Shifts. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 162, 2022.
  • [38] Li, Huan; Wang, Boyuan; Cui, Lixin; Bai, Lu; Hancock, Edwin R. LGL-GNN: Learning Global and Local Information for Graph Neural Networks. STRUCTURAL, SYNTACTIC, AND STATISTICAL PATTERN RECOGNITION, S+SSPR 2020, 2021, 12644: 129-138.
  • [39] Ding, Mucong; Rabbani, Tahseen; An, Bang; Wang, Evan Z.; Huang, Furong. Sketch-GNN: Scalable Graph Neural Networks with Sublinear Training Complexity. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35, NEURIPS 2022, 2022.
  • [40] Zhu, Pengfei; Li, Jialu; Dong, Zhe; Hu, Qinghua; Wang, Xiao; Wang, Qilong. CCP-GNN: Competitive Covariance Pooling for Improving Graph Neural Networks. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024.