Barriers for the performance of graph neural networks (GNN) in discrete random structures

Cited by: 0
Authors
Gamarnik, David [1]
Affiliation
[1] MIT, Sloan Sch Management, Stat & Data Sci Ctr, Operat Res Ctr, Cambridge, MA 02140 USA
Funding
U.S. National Science Foundation (NSF)
Keywords
neural networks; graphs; optimization; algorithms; complexity; LOCAL ALGORITHMS; MAX-CUT; OPTIMIZATION; LIMITS;
DOI
10.1073/pnas.2314092120
Chinese Library Classification (CLC)
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biosciences]; N [General Natural Sciences]
Discipline classification codes
07; 0710; 09
Abstract
Recently, graph neural network (GNN)-based algorithms were proposed for solving a variety of combinatorial optimization problems [M. J. Schuetz, J. K. Brubaker, H. G. Katzgraber, Nat. Mach. Intell. 4, 367-377 (2022)]. The GNN was tested in particular on randomly generated instances of these problems. The publication [M. J. Schuetz, J. K. Brubaker, H. G. Katzgraber, Nat. Mach. Intell. 4, 367-377 (2022)] stirred a debate over whether the GNN-based method was adequately benchmarked against the best prior methods. In particular, the critical commentaries [M. C. Angelini, F. Ricci-Tersenghi, Nat. Mach. Intell. 5, 29-31 (2023)] and [S. Boettcher, Nat. Mach. Intell. 5, 24-25 (2023)] point out that a simple greedy algorithm performs better than the GNN. We do not intend to discuss the merits of the arguments and counterarguments in these papers. Rather, in this note we establish a fundamental limitation on running GNNs on the random instances considered in these references, for a broad range of choices of GNN architecture. Specifically, these barriers hold when the depth of the GNN does not scale with the graph size (we note that depth 2 was used in the experiments of [M. J. Schuetz, J. K. Brubaker, H. G. Katzgraber, Nat. Mach. Intell. 4, 367-377 (2022)]), and, importantly, they hold regardless of any other parameters of the GNN architecture. These limitations arise from the presence of the overlap gap property (OGP) phase transition, which is a barrier for many algorithms, including in particular local algorithms, of which the GNN is an example. At the same time, some algorithms known prior to the introduction of GNNs achieve the best known results for these problems up to the OGP phase transition. This leaves very little room for GNNs to outperform the known algorithms, and on this basis we side with the conclusions made in [M. C. Angelini, F. Ricci-Tersenghi, Nat. Mach. Intell. 5, 29-31 (2023)] and [S. Boettcher, Nat. Mach. Intell. 5, 24-25 (2023)].
Pages: 3