Breaking the Limits of Message Passing Graph Neural Networks

Cited by: 0
Authors
Balcilar, Muhammet [1,2]
Heroux, Pierre [1]
Gauzere, Benoit [3]
Vasseur, Pascal [1,4]
Adam, Sebastien [1]
Honeine, Paul [1]
Affiliations
[1] Univ Rouen Normandy, LITIS Lab, Mont St Aignan, France
[2] InterDigital, Rennes, France
[3] INSA Rouen Normandy, LITIS Lab, St Etienne Du Rouvray, France
[4] Univ Picardie Jules Verne, MIS Lab, Amiens, France
Keywords:
DOI: not available
Chinese Library Classification: TP18 [Artificial Intelligence Theory]
Subject Classification Codes: 081104; 0812; 0835; 1405
Abstract
Since Message Passing (Graph) Neural Networks (MPNNs) have linear complexity in the number of nodes when applied to sparse graphs, they have been widely adopted and continue to attract considerable interest, even though their theoretical expressive power is limited to the first-order Weisfeiler-Lehman test (1-WL). In this paper, we show that if the graph convolution supports are designed in the spectral domain by a nonlinear custom function of the eigenvalues and masked with an arbitrarily large receptive field, the MPNN is theoretically more powerful than the 1-WL test and experimentally as powerful as existing 3-WL models, while remaining spatially localized. Moreover, by designing custom filter functions, the outputs can have various frequency components, which allows the convolution process to learn different relationships between a given input graph signal and its associated properties. So far, the best 3-WL-equivalent graph neural networks have a computational complexity of O(n^3) and memory usage of O(n^2), rely on non-local update mechanisms, and do not provide spectrally rich output profiles. The proposed method overcomes all of these problems and reaches state-of-the-art results on many downstream tasks.
Pages: 10
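To make the abstract's central construction concrete, the following is a minimal sketch, not the authors' implementation, of how spectral-domain convolution supports can be designed: each support applies a custom nonlinear function of the graph Laplacian eigenvalues and is then masked to a k-hop receptive field so the operator stays spatially localized. The helper names (normalized_laplacian, khop_mask, spectral_supports) and the particular filter bank are illustrative assumptions rather than the paper's exact design.

```python
import numpy as np

def normalized_laplacian(A):
    """Symmetric normalized Laplacian L = I - D^{-1/2} A D^{-1/2}."""
    d = A.sum(axis=1)
    d_inv_sqrt = np.zeros_like(d)
    d_inv_sqrt[d > 0] = d[d > 0] ** -0.5
    return np.eye(len(A)) - d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]

def khop_mask(A, k):
    """Binary mask of node pairs within k hops: the receptive field of a support."""
    hop = np.eye(len(A)) + A              # 1-hop reachability including self-loops
    M = (hop > 0).astype(float)
    for _ in range(k - 1):
        M = ((M @ hop) > 0).astype(float)
    return M

def spectral_supports(A, filters, k):
    """Build masked convolution supports C_s = mask * (U f_s(Lambda) U^T)."""
    L = normalized_laplacian(A)
    lam, U = np.linalg.eigh(L)            # eigenvalues / eigenvectors of the Laplacian
    mask = khop_mask(A, k)
    # Each filter f_s shapes the frequency response of one support; the mask
    # keeps the support k-hop local, so message passing stays spatially localized.
    return [mask * (U @ np.diag(f(lam)) @ U.T) for f in filters]

# Illustrative filter bank: low-pass, high-pass, and a band-pass bump around lambda = 1.
filters = [
    lambda lam: np.exp(-lam),
    lambda lam: lam / max(lam.max(), 1e-9),
    lambda lam: np.exp(-((lam - 1.0) ** 2) / 0.1),
]

# Toy 4-node path graph.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
supports = spectral_supports(A, filters, k=2)
print([C.shape for C in supports])        # three 4 x 4 masked supports
```

In a full layer, each masked support would multiply the node feature matrix and its own learned weight matrix, with the results summed and passed through a nonlinearity, in the usual multi-support message-passing form.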