SPARSIFYING THE UPDATE STEP IN GRAPH NEURAL NETWORKS

Cited: 0
Authors
Lutzeyer, Johannes F. [1 ]
Wu, Changmin [1 ]
Vazirgiannis, Michalis [1 ]
Affiliations
[1] École Polytechnique, Institut Polytechnique de Paris, Laboratoire d'Informatique, DaSciM, Paris, France
Keywords
EXPANDER GRAPHS;
DOI
Not available
CLC number (Chinese Library Classification)
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Message-Passing Neural Networks (MPNNs), the most prominent Graph Neural Network (GNN) framework, have achieved considerable success in the analysis of graph-structured data. Concurrently, the sparsification of neural network models has attracted substantial academic and industrial interest. In this paper, we conduct a structured, empirical study of the effect of sparsification on the trainable part of MPNNs, known as the Update step. To this end, we design a series of models that successively sparsify the linear transform in the Update step. Specifically, we propose the ExpanderGNN model, with a tuneable sparsification rate, and the Activation-Only GNN, which has no linear transform in the Update step. In agreement with a growing trend in the literature, we change the sparsification paradigm: sparse neural network architectures are initialised as such, rather than expensively sparsified after training. Our novel benchmark models enable a better understanding of the influence of the Update step on model performance and outperform existing simplified benchmark models such as the Simple Graph Convolution. The ExpanderGNNs, and in some cases the Activation-Only models, achieve performance on par with their vanilla counterparts on several downstream tasks while containing significantly fewer trainable parameters. Our code is publicly available at: https://github.com/ChangminWu/ExpanderGNN.
Pages: 11
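
The abstract describes two sparsification regimes for the Update step: an expander-pattern sparse linear transform with a tuneable density (ExpanderGNN) and an Update step reduced to the activation function alone (Activation-Only GNN). The following PyTorch sketch is a minimal illustration of these two ideas, not the authors' released implementation: the class names, the density argument, and the random fixed-degree mask (which only approximates a true expander construction) are assumptions made for illustration.

```python
import torch
import torch.nn as nn


class ExpanderLinear(nn.Module):
    """Linear layer whose weights are masked by a fixed sparse pattern
    chosen at initialisation: each output unit keeps the same number of
    randomly chosen incoming connections (expander-graph style)."""

    def __init__(self, in_features: int, out_features: int, density: float = 0.1):
        super().__init__()
        self.weight = nn.Parameter(torch.empty(out_features, in_features))
        nn.init.kaiming_uniform_(self.weight, a=5 ** 0.5)
        # Fixed, non-trainable 0/1 mask with regular out-degree; a regular
        # random bipartite graph is a common stand-in for an expander.
        k = max(1, int(density * in_features))
        mask = torch.zeros(out_features, in_features)
        for row in range(out_features):
            mask[row, torch.randperm(in_features)[:k]] = 1.0
        self.register_buffer("mask", mask)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Masked-out weights never contribute and receive zero gradient.
        return x @ (self.weight * self.mask).t()


class ActivationOnlyGCNLayer(nn.Module):
    """GCN-style layer whose Update step is the activation alone:
    h' = relu(A_hat @ h), i.e. no trainable linear transform."""

    def forward(self, adj_hat: torch.Tensor, h: torch.Tensor) -> torch.Tensor:
        return torch.relu(adj_hat @ h)
```

Because the mask is fixed at initialisation and only the unmasked weights receive gradient, the trainable parameter count of ExpanderLinear scales linearly with the density; for instance, ExpanderLinear(64, 64, density=0.1) trains roughly 10% of the weights of a dense 64-by-64 layer, mirroring the tuneable sparsification rate described in the abstract.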