SPARSIFYING THE UPDATE STEP IN GRAPH NEURAL NETWORKS

Cited by: 0
Authors
Lutzeyer, Johannes F. [1 ]
Wu, Changmin [1 ]
Vazirgiannis, Michalis [1 ]
Affiliations
[1] Ecole Polytech, Inst Polytech Paris, Lab Informat, DaSciM, Paris, France
Keywords
EXPANDER GRAPHS;
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Message-Passing Neural Networks (MPNNs), the most prominent Graph Neural Network (GNN) framework, have achieved considerable success in the analysis of graph-structured data. Concurrently, the sparsification of neural network models attracts a great deal of academic and industrial interest. In this paper, we conduct a structured, empirical study of the effect of sparsification on the trainable part of MPNNs known as the Update step. To this end, we design a series of models that successively sparsify the linear transform in the Update step. Specifically, we propose the ExpanderGNN model with a tuneable sparsification rate and the Activation-Only GNN, which has no linear transform in the Update step. In agreement with a growing trend in the literature, we change the sparsification paradigm by initialising sparse neural network architectures rather than expensively sparsifying already trained ones. Our novel benchmark models enable a better understanding of the influence of the Update step on model performance and outperform existing simplified benchmark models such as the Simple Graph Convolution. The ExpanderGNNs, and in some cases the Activation-Only models, achieve performance on par with their vanilla counterparts on several downstream tasks, while containing significantly fewer trainable parameters. Our code is publicly available at: https://github.com/ChangminWu/ExpanderGNN.
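A minimal sketch, in PyTorch, of the two Update-step variants described in the abstract; it is not the authors' implementation. The class names ExpanderLinear and ActivationOnlyUpdate, the density parameter, and the random sparsity mask (a stand-in for a true expander-graph pattern) are all illustrative assumptions, not taken from the paper or its repository.

import torch
import torch.nn as nn

class ExpanderLinear(nn.Module):
    """Linear transform whose weight matrix is masked by a fixed sparse
    pattern chosen at initialisation, so only a fraction of the entries
    is ever trained (stand-in for an expander-graph sparsity pattern)."""
    def __init__(self, in_dim, out_dim, density=0.1):
        super().__init__()
        self.weight = nn.Parameter(0.01 * torch.randn(out_dim, in_dim))
        # Random mask as a placeholder; the paper's ExpanderGNN derives the
        # pattern from expander graphs, which this sketch does not reproduce.
        self.register_buffer("mask", (torch.rand(out_dim, in_dim) < density).float())

    def forward(self, x):
        return x @ (self.weight * self.mask).t()

class ActivationOnlyUpdate(nn.Module):
    """Update step with no linear transform at all: aggregate neighbour
    features with the (normalised) adjacency, then apply a non-linearity."""
    def forward(self, x, adj):
        return torch.relu(adj @ x)  # adj: (n, n), x: (n, d)

# Toy usage: apply each sparsified Update step to random node features;
# the identity matrix stands in for a normalised adjacency of a 5-node graph.
x, adj = torch.randn(5, 8), torch.eye(5)
expander_out = torch.relu(ExpanderLinear(8, 8, density=0.2)(adj @ x))
activation_only_out = ActivationOnlyUpdate()(x, adj)
print(expander_out.shape, activation_only_out.shape)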
Pages: 11
Related Papers
50 items in total
  • [1] Sparsifying Graph Neural Networks with Compressive Sensing
    Islam, Mohammad Munzurul
    Alawad, Mohammed
    PROCEEDING OF THE GREAT LAKES SYMPOSIUM ON VLSI 2024, GLSVLSI 2024, 2024, : 315 - 318
  • [2] Binary Domain Generalization for Sparsifying Binary Neural Networks
    Schiavone, Riccardo
    Galati, Francesco
    Zuluaga, Maria A.
    MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES: RESEARCH TRACK, ECML PKDD 2023, PT II, 2023, 14170 : 123 - 140
  • [3] Ranking and Sparsifying a Connection Graph
    Chung, Fan
    Zhao, Wenbo
    Kempton, Mark
    INTERNET MATHEMATICS, 2014, 10 (1-2) : 87 - 115
  • [4] Learning Cooperative Beamforming with Edge-Update Empowered Graph Neural Networks
    Wang, Yunqi
    Li, Yang
    Shi, Qingjiang
    Wu, Yik-Chung
    ICC 2023-IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS, 2023, : 5111 - 5116
  • [5] Graph neural networks
    Corso, G.
    Stark, H.
    Jegelka, S.
    Jaakkola, T.
    Barzilay, R.
    NATURE REVIEWS METHODS PRIMERS, 2024, 4 (01):
  • [6] Graph neural networks
[No author listed]
    NATURE REVIEWS METHODS PRIMERS, 2024, 4 (01):
  • [7] Graph Neural Networks for Graph Drawing
    Tiezzi, Matteo
    Ciravegna, Gabriele
    Gori, Marco
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (04) : 4668 - 4681
  • [8] Graph Mining with Graph Neural Networks
    Jin, Wei
    WSDM '21: PROCEEDINGS OF THE 14TH ACM INTERNATIONAL CONFERENCE ON WEB SEARCH AND DATA MINING, 2021, : 1119 - 1120
  • [9] Graph Clustering with Graph Neural Networks
    Tsitsulin, Anton
    Palowitch, John
    Perozzi, Bryan
    Mueller, Emmanuel
    JOURNAL OF MACHINE LEARNING RESEARCH, 2023, 24
  • [10] Graphs, Convolutions, and Neural Networks: From Graph Filters to Graph Neural Networks
    Gama, Fernando
    Isufi, Elvin
    Leus, Geert
    Ribeiro, Alejandro
    IEEE SIGNAL PROCESSING MAGAZINE, 2020, 37 (06) : 128 - 138