A unified framework for backpropagation-free soft and hard gated graph neural networks

Cited by: 0
Authors
Luca Pasa
Nicolò Navarin
Wolfgang Erb
Alessandro Sperduti
Affiliations
[1] University of Padua, Department of Mathematics
[2] DISI
[3] University of Trento
Source
Knowledge and Information Systems, 2024, 66(04)
Keywords
Graph convolutional networks; Graph neural network; Deep learning; Structured data; Machine learning on graphs
DOI: not available
Abstract
We propose a framework for the definition of neural models for graphs that do not rely on backpropagation for training, thus making learning more biologically plausible and amenable to parallel implementation. Our proposed framework is inspired by Gated Linear Networks and allows the adoption of multiple graph convolutions. Specifically, each neuron is defined as a set of graph convolution filters (weight vectors) and a gating mechanism that, given a node and its topological context, generates the weight vector to use for processing the node’s attributes. Two different graph processing schemes are studied: a message-passing aggregation scheme where the gating mechanism is embedded directly into the graph convolution, and a multi-resolution one where neighboring nodes at different topological distances are jointly processed by a single graph convolution layer. We also compare the effectiveness of different alternatives for defining the context function of a node, i.e., based on hyperplanes or on prototypes, using either a soft- or a hard-gating mechanism. We propose a unified theoretical framework that allows us to characterize the proposed models’ expressiveness. We experimentally evaluate our backpropagation-free graph convolutional neural models on commonly adopted node classification datasets and show competitive performance compared to the backpropagation-based counterparts.
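The hard-gating scheme described above can be sketched as a toy example: a node's topological context is mapped, via random hyperplanes (in the style of Gated Linear Networks), to an index that selects which weight vector processes the node's aggregated neighborhood features. This is an illustrative sketch under assumed names and shapes, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

class HardGatedGraphNeuron:
    """Toy hard-gated graph neuron: the context selects the filter.

    Illustrative only; the class name, shapes, and the random-hyperplane
    context function are assumptions, not the paper's exact model.
    """

    def __init__(self, in_dim, ctx_dim, n_hyperplanes=3):
        # One weight vector per context region: 2**n_hyperplanes filters.
        self.weights = rng.normal(size=(2 ** n_hyperplanes, in_dim))
        # Fixed random hyperplanes partition the context space (hard gating).
        self.hyperplanes = rng.normal(size=(n_hyperplanes, ctx_dim))

    def context_id(self, ctx):
        # Sign pattern of the hyperplane projections -> integer region index.
        bits = (self.hyperplanes @ ctx > 0).astype(int)
        return int(bits @ (2 ** np.arange(len(bits))))

    def forward(self, x_neigh_sum, ctx):
        # The selected filter processes the aggregated neighborhood features.
        w = self.weights[self.context_id(ctx)]
        return float(w @ x_neigh_sum)

# Usage: sum-aggregate a node's neighborhood (message passing),
# then gate on the node's own feature vector as its context.
neuron = HardGatedGraphNeuron(in_dim=4, ctx_dim=4)
x = rng.normal(size=(5, 4))          # features of a node and 4 neighbors
h = neuron.forward(x.sum(axis=0), ctx=x[0])
```

Because the hyperplanes are fixed, only the per-region weight vectors need fitting, which is what makes per-neuron, backpropagation-free training schemes applicable.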
Pages: 2393-2416
Page count: 23
Related papers (50 total)
  • [1] Pasa, Luca; Navarin, Nicolo; Erb, Wolfgang; Sperduti, Alessandro. A unified framework for backpropagation-free soft and hard gated graph neural networks. Knowledge and Information Systems, 2024, 66(04): 2393-2416.
  • [2] Pasa, Luca; Navarin, Nicolo; Erb, Wolfgang; Sperduti, Alessandro. Backpropagation-free graph neural networks. 2022 IEEE International Conference on Data Mining (ICDM), 2022: 388-397.
  • [3] Momeni, Ali; Rahmani, Babak; Mallejac, Matthieu; del Hougne, Philipp; Fleury, Romain. Backpropagation-free training of deep physical neural networks. Science, 2023, 382(6676): 1297-1303.
  • [4] Radhakrishnan, Adityanarayanan; Beaglehole, Daniel; Pandit, Parthe; Belkin, Mikhail. Mechanism for feature learning in neural networks and backpropagation-free machine learning models. Science, 2024, 383(6690): 1461-1467.
  • [5] Huang, Jing; Yang, Jie. UniGNN: a unified framework for graph and hypergraph neural networks. Proceedings of the Thirtieth International Joint Conference on Artificial Intelligence (IJCAI 2021), 2021: 2563-2569.
  • [6] Pan, Xuran; Han, Xiaoyan; Wang, Chaofei; Li, Zhuo; Song, Shiji; Huang, Gao; Wu, Cheng. A unified framework for convolution-based graph neural networks. Pattern Recognition, 2024, 155.
  • [7] Liu, Shangqing. A unified framework to learn program semantics with graph neural networks. 2020 35th IEEE/ACM International Conference on Automated Software Engineering (ASE 2020), 2020: 1364-1366.
  • [8] Elsaid, Abdelrahman; Ricanek, Karl; Lyu, Zimeng; Ororbia, Alexander; Desell, Travis. Backpropagation-free 4D continuous ant-based neural topology search. Applied Soft Computing, 2023, 147.
  • [9] Ruiz, Luana; Gama, Fernando; Ribeiro, Alejandro. Gated graph recurrent neural networks. IEEE Transactions on Signal Processing, 2020, 68: 6303-6318.
  • [10] Chen, Y. R.; Delwiche, S. R.; Hruschka, W. R. Classification of hard red wheat by feedforward backpropagation neural networks. Cereal Chemistry, 1995, 72(03): 317-319.