Graph Information Vanishing Phenomenon in Implicit Graph Neural Networks

Cited by: 0
Authors
He, Silu [1 ]
Cao, Jun [1 ]
Yuan, Hongyuan [1 ]
Chen, Zhe [1 ]
Gao, Shijuan [1 ,2 ]
Li, Haifeng [1 ]
Affiliations
[1] Cent South Univ, Sch Geosci & Info Phys, Changsha 410083, Peoples R China
[2] Cent South Univ, Informat & Network Ctr, Changsha 410083, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
graph neural network; graph information; joint training; graph curvature; 68-XX; CONVOLUTIONAL NETWORKS; RICCI CURVATURE;
DOI
10.3390/math12172659
CLC number
O1 [Mathematics];
Discipline code
0701; 070101;
Abstract
Graph neural networks (GNNs) have been highly successful in graph representation learning. The goal of a GNN is to enrich node representations by aggregating information from neighboring nodes, and much work has attempted to improve the quality of this aggregation by introducing various kinds of graph information with representational power. We refer to the class of GNNs that encode such graph information into the weights of neighboring nodes through learnable transformation structures (LTSs) as implicit GNNs. However, we argue that LTSs merely transform graph information into neighbor weights in whatever direction minimizes the loss function during training, and do not actually exploit the effective properties of that graph information; we call this phenomenon graph information vanishing (GIV). To validate this claim, we perform thousands of experiments on seven node classification benchmark datasets. We first replace the graph information used by five implicit GNNs with random values and, surprisingly, observe that accuracy varies by less than ±0.3%. We then quantify the similarity between the weights generated from graph information and those generated from random values using cosine similarity, and the cosine similarities exceed 0.99. These empirical results show that graph information amounts to no more than an initialization of the LTS input. We believe that using graph information as an additional supervised signal to constrain GNN training can effectively resolve GIV. We therefore propose GinfoNN, which uses both node labels and discrete graph curvature as supervised signals to jointly constrain model training. Experimental results show that the classification accuracy of GinfoNN improves by two percentage points over baselines on large and dense datasets.
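The GIV test described in the abstract, replacing graph information with random values and comparing the neighbor weights an LTS produces in each case via cosine similarity, can be sketched as follows. This is a minimal toy illustration, not the paper's implementation: the LTS is reduced to a single shared linear scoring layer followed by a softmax, and all names (`lts_weights`, the data shapes, the use of Gaussian noise) are illustrative assumptions.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two flattened weight vectors."""
    a, b = np.ravel(a), np.ravel(b)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def lts_weights(info, W):
    """Toy LTS: score each neighbor's graph-information vector with a
    shared linear layer, then softmax the scores into neighbor weights."""
    scores = (info @ W).ravel()
    e = np.exp(scores - scores.max())  # numerically stable softmax
    return e / e.sum()

rng = np.random.default_rng(0)
W = rng.normal(size=(8, 1))             # shared LTS parameters
curvature = rng.normal(size=(100, 8))   # stand-in for "real" graph information
noise = rng.normal(size=(100, 8))       # random replacement, as in the GIV test

sim = cosine_similarity(lts_weights(curvature, W), lts_weights(noise, W))
print(f"cosine similarity of neighbor weights: {sim:.4f}")
```

In the paper's experiments this comparison is run on trained implicit GNNs, where a similarity above 0.99 indicates the LTS output barely depends on whether real graph information or noise was fed in.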
Pages: 19