Supervised Attention Using Homophily in Graph Neural Networks

Cited by: 0
Authors
Chatzianastasis, Michail [1 ]
Nikolentzos, Giannis [1 ]
Vazirgiannis, Michalis [1 ]
Affiliations
[1] IP Paris, Ecole Polytech, LIX, Palaiseau, France
Keywords
Graph Neural Networks; Graph Attention Networks; Supervised Attention; Classification
DOI
10.1007/978-3-031-44216-2_47
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Graph neural networks have become the standard approach for dealing with learning problems on graphs. Among the different variants of graph neural networks, graph attention networks (GATs) have been applied with great success to different tasks. In the GAT model, each node assigns an importance score to its neighbors using an attention mechanism. However, similar to other graph neural networks, GATs aggregate messages from nodes that belong to different classes, and therefore produce node representations that are not well separated with respect to the different classes, which might hurt their performance. In this work, to alleviate this problem, we propose a new technique that can be incorporated into any graph attention model to encourage higher attention scores between nodes that share the same class label. We evaluate the proposed method on several node classification datasets, demonstrating increased performance over standard baseline models.
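The abstract only describes the general idea, so the sketch below is an illustrative reconstruction rather than the authors' exact formulation. It assumes a single-head GAT-style layer over a dense adjacency matrix and adds a hypothetical auxiliary binary-cross-entropy term (here called attention_supervision_loss, weighted by an assumed factor lam) that pushes attention coefficients toward neighbors sharing the same class label among labeled training nodes.

```python
# Minimal sketch (assumed details, not the paper's exact method) of adding a
# homophily-based supervision term to GAT-style attention scores.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TinyGATLayer(nn.Module):
    """Single-head GAT-style layer that also returns its attention matrix."""

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.W = nn.Linear(in_dim, out_dim, bias=False)
        self.a_src = nn.Linear(out_dim, 1, bias=False)
        self.a_dst = nn.Linear(out_dim, 1, bias=False)

    def forward(self, x, adj):
        h = self.W(x)                                  # (N, out_dim)
        scores = self.a_src(h) + self.a_dst(h).T       # (N, N) pairwise logits
        scores = F.leaky_relu(scores, negative_slope=0.2)
        scores = scores.masked_fill(adj == 0, float('-inf'))
        alpha = torch.softmax(scores, dim=1)           # attention over neighbors
        return alpha @ h, alpha


def attention_supervision_loss(alpha, adj, labels, train_mask):
    """Assumed auxiliary loss: push attention toward same-class neighbors."""
    same_class = (labels.unsqueeze(0) == labels.unsqueeze(1)).float()
    target = same_class * adj                          # 1 for same-class edges
    # Only supervise edges whose both endpoints have training labels.
    edge_mask = (adj > 0) & train_mask.unsqueeze(1) & train_mask.unsqueeze(0)
    return F.binary_cross_entropy(alpha[edge_mask], target[edge_mask])


# Toy usage: 6 nodes, 2 classes, a hand-made adjacency with self-loops.
N, F_in, F_out, C = 6, 8, 4, 2
x = torch.randn(N, F_in)
labels = torch.tensor([0, 0, 0, 1, 1, 1])
train_mask = torch.tensor([True, True, False, True, True, False])
adj = torch.eye(N)
for i, j in [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (0, 5)]:
    adj[i, j] = adj[j, i] = 1.0

layer = TinyGATLayer(F_in, F_out)
clf = nn.Linear(F_out, C)
opt = torch.optim.Adam(list(layer.parameters()) + list(clf.parameters()), lr=0.01)

lam = 0.5  # weight of the auxiliary attention-supervision term (assumed value)
for _ in range(100):
    opt.zero_grad()
    h, alpha = layer(x, adj)
    logits = clf(h)
    loss = F.cross_entropy(logits[train_mask], labels[train_mask])
    loss = loss + lam * attention_supervision_loss(alpha, adj, labels, train_mask)
    loss.backward()
    opt.step()
```

In this sketch the supervision term is restricted to edges between labeled training nodes, so no test labels leak into training; the paper may define the attention target and loss differently.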
Pages: 576-586
Page count: 11
Related papers (50 in total)
  • [41] Attention-based graph neural networks: a survey
    Sun, Chengcheng
    Li, Chenhao
    Lin, Xiang
    Zheng, Tianji
    Meng, Fanrong
    Rui, Xiaobin
    Wang, Zhixiao
    ARTIFICIAL INTELLIGENCE REVIEW, 2023, 56 (SUPPL 2) : 2263 - 2310
  • [42] Transferable graph neural networks with deep alignment attention
    Xie, Ying
    Xu, Rongbin
    Yang, Yun
    INFORMATION SCIENCES, 2023, 643
  • [43] Hierarchical graph attention networks for semi-supervised node classification
    Li, Kangjie
    Feng, Yixiong
    Gao, Yicong
    Qiu, Jian
    APPLIED INTELLIGENCE, 2020, 50 : 3441 - 3451
  • [44] Graph neural networks with multiple kernel ensemble attention
    Zhang, Haimin
    Xu, Min
    KNOWLEDGE-BASED SYSTEMS, 2021, 229
  • [45] Semi-Supervised Graph Attention Networks for Event Representation Learning
    Rodrigues Mattos, Joao Pedro
    Marcacini, Ricardo M.
    2021 21ST IEEE INTERNATIONAL CONFERENCE ON DATA MINING (ICDM 2021), 2021 : 1234 - 1239
  • [46] Gated Graph Neural Attention Networks for abstractive summarization
    Liang, Zeyu
    Du, Junping
    Shao, Yingxia
    Ji, Houye
    NEUROCOMPUTING, 2021, 431 : 128 - 136
  • [47] Self-supervised role learning for graph neural networks
    Sankar, Aravind
    Wang, Junting
    Krishnan, Adit
    Sundaram, Hari
    KNOWLEDGE AND INFORMATION SYSTEMS, 2022, 64 : 2091 - 2121
  • [48] Embedding Imputation With Self-Supervised Graph Neural Networks
    Varolgunes, Uras
    Yao, Shibo
    Ma, Yao
    Yu, Dantong
    IEEE ACCESS, 2023, 11 : 70610 - 70620
  • [49] Comprehensive Study on Molecular Supervised Learning with Graph Neural Networks
    Hwang, Doyeong
    Yang, Soojung
    Kwon, Yongchan
    Lee, Kyung Hoon
    Lee, Grace
    Jo, Hanseok
    Yoon, Seyeol
    Ryu, Seongok
    JOURNAL OF CHEMICAL INFORMATION AND MODELING, 2020, 60 (12) : 5936 - 5945
  • [50] Graph Stochastic Neural Networks for Semi-supervised Learning
    Wang, Haibo
    Zhou, Chuan
    Chen, Xin
    Wu, Jia
    Pan, Shirui
    Wang, Jilong
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 33, NEURIPS 2020, 2020, 33