Transferable graph neural networks with deep alignment attention

Cited by: 2
Authors
Xie, Ying [1 ,2 ]
Xu, Rongbin [1 ,2 ]
Yang, Yun [3 ]
Affiliations
[1] Putian Univ, Sch Mech Elect & Informat Engn, Putian, Peoples R China
[2] Fujian Key Lab Financial Informat Proc, Fujian, Peoples R China
[3] Swinburne Univ Technol, Dept Comp Technol, Melbourne, Australia
Funding
National Natural Science Foundation of China;
Keywords
GNNs; Transfer learning; Attention mechanism;
DOI
10.1016/j.ins.2023.119232
CLC number
TP [Automation Technology, Computer Technology];
Discipline code
0812;
Abstract
When analyzing graph-structured data, graph neural networks (GNNs) use well-designed propagation mechanisms to extract node features. However, these methods incorporate information only from nodes' fixed neighborhoods or from the whole graph, without accounting for the influence of important nodes outside those neighborhoods. We propose transferable graph neural networks with deep alignment attention (TGDAA), a novel neural network framework whose aligned attention assigns different weights to relevant important nodes. In doing so, we remove several obstacles faced by graph neural networks and make TGDAA applicable to transfer-learning problems. Extensive experiments along four axes (traditional metrics, attention-related methods, varying label rates, and an ablation study) show that the proposed TGDAA outperforms ten state-of-the-art techniques on ten benchmark network datasets.
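The abstract's core idea, attending over important nodes that may lie outside a node's fixed neighborhood, can be illustrated with a minimal sketch. The paper's actual TGDAA formulation is not given in this record, so the scoring function and all names below are assumptions for illustration only, not the authors' method.

# Hypothetical sketch (not the authors' TGDAA): attention-weighted
# aggregation over "important" candidate nodes that may lie outside
# the target node's fixed neighborhood.
import numpy as np

def aligned_attention_aggregate(h_target, h_candidates):
    # Scaled dot-product alignment scores between the target node's
    # feature vector and each candidate node's feature vector.
    d = h_target.shape[-1]
    scores = h_candidates @ h_target / np.sqrt(d)
    # Softmax converts scores into attention weights that sum to 1,
    # so more relevant candidates contribute more to the aggregate.
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    # Attention-weighted sum of candidate features.
    return weights @ h_candidates

# Toy usage: one 4-dimensional target node, three candidate nodes.
rng = np.random.default_rng(0)
h_t = rng.normal(size=4)
h_c = rng.normal(size=(3, 4))
print(aligned_attention_aggregate(h_t, h_c))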
Pages: 15
Related papers
50 records in total
  • [1] Deep Attention Diffusion Graph Neural Networks for Text Classification
    Liu, Yonghao
    Guan, Renchu
    Giunchiglia, Fausto
    Liang, Yanchun
    Feng, Xiaoyue
    2021 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2021), 2021: 8142-8152
  • [2] Deep multi-graph neural networks with attention fusion for recommendation
    Song, Yuzhi
    Ye, Hailiang
    Li, Ming
    Cao, Feilong
    EXPERT SYSTEMS WITH APPLICATIONS, 2022, 191
  • [3] How transferable are features in deep neural networks?
    Yosinski, Jason
    Clune, Jeff
    Bengio, Yoshua
    Lipson, Hod
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 27 (NIPS 2014), 2014, 27
  • [4] Transferable Policies for Large Scale Wireless Networks with Graph Neural Networks
    Eisen, Mark
    Ribeiro, Alejandro
    2020 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING, 2020: 5040-5044
  • [5] Simple and deep graph attention networks
    Su, Guangxin
    Wang, Hanchen
    Zhang, Ying
    Zhang, Wenjie
    Lin, Xuemin
    KNOWLEDGE-BASED SYSTEMS, 2024, 293
  • [6] SEA: Graph Shell Attention in Graph Neural Networks
    Frey, Christian M. M.
    Ma, Yunpu
    Schubert, Matthias
    MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES, ECML PKDD 2022, PT II, 2023, 13714: 326-343
  • [7] Transferable equivariant graph neural networks for the Hamiltonians of molecules and solids
    Zhong, Yang
    Yu, Hongyu
    Su, Mao
    Gong, Xingao
    Xiang, Hongjun
    NPJ COMPUTATIONAL MATERIALS, 2023, 9 (01)
  • [8] Graph Neural Networks for Multiparallel Word Alignment
    Imani, Ayyoob
    Senel, Lutfi Kerem
    Sabet, Masoud Jalili
    Yvon, Francois
    Schuetze, Hinrich
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2022), 2022: 1384-1396
  • [9] On the distribution alignment of propagation in graph neural networks
    Zheng, Qinkai
    Xia, Xiao
    Zhang, Kun
    Kharlamov, Evgeny
    Dong, Yuxiao
    AI OPEN, 2022, 3: 218-228