Unifying topological structure and self-attention mechanism for node classification in directed networks

Cited by: 0
Authors
Peng, Yue [1 ,2 ]
Xia, Jiwen [1 ,2 ]
Liu, Dafeng [1 ,2 ]
Liu, Miao [1 ,2 ]
Xiao, Long [1 ,2 ]
Shi, Benyun [1 ,2 ]
Affiliations
[1] Nanjing Tech Univ, Coll Comp & Informat Engn, Nanjing 211800, Peoples R China
[2] Nanjing Tech Univ, Coll Artificial Intelligence, Nanjing 211800, Peoples R China
Source
SCIENTIFIC REPORTS | 2025, Vol. 15, Issue 1
Funding
National Natural Science Foundation of China;
DOI
10.1038/s41598-024-84816-z
Chinese Library Classification
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences];
Discipline Classification Codes
07; 0710; 09;
Abstract
Graph data is essential for modeling complex relationships among entities. Graph Neural Networks (GNNs) have proven effective at processing low-order undirected graph data; in complex directed graphs, however, relationships between nodes extend beyond first-order connections to encompass higher-order relationships. The asymmetry introduced by edge directionality further complicates node interactions, making node information harder to extract. In this paper, we propose TWC-GNN, a novel graph neural network, to address this problem. TWC-GNN uses node degrees to define higher-order topological structures, assess node importance, and capture mutual interactions between central nodes and their adjacent counterparts, improving the model's grasp of complex relationships within the network. Furthermore, by integrating self-attention mechanisms, TWC-GNN gathers higher-order node information in addition to first-order node information. Experimental results demonstrate that integrating topological structure and higher-order node information is crucial to the learning process of graph neural networks, particularly in directed graphs, and leads to improved classification accuracy.
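The abstract combines two ideas: degree-based topological weights for node importance, and self-attention that aggregates over higher-order (not just first-order) neighbors in a directed graph. The paper's actual TWC-GNN architecture is not specified in this record, so the NumPy sketch below is only a minimal illustration of that general combination: it builds a mask from first- and second-order directed reachability, runs masked scaled dot-product self-attention, and reweights attention by normalized total degree. The variable names (`topo`, `mask`) and the specific normalization scheme are assumptions for illustration, not the authors' method.

```python
import numpy as np

# Toy directed graph: A[i, j] = 1 means an edge i -> j.
A = np.array([
    [0, 1, 1, 0],
    [0, 0, 1, 0],
    [0, 0, 0, 1],
    [1, 0, 0, 0],
], dtype=float)

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))                      # node features
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))

# Degree-based topological weights: in-degree + out-degree,
# normalized so more-connected nodes carry more importance.
deg = A.sum(axis=0) + A.sum(axis=1)
topo = deg / deg.sum()

# Second-order reachability adds higher-order neighbors to the mask.
A2 = (A @ A > 0).astype(float)
mask = ((A + A2) > 0).astype(float)
np.fill_diagonal(mask, 1.0)                      # keep self-connections

# Masked scaled dot-product self-attention over 1st/2nd-order neighbors.
Q, K, V = X @ Wq, X @ Wk, X @ Wv
scores = (Q @ K.T) / np.sqrt(K.shape[1])
scores = np.where(mask > 0, scores, -np.inf)
attn = np.exp(scores - scores.max(axis=1, keepdims=True))
attn /= attn.sum(axis=1, keepdims=True)

# Blend attention with the degree weights, then renormalize rows
# so each node's aggregation coefficients still sum to one.
coef = attn * topo[None, :]
coef /= coef.sum(axis=1, keepdims=True)
H = coef @ V                                     # aggregated node embeddings
print(H.shape)  # (4, 8)
```

In a full model these embeddings would feed a classification head; the point here is only that directionality enters through the asymmetric mask, while node importance enters through the degree weights.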
Pages: 11