Balancing structure and position information in Graph Transformer network with a learnable node

Cited: 3
Authors
Hoang, Thi Linh [1 ]
Ta, Viet Cuong [1 ]
Affiliations
[1] VNU Univ Engn & Technol, Human Machine Interact Lab, 144 Xuan Thuy St, Hanoi 11300, Vietnam
Keywords
Graph Transformer; Laplacian positional encoding; Subgraph extraction; Structural encoding;
DOI
10.1016/j.eswa.2023.122096
CLC Classification
TP18 [Theory of Artificial Intelligence];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Transformer-based graph neural network models have achieved remarkable results in graph representation learning in recent years. One of the main challenges in graph representation learning with the Transformer architecture is the absence of a universal positional encoding. Standard positional encoding methods usually rely on the eigenvectors of the graph Laplacian matrix. However, the structural information carried by these eigenvectors is insufficient for graph learning tasks that depend on a node's local structure. In our work, we propose a novel node encoding that leverages both the node's global position information and its local structural information, and that generalizes well across a wide range of graph learning tasks. The global position encoding branch operates on the eigenvalues and eigenvectors of the Laplacian matrix of the entire graph. The structural encoding branch is derived through a spectral-based encoding of the local subgraph; it captures local properties that are usually omitted in Laplacian positional encoding because of the cutoff of high graph frequencies. The two encoding branches are designed with learnable weights and mapped into predefined embedding spaces, and a weighted combination is then employed to create a unique positional encoding for each node. We validate the effectiveness of our proposed encoding on various graph learning datasets, covering node classification, link prediction, graph classification, and graph regression tasks. The overall results demonstrate that our structural and positional encoding balances local and global structural information and outperforms most of the baseline models.
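The two branches described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the global branch takes the first non-trivial eigenvectors of the normalized Laplacian of the whole graph, the structural branch uses the Laplacian spectrum of each node's h-hop subgraph as a stand-in for the paper's spectral subgraph encoding, and `alpha`, the embedding dimension `d`, and the projection matrices `W_pe`/`W_se` are hypothetical placeholders for parameters the paper learns end-to-end.

```python
import numpy as np

def laplacian_pe(adj, k):
    """Global branch: first k non-trivial eigenvectors of the
    symmetric normalized graph Laplacian L = I - D^{-1/2} A D^{-1/2}."""
    deg = adj.sum(axis=1)
    d_inv_sqrt = np.where(deg > 0, deg ** -0.5, 0.0)
    lap = np.eye(len(adj)) - d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]
    _, eigvecs = np.linalg.eigh(lap)       # eigenvalues in ascending order
    return eigvecs[:, 1:k + 1]             # skip the trivial eigenvector

def local_spectral_encoding(adj, k, hops=1):
    """Structural branch (illustrative stand-in): smallest k Laplacian
    eigenvalues of each node's h-hop subgraph, capturing local shape."""
    n = len(adj)
    reach = np.linalg.matrix_power(adj + np.eye(n), hops) > 0
    feats = np.zeros((n, k))
    for v in range(n):
        idx = np.where(reach[v])[0]        # nodes within `hops` of v
        sub = adj[np.ix_(idx, idx)]
        vals = np.linalg.eigvalsh(np.diag(sub.sum(axis=1)) - sub)
        m = min(k, len(vals))
        feats[v, :m] = vals[:m]
    return feats

# Toy 4-node graph; in the paper both maps and the mixing weight are learnable.
rng = np.random.default_rng(0)
adj = np.array([[0, 1, 1, 0],
                [1, 0, 1, 0],
                [1, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
k, d = 2, 8                                 # spectral dim / embedding dim (assumed)
pe = laplacian_pe(adj, k)
se = local_spectral_encoding(adj, k)
W_pe = rng.standard_normal((k, d))          # learnable map for position branch
W_se = rng.standard_normal((k, d))          # learnable map for structure branch
alpha = 0.5                                 # combination weight (learned in the paper)
node_enc = alpha * (pe @ W_pe) + (1 - alpha) * (se @ W_se)
print(node_enc.shape)  # one d-dimensional encoding per node: (4, 8)
```

The weighted sum is what lets the model trade off global position against local structure per task; in the paper this balance is learned rather than fixed at 0.5.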
Pages: 10
Related Papers
50 records in total
  • [1] Learnable Graph Convolutional Network With Semisupervised Graph Information Bottleneck
    Zhong, Luying
    Chen, Zhaoliang
    Wu, Zhihao
    Du, Shide
    Chen, Zheyi
    Wang, Shiping
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2025, 36 (01) : 433 - 446
  • [2] Information Fusion of Topological Structure and Node Features in Graph Neural Network
    Zhang, Hongwei
    Wang, Can
    Xia, Yuanqing
    Yan, Tijin
    2021 PROCEEDINGS OF THE 40TH CHINESE CONTROL CONFERENCE (CCC), 2021, : 8204 - 8209
  • [3] NodeFormer: A Scalable Graph Structure Learning Transformer for Node Classification
    Wu, Qitian
    Zhao, Wentao
    Li, Zenan
    Wipf, David
    Yan, Junchi
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022,
  • [4] Discovering latent node Information by graph attention network
    Gu, Weiwei
    Gao, Fei
    Lou, Xiaodan
    Zhang, Jiang
    SCIENTIFIC REPORTS, 2021, 11 (01)
  • [6] Deeper Exploiting Graph Structure Information by Discrete Ricci Curvature in a Graph Transformer
    Lai, Xin
    Liu, Yang
    Qian, Rui
    Lin, Yong
    Ye, Qiwei
    ENTROPY, 2023, 25 (06)
  • [7] Improving Graph Convolutional Network with Learnable Edge Weights and Edge-Node Co-Embedding for Graph Anomaly Detection
    Tan, Xiao
    Yang, Jianfeng
    Zhao, Zhengang
    Xiao, Jinsheng
    Li, Chengwang
    SENSORS, 2024, 24 (08)
  • [8] Gapformer: Graph Transformer with Graph Pooling for Node Classification
    Liu, Chuang
    Zhan, Yibing
    Ma, Xueqi
    Ding, Liang
    Tao, Dapeng
    Wu, Jia
    Hu, Wenbin
    PROCEEDINGS OF THE THIRTY-SECOND INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, IJCAI 2023, 2023, : 2196 - 2205
  • [9] Graph Convolutional Network with Learnable Message Propagation Mechanism
    School of Computer and Data Science, Fuzhou University, Fuzhou, China
    Int. Conf. Inf., Cybern., Comput. Soc. Syst., ICCSS, 1600, (1-5):
  • [10] Improving Graph Neural Network with Learnable Permutation Pooling
    Jin, Yu
    Jaja, Joseph F.
    2022 IEEE INTERNATIONAL CONFERENCE ON DATA MINING WORKSHOPS, ICDMW, 2022, : 682 - 689