Temporal Link Prediction via Auxiliary Graph Transformer

Times Cited: 0
Authors
Tan, Tao [1,2]
Cao, Xianbin [1,2]
Song, Fansheng [1,2]
Chen, Shenwen [1,2]
Du, Wenbo [1,2]
Li, Yumeng [1,2]
Affiliations
[1] Beihang Univ, Sch Elect & Informat Engn, Beijing 100191, Peoples R China
[2] State Key Lab CNS ATM Beijing, Beijing 100191, Peoples R China
Funding
National Key R&D Program of China; National Natural Science Foundation of China;
Keywords
Temporal link prediction; evolved edges; cross-attention; auxiliary learning; PREDICTABILITY;
DOI
10.1109/TNSE.2024.3485093
CLC classification number
T [Industrial Technology];
Discipline classification code
08;
Abstract
Temporal link prediction is fundamental to analyzing and forecasting the behavior of real evolving complex systems. Recently, advances in graph learning on temporal network snapshots have offered a promising approach to predicting the evolving topology. However, previous methods consider only a temporal-structural encoding of the entire network, so the crucial evolutionary characteristics are overshadowed by the large amount of invariant structural information. In this paper, we delve into the evolving topology and propose an auxiliary learning framework that captures not only the overall network evolution patterns but also the time-varying regularity of the evolved edges. Specifically, we utilize a graph transformer to infer the temporal networks, incorporating a temporal cross-attention mechanism to refine the dynamic graph representation. In parallel, a dynamic difference transformer is designed to infer the evolved edges; it serves as an auxiliary task, and its output is aggregated with the graph representation to generate the final prediction. Extensive experiments are conducted on eight real-world temporal networks from various scenarios. The results show that our auxiliary learning framework outperforms the baselines, demonstrating the superiority of the proposed method in extracting evolution patterns.
Pages: 5954-5968
Number of pages: 15
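
The abstract outlines a two-branch, auxiliary-learning design: a graph transformer encodes whole-network snapshots and is refined by temporal cross-attention, while a dynamic difference transformer encodes the evolved (changed) edges as an auxiliary task, and the two representations are aggregated to score future links. Below is a minimal PyTorch sketch of that idea, not the authors' implementation: the SnapshotEncoder and AuxiliaryLinkPredictor classes, the layer sizes, the use of nn.MultiheadAttention as the temporal cross-attention, and the pairwise scoring rule are all illustrative assumptions.

```python
# Hypothetical sketch of the auxiliary-learning idea described in the abstract:
# a main branch encodes whole-network snapshots, an auxiliary branch encodes
# snapshot differences (evolved edges), and the two are fused to score links.
import torch
import torch.nn as nn


class SnapshotEncoder(nn.Module):
    """Encodes one adjacency snapshot into per-node embeddings (stand-in for a graph transformer)."""

    def __init__(self, num_nodes: int, dim: int):
        super().__init__()
        self.proj = nn.Linear(num_nodes, dim)  # adjacency row as raw node feature
        self.block = nn.TransformerEncoderLayer(dim, nhead=4, batch_first=True)

    def forward(self, adj: torch.Tensor) -> torch.Tensor:
        # adj: (N, N) -> node embeddings (N, dim)
        h = self.proj(adj).unsqueeze(0)        # (1, N, dim)
        return self.block(h).squeeze(0)


class AuxiliaryLinkPredictor(nn.Module):
    def __init__(self, num_nodes: int, dim: int = 64):
        super().__init__()
        self.net_enc = SnapshotEncoder(num_nodes, dim)   # whole-network branch
        self.diff_enc = SnapshotEncoder(num_nodes, dim)  # evolved-edge (difference) branch
        self.cross_attn = nn.MultiheadAttention(dim, num_heads=4, batch_first=True)
        self.score = nn.Linear(2 * dim, 1)

    def forward(self, snapshots: list[torch.Tensor]) -> torch.Tensor:
        # Encode each historical snapshot and its difference from the previous one.
        net_h = torch.stack([self.net_enc(a) for a in snapshots])            # (T, N, dim)
        diffs = [snapshots[0]] + [b - a for a, b in zip(snapshots, snapshots[1:])]
        diff_h = torch.stack([self.diff_enc(d) for d in diffs])              # (T, N, dim)

        # Temporal cross-attention: the latest snapshot attends over the history,
        # per node, across time.
        q = net_h[-1].unsqueeze(1)              # (N, 1, dim)
        kv = net_h.permute(1, 0, 2)             # (N, T, dim)
        refined, _ = self.cross_attn(q, kv, kv)
        refined = refined.squeeze(1)            # (N, dim)

        # Aggregate the main and auxiliary node representations, then score all pairs.
        fused = torch.cat([refined, diff_h[-1]], dim=-1)                     # (N, 2*dim)
        pair = fused.unsqueeze(1) + fused.unsqueeze(0)                       # (N, N, 2*dim)
        return torch.sigmoid(self.score(pair)).squeeze(-1)                   # predicted adjacency


if __name__ == "__main__":
    N, T = 20, 4
    snaps = [(torch.rand(N, N) > 0.8).float() for _ in range(T)]
    model = AuxiliaryLinkPredictor(num_nodes=N)
    pred = model(snaps)                         # (N, N) link probabilities
    print(pred.shape)
```

Concatenation is used here only as the simplest possible aggregation of the main and auxiliary representations; any fusion of the two branches would fit the same overall structure described in the abstract.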