SE-GRU: Structure Embedded Gated Recurrent Unit Neural Networks for Temporal Link Prediction

Cited by: 6
Authors
Yin, Yanting [1 ]
Wu, Yajing [2 ]
Yang, Xuebing [2 ]
Zhang, Wensheng [2 ,3 ]
Yuan, Xiaojie [1 ]
Affiliations
[1] Nankai Univ, Coll Comp Sci, Tianjin Key Lab Network & Data Secur Technol, Tianjin 300350, Peoples R China
[2] Chinese Acad Sci, Inst Automat, Beijing 100190, Peoples R China
[3] Nankai Univ, Coll Comp Sci, Tianjin 300350, Peoples R China
Funding
National Natural Science Foundation of China; National Key Research and Development Program of China;
Keywords
Time-frequency analysis; Feature extraction; Predictive models; Optimization; Topology; Measurement; Logic gates; Temporal link prediction; dynamic graphs; graph embedding; neural networks;
DOI
10.1109/TNSE.2022.3164659
Chinese Library Classification (CLC)
T [Industrial Technology];
Discipline classification code
08;
Abstract
Temporal link prediction on dynamic graphs is essential to areas such as recommendation systems, social networks, and citation analysis, and thus attracts great attention in both research and industry. For the complex graphs found in real-world applications, recent temporal link prediction methods perform well on high-frequency and nearby connections, but prediction becomes considerably harder for low-frequency and earlier connections. In this work, we introduce a novel and elegant prediction architecture, Structure Embedded Gated Recurrent Unit (SE-GRU) neural networks, to strengthen prediction robustness against frequency variation and occurrence delay of connections. SE-GRU embeds the structure of local topological characteristics to emphasize the differing connection frequencies between nodes, and captures temporal dependencies to avoid losing valuable information caused by long-term changes. The network is optimized with three terms concerning reconstruction, structure, and evolution. Extensive experiments on three public datasets demonstrate the significant superiority of SE-GRU over five representative state-of-the-art competitors under three evaluation metrics. The results validate the effectiveness and robustness of the proposed method, showing that the frequencies and timestamps of connections have little to no negative impact on prediction accuracy.
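The abstract gives only a high-level view of the architecture: a structure embedding over each graph snapshot feeding a GRU, trained with reconstruction, structure, and evolution terms. The sketch below is a minimal, hypothetical PyTorch illustration of that composition; the class SEGRUSketch, the function sketch_loss, and all layer choices and loss weights are assumptions for illustration, not the paper's actual SE-GRU implementation.

```python
# Minimal, illustrative sketch of a structure-embedded GRU for temporal link
# prediction. All names and loss weights are hypothetical; the paper's exact
# layer definitions and objective are not reproduced here.
import torch
import torch.nn as nn


class SEGRUSketch(nn.Module):
    def __init__(self, num_nodes: int, embed_dim: int = 64, hidden_dim: int = 128):
        super().__init__()
        # Structure embedding: project each node's neighborhood (adjacency row)
        # into a low-dimensional vector reflecting local topology.
        self.struct_embed = nn.Linear(num_nodes, embed_dim)
        # GRU over the per-snapshot structure embeddings, capturing temporal
        # dependencies across the sequence of graph snapshots.
        self.gru = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        # Decoder: map the final hidden state to node representations whose
        # inner products score candidate links at the next timestep.
        self.decoder = nn.Linear(hidden_dim, embed_dim)

    def forward(self, adj_seq: torch.Tensor) -> torch.Tensor:
        # adj_seq: (T, N, N) sequence of adjacency matrices.
        z = torch.relu(self.struct_embed(adj_seq))    # (T, N, embed_dim)
        z = z.transpose(0, 1)                         # (N, T, embed_dim): one sequence per node
        _, h_last = self.gru(z)                       # (1, N, hidden_dim)
        node_repr = self.decoder(h_last.squeeze(0))   # (N, embed_dim)
        return node_repr @ node_repr.t()              # (N, N) link scores for t+1


def sketch_loss(scores, next_adj, cur_adj, prev_scores=None, alpha=1.0, beta=0.1):
    # Three illustrative terms loosely mirroring "reconstruction, structure,
    # evolution"; alpha and beta are placeholder trade-off weights.
    bce = nn.BCEWithLogitsLoss()
    recon = bce(scores, next_adj)                     # reconstruct the next snapshot
    struct = bce(scores, cur_adj)                     # stay consistent with current structure
    evol = torch.tensor(0.0)
    if prev_scores is not None:
        evol = torch.mean((scores - prev_scores) ** 2)  # smooth temporal evolution
    return recon + alpha * struct + beta * evol


# Usage on random snapshots:
# T, N = 5, 20
# adj_seq = (torch.rand(T, N, N) < 0.1).float()
# model = SEGRUSketch(num_nodes=N)
# scores = model(adj_seq[:-1])
# loss = sketch_loss(scores, adj_seq[-1], adj_seq[-2])
# loss.backward()
```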
Pages: 2495-2509
Page count: 15