HGATE: Heterogeneous Graph Attention Auto-Encoders

Cited by: 17
Authors
Wang, Wei [1 ]
Suo, Xiaoyang [1 ]
Wei, Xiangyu [1 ]
Wang, Bin [2 ]
Wang, Hao [3 ]
Dai, Hong-Ning [4 ]
Zhang, Xiangliang [5 ]
Affiliations
[1] Beijing Jiaotong Univ, Beijing Key Lab Secur & Privacy Intelligent Transp, Beijing 100044, Peoples R China
[2] Zhejiang Key Lab Multidimens Percept Technol Appli, Hangzhou 310053, Peoples R China
[3] Zhejiang Lab, Res Ctr Opt Fiber Sensing, Hangzhou 310000, Peoples R China
[4] Lingnan Univ, Dept Comp & Decis Sci, Hong Kong, Peoples R China
[5] Univ Notre Dame, Dept Comp Sci & Engn, Notre Dame, IN USA
Funding
National Natural Science Foundation of China; National Key Research and Development Program of China;
Keywords
Semantics; Representation learning; Graph neural networks; Decoding; Unsupervised learning; Task analysis; Labeling; Graph embedding representation; heterogeneous graphs; hierarchical attention; transductive learning; inductive learning;
DOI
10.1109/TKDE.2021.3138788
CLC Number
TP18 [Theory of Artificial Intelligence];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
The graph auto-encoder is a framework for unsupervised learning on graph-structured data that represents graphs in a low-dimensional space, and it has proven very powerful for graph analytics. In the real world, complex relationships among various entities can be represented by heterogeneous graphs, which carry richer semantic information than homogeneous graphs. In general, graph auto-encoders designed for homogeneous graphs are not applicable to heterogeneous graphs. In addition, little work has been done to evaluate the effect of different semantics on node embeddings in heterogeneous graphs for unsupervised graph representation learning. In this work, we propose Heterogeneous Graph Attention Auto-Encoders (HGATE), a novel architecture for unsupervised representation learning on heterogeneous graph-structured data. Taking semantic information into account, HGATE reconstructs not only the edges of the heterogeneous graph but also the node attributes, through stacked encoder/decoder layers. Hierarchical attention is used to learn the relevance between a node and its meta-path-based neighbors, as well as the relevance among different meta-paths. HGATE is applicable to both transductive and inductive learning. Node classification and link prediction experiments on real-world heterogeneous graph datasets demonstrate the effectiveness of HGATE for both transductive and inductive tasks.
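To make the hierarchical-attention auto-encoding idea described in the abstract concrete, the following is a minimal, illustrative PyTorch sketch, not the authors' released HGATE code: node-level attention aggregates each node's meta-path-based neighbors for one meta-path, semantic-level attention fuses the per-meta-path embeddings, and simple decoders reconstruct node attributes and edges. All class names (NodeLevelAttention, SemanticAttention, HGATESketch), dimensions, and the toy data are assumptions made for illustration only.

```python
# Minimal sketch of a hierarchical-attention graph auto-encoder in the spirit
# of HGATE (assumed structure, not the authors' implementation).
import torch
import torch.nn as nn
import torch.nn.functional as F


class NodeLevelAttention(nn.Module):
    """GAT-style attention over meta-path-based neighbors for one meta-path."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.W = nn.Linear(in_dim, out_dim, bias=False)
        self.a = nn.Linear(2 * out_dim, 1, bias=False)

    def forward(self, x, adj):                        # x: [N, in_dim], adj: [N, N]
        h = self.W(x)                                 # [N, out_dim]
        N = h.size(0)
        # pairwise attention logits e_ij = a([h_i || h_j])
        e = self.a(torch.cat([h.unsqueeze(1).expand(N, N, -1),
                              h.unsqueeze(0).expand(N, N, -1)], dim=-1)).squeeze(-1)
        e = F.leaky_relu(e)
        e = e.masked_fill(adj == 0, float('-inf'))    # attend only to meta-path neighbors
        alpha = torch.softmax(e, dim=-1)
        alpha = torch.nan_to_num(alpha)               # guard rows with no neighbors
        return F.elu(alpha @ h)                       # [N, out_dim]


class SemanticAttention(nn.Module):
    """Weights the per-meta-path embeddings of each node."""
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.proj = nn.Sequential(nn.Linear(dim, hidden), nn.Tanh(),
                                  nn.Linear(hidden, 1, bias=False))

    def forward(self, z):                             # z: [P, N, dim] for P meta-paths
        w = self.proj(z).mean(dim=1)                  # [P, 1] importance per meta-path
        beta = torch.softmax(w, dim=0).unsqueeze(-1)  # [P, 1, 1]
        return (beta * z).sum(dim=0)                  # [N, dim] fused embeddings


class HGATESketch(nn.Module):
    def __init__(self, in_dim, hid_dim, num_metapaths):
        super().__init__()
        self.encoders = nn.ModuleList(
            [NodeLevelAttention(in_dim, hid_dim) for _ in range(num_metapaths)])
        self.semantic = SemanticAttention(hid_dim)
        self.attr_decoder = nn.Linear(hid_dim, in_dim)    # reconstruct node attributes

    def forward(self, x, adjs):                       # adjs: list of [N, N] meta-path graphs
        z = torch.stack([enc(x, a) for enc, a in zip(self.encoders, adjs)])
        z = self.semantic(z)
        x_hat = self.attr_decoder(z)                  # attribute reconstruction
        a_hat = torch.sigmoid(z @ z.t())              # edge reconstruction (inner product)
        return z, x_hat, a_hat


if __name__ == "__main__":
    # toy example: 6 nodes, 8-dim attributes, 2 meta-path-based adjacency matrices
    x = torch.randn(6, 8)
    adjs = [(torch.rand(6, 6) < 0.4).float() + torch.eye(6) for _ in range(2)]
    model = HGATESketch(in_dim=8, hid_dim=16, num_metapaths=2)
    z, x_hat, a_hat = model(x, adjs)
    # joint reconstruction loss over attributes and edges (one meta-path graph shown)
    loss = F.mse_loss(x_hat, x) + F.binary_cross_entropy(a_hat, (adjs[0] > 0).float())
    loss.backward()
    print(z.shape, x_hat.shape, a_hat.shape)
```

The learned embeddings z could then be fed to downstream node classification or link prediction, in either a transductive setting (all nodes seen during training) or an inductive one (new nodes encoded with the trained attention layers).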
Pages: 3938-3951
Page count: 14
Related Papers
50 records in total
  • [31] Yang, Bo; Chen, Hailin. Predicting circRNA-drug sensitivity associations by learning multimodal networks using graph auto-encoders and attention mechanism. BRIEFINGS IN BIOINFORMATICS, 2023, 24 (01)
  • [32] Liu, Yinbo; Ren, Xinxin; Li, Jun; Chen, Xiao; Zhu, Xiaolei. Prediction of circRNA-drug sensitivity using random auto-encoders and multi-layer heterogeneous graph transformers. APPLIED INTELLIGENCE, 2025, 55 (4)
  • [33] Silva, Ana B. O. V.; Spinosa, E. J. Graph Convolutional Auto-Encoders for Predicting Novel lncRNA-Disease Associations. IEEE-ACM TRANSACTIONS ON COMPUTATIONAL BIOLOGY AND BIOINFORMATICS, 2022, 19 (04): 2264-2271
  • [34] Im, Daniel Jiwoong; Taylor, Graham W. Scoring and Classifying with Gated Auto-Encoders. MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES, ECML PKDD 2015, PT I, 2015, 9284: 533-545
  • [35] Sinha, Samarth; Dieng, Adji B. Consistency Regularization for Variational Auto-Encoders. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [36] Li, Z.; Jin, X.; Wang, Y.; Meng, L.; Zhuang, C.; Sun, Z. Anomaly Node Detection Method Based on Variational Graph Auto-Encoders in Attribute Networks. Moshi Shibie yu Rengong Zhineng / Pattern Recognition and Artificial Intelligence, 2022, 35 (01): 17-25
  • [37] Zhang, Juntao; Lin, Nanzhou; Zhang, Xuelong; Song, Wei; Yang, Xiandi; Peng, Zhiyong. Learning Concept Prerequisite Relations from Educational Data via Multi-Head Attention Variational Graph Auto-Encoders. WSDM'22: PROCEEDINGS OF THE FIFTEENTH ACM INTERNATIONAL CONFERENCE ON WEB SEARCH AND DATA MINING, 2022: 1377-1385
  • [38] Liu, Weifeng; Ma, Tengzhou; Tao, Dapeng; You, Jane. HSAE: A Hessian regularized sparse auto-encoders. NEUROCOMPUTING, 2016, 187: 59-65
  • [39] Turinici, Gabriel. Radon-Sobolev Variational Auto-Encoders. NEURAL NETWORKS, 2021, 141: 294-305
  • [40] Zhang, Juntao; Lan, Hai; Yang, Xiandi; Zhang, Shuaichao; Song, Wei; Peng, Zhiyong. Weakly supervised setting for learning concept prerequisite relations using multi-head attention variational graph auto-encoders. KNOWLEDGE-BASED SYSTEMS, 2022, 247