MECCH: Metapath Context Convolution-based Heterogeneous Graph Neural Networks

Cited by: 3
Authors
Fu, Xinyu [1 ]
King, Irwin [1 ]
Affiliations
[1] Chinese Univ Hong Kong, Dept Comp Sci & Engn, Hong Kong, Peoples R China
Keywords
Graph neural networks; Heterogeneous information networks; Graph representation learning
DOI
10.1016/j.neunet.2023.11.030
CLC number
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Heterogeneous graph neural networks (HGNNs) were proposed for representation learning on structured data with multiple types of nodes and edges. To address the performance degradation that arises when HGNNs become deep, researchers incorporate metapaths into HGNNs to associate nodes that are closely related in semantics but far apart in the graph. However, existing metapath-based models suffer from either information loss or high computation costs. To address these problems, we present a novel Metapath Context Convolution-based Heterogeneous Graph Neural Network (MECCH). MECCH leverages metapath contexts, a new kind of graph structure that facilitates lossless node information aggregation while avoiding any redundancy. Specifically, after feature preprocessing, MECCH applies three novel components to extract comprehensive information from the input graph efficiently: (1) metapath context construction, (2) metapath context encoder, and (3) convolutional metapath fusion. Experiments on five real-world heterogeneous graph datasets for node classification and link prediction show that MECCH achieves superior prediction accuracy compared with state-of-the-art baselines, with improved computational efficiency. The code is available at https://github.com/cynricfu/MECCH.
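The three components described above can be pictured as: gather each node's metapath contexts, encode each context into a vector, then fuse the per-metapath vectors. Below is a minimal, illustrative PyTorch sketch of stages (2) and (3), with stage (1) mocked by random tensors; the class name ToyMECCHLayer, the linear encoders, and the 1x1 convolution used for fusion are assumptions for illustration only, not the authors' implementation (see the linked repository for that).

    # Illustrative sketch only; NOT the authors' MECCH implementation.
    import torch
    import torch.nn as nn

    class ToyMECCHLayer(nn.Module):
        """(2) Encode each metapath context, then (3) fuse across metapaths
        with a learned 1x1 convolution (a weighted combination per channel)."""

        def __init__(self, in_dim, out_dim, num_metapaths):
            super().__init__()
            # One simple linear encoder per metapath (a stand-in for the
            # paper's metapath context encoder).
            self.encoders = nn.ModuleList(
                [nn.Linear(in_dim, out_dim) for _ in range(num_metapaths)]
            )
            # Fusion over the metapath axis via a 1x1 Conv1d.
            self.fuse = nn.Conv1d(num_metapaths, 1, kernel_size=1)

        def forward(self, context_feats):
            # context_feats[p]: (num_nodes, in_dim), features already aggregated
            # over each node's metapath-p context (stage (1), mocked below).
            encoded = [torch.relu(enc(x)) for enc, x in zip(self.encoders, context_feats)]
            stacked = torch.stack(encoded, dim=1)   # (N, P, out_dim)
            return self.fuse(stacked).squeeze(1)    # (N, out_dim)

    if __name__ == "__main__":
        num_nodes, in_dim, out_dim, num_paths = 8, 16, 32, 3
        contexts = [torch.randn(num_nodes, in_dim) for _ in range(num_paths)]
        out = ToyMECCHLayer(in_dim, out_dim, num_paths)(contexts)
        print(out.shape)  # torch.Size([8, 32])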
Pages: 266 - 275
Number of pages: 10
Related papers
50 in total
  • [41] RepGCN: A Novel Graph Convolution-Based Model for Gait Recognition with Accompanying Behaviors
    Mei, Zijie
    Mei, Zhanyong
    Tong, He
    Yi, Sijia
    Zeng, Hui
    Li, Yingyi
    PATTERN RECOGNITION AND COMPUTER VISION, PRCV 2023, PT V, 2024, 14429 : 147 - 158
  • [42] Heterogeneous Graph Structure Learning for Graph Neural Networks
    Zhao, Jianan
    Wang, Xiao
    Shi, Chuan
    Hu, Binbin
    Song, Guojie
    Ye, Yanfang
    THIRTY-FIFTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THIRTY-THIRD CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE AND THE ELEVENTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2021, 35 : 4697 - 4705
  • [43] InteractE: Improving Convolution-Based Knowledge Graph Embeddings by Increasing Feature Interactions
    Vashishth, Shikhar
    Sanyal, Soumya
    Nitin, Vikram
    Agrawal, Nilesh
    Talukdar, Partha
    THIRTY-FOURTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THE THIRTY-SECOND INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE AND THE TENTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2020, 34 : 3009 - 3016
  • [44] Self-supervised learning for heterogeneous graph via structure information based on metapath
    Ma, Shuai
    Liu, Jian-wei
    Zuo, Xin
    APPLIED SOFT COMPUTING, 2023, 143
  • [45] Influence Maximization Based on Adaptive Graph Convolution Neural Network in Social Networks
    Liu, Wei
    Wang, Saiwei
    Ding, Jiayi
    ELECTRONICS, 2024, 13 (16)
  • [46] An Android Malware Detection Method Based on Metapath Aggregated Graph Neural Network
    Li, Qingru
    Zhang, Yufei
    Wang, Fangwei
    Wang, Changguang
    ALGORITHMS AND ARCHITECTURES FOR PARALLEL PROCESSING, ICA3PP 2023, PT III, 2024, 14489 : 344 - 357
  • [47] Metapath and syntax-aware heterogeneous subgraph neural networks for spam review detection
    Zhang, Zhiqiang
    Dong, Yuhang
    Wu, Haiyan
    Song, Haiyu
    Deng, Shengchun
    Chen, Yanhong
    APPLIED SOFT COMPUTING, 2022, 128
  • [48] SPHINX: A System for Metapath-based Entity Exploration in Heterogeneous Information Networks
    Chatzopoulos, S.
    Patroumpas, K.
    Zeakis, A.
    Vergoulis, T.
    Skoutas, D.
    PROCEEDINGS OF THE VLDB ENDOWMENT, 2020, 13: 2913 - 2916
  • [49] Metapath and attribute-based academic collaborator recommendation in heterogeneous academic networks
    Li, Hui
    Hu, Yaohua
    SCIENTOMETRICS, 2024, 129 (07) : 4295 - 4315