Self-Attention Graph Residual Convolutional Network for Event Detection with dependency relations

Cited: 0
|
Authors
Liu, Haozhe [1 ]
Xu, Ning [1 ]
Liu, Anan [1 ]
Affiliations
[1] Tianjin Univ, Sch Elect & Informat Engn, Tianjin, Peoples R China
Keywords
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The event detection (ED) task aims to classify events by identifying key event trigger words embedded in a piece of text. Previous research has proved the value of fusing syntactic dependency relations into Graph Convolutional Networks (GCNs). However, existing GCN-based methods explore latent node-to-node dependency relations through a stationary adjacency tensor; an attention-based dynamic tensor, which can attend to key nodes such as the event trigger or its neighboring nodes, has not been developed. Moreover, because the symmetric adjacency tensor causes graph information vanishing, existing GCN models cannot achieve higher overall performance. In this paper, we propose a novel model, the Self-Attention Graph Residual Convolutional Network (SAGRCN), to mine latent node-to-node dependency relations via a self-attention mechanism, and introduce the Graph Residual Network (GResNet) to solve the graph information vanishing problem. Specifically, a self-attention module is constructed to generate an attention tensor representing the dependency attention scores of all words in the sentence. Furthermore, a graph residual term is added to the baseline SA-GCN to construct a GResNet. Considering the syntactic connectivity of the network input, we initialize the residual term with the raw adjacency tensor, unprocessed by the self-attention module. Experiments on the ACE2005 dataset show significant improvements over competitive baseline methods.
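The layer described in the abstract combines a self-attention-derived dynamic adjacency with a residual term driven by the raw syntactic adjacency. A minimal sketch of one such layer is given below; all names (`sa_grcn_layer`, `Wq`, `Wk`, `W`) and the exact composition are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def sa_grcn_layer(H, A_raw, Wq, Wk, W):
    """One illustrative SAGRCN layer (sketch, not the paper's code).

    H      : (n, d) word/node representations
    A_raw  : (n, n) raw adjacency from the dependency parse
    Wq, Wk : (d, d_k) query/key projections of the self-attention module
    W      : (d, d) graph-convolution weight
    """
    # Self-attention yields a dynamic attention tensor over all word pairs,
    # replacing the stationary adjacency used by plain GCNs.
    scores = (H @ Wq) @ (H @ Wk).T / np.sqrt(Wq.shape[1])
    A_att = softmax(scores, axis=-1)
    # Graph convolution with the attention tensor, plus a graph residual
    # term using the unprocessed raw adjacency (the GResNet idea).
    return np.maximum(A_att @ H @ W, 0.0) + A_raw @ H

rng = np.random.default_rng(0)
n, d = 5, 8
H = rng.standard_normal((n, d))
A_raw = np.eye(n)  # toy adjacency: self-loops only
out = sa_grcn_layer(H, A_raw,
                    rng.standard_normal((d, d)),
                    rng.standard_normal((d, d)),
                    rng.standard_normal((d, d)))
print(out.shape)  # (5, 8)
```

The residual term adds the raw-adjacency propagation untouched, so syntactic connectivity information survives even if the attention tensor washes it out across layers.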
Pages: 302 - 311
Page count: 10
Related papers
50 records in total
  • [31] Image Classification based on Self-attention Convolutional Neural Network
    Cai, Xiaohong
    Li, Ming
    Cao, Hui
    Ma, Jingang
    Wang, Xiaoyan
    Zhuang, Xuqiang
    [J]. SIXTH INTERNATIONAL WORKSHOP ON PATTERN RECOGNITION, 2021, 11913
  • [32] MSASGCN : Multi-Head Self-Attention Spatiotemporal Graph Convolutional Network for Traffic Flow Forecasting
    Cao, Yang
    Liu, Detian
    Yin, Qizheng
    Xue, Fei
    Tang, Hengliang
    [J]. JOURNAL OF ADVANCED TRANSPORTATION, 2022, 2022
  • [33] Few-Shot Relation Prediction of Knowledge Graph via Convolutional Neural Network with Self-Attention
    Zhong, Shanna
    Wang, Jiahui
    Yue, Kun
    Duan, Liang
    Sun, Zhengbao
    Fang, Yan
    [J]. DATA SCIENCE AND ENGINEERING, 2023, 8 (04) : 385 - 395
  • [34] Self-Attention Graph Pooling
    Lee, Junhyun
    Lee, Inyeop
    Kang, Jaewoo
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 97, 2019, 97
  • [35] A Convolutional Self-Attention Network for CSI Reconstruction in MIMO System
    Liu Q.
    Sun J.
    Qiu S.
    Lv Y.
    Du X.
    [J]. Wireless Communications and Mobile Computing, 2023, 2023
  • [36] Residual attention graph convolutional network for web services classification
    Li, Bing
    Li, Zhi
    Yang, Yilong
    [J]. NEUROCOMPUTING, 2021, 440 : 45 - 57
  • [37] Residual convolutional graph neural network with subgraph attention pooling
    Duan, Yutai
    Wang, Jianming
    Ma, Haoran
    Sun, Yukuan
    [J]. TSINGHUA SCIENCE AND TECHNOLOGY, 2022, 27 (04) : 653 - 663
  • [40] Leveraging Knowledge Graph and Self-Attention with Residual Block for Paper Recommendation
    Pang, Xinyue
    Nuo, Minghua
    Cao, Jiamin
    [J]. 2021 INTERNATIONAL CONFERENCE ON HIGH PERFORMANCE BIG DATA AND INTELLIGENT SYSTEMS (HPBD&IS), 2021, : 196 - 201