Edge-featured graph attention network with dependency features for causality detection of events

Cited by: 1
Authors
Wei, Jianxiang [1 ]
Chen, Yuhang [2 ]
Han, Pu [1 ]
Zhu, Yunxia [2 ]
Huang, Weidong [1 ,3 ,4 ]
Affiliations
[1] Nanjing Univ Posts & Telecommun, Sch Management, Nanjing, Peoples R China
[2] Nanjing Univ Posts & Telecommun, Sch Internet Things, Nanjing, Peoples R China
[3] Nanjing Univ Posts & Telecommun, Key Res Base Philosophy & Social Sci Jiangsu Infor, Nanjing, Peoples R China
[4] Nanjing Univ Posts & Telecommun, Emergency Management Res Ctr, Nanjing, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
causality detection; dependency-directed graphs; Edge-featured Graph Attention Network; IDENTIFICATION;
DOI
10.1111/exsy.13332
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Causality detection is a significant task in Natural Language Processing (NLP). As a more fine-grained task than causality extraction, it aims to detect the components that represent the cause and the effect in sentence-level texts containing causality. Previous research on causality detection has concentrated on text token features while ignoring the dependency attributes between tokens. In this paper, we propose a model that uses an Edge-featured Graph Attention Network based on dependency-directed graphs for the causality detection task. First, we convert texts with causality into dependency-directed graphs (DDG), which treat the dependency attributes between tokens as edge features. Then we use an Edge-featured Graph Attention Network to aggregate the node and edge features of the DDG. Finally, we feed the graph embedding into a Bi-directional Long Short-Term Memory (BiLSTM) layer to learn the dependencies between forward and backward long-distance nodes in the DDG. Experiments on three datasets show that this method achieves better precision, recall, and other evaluation metrics than competing methods.
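The three-stage pipeline in the abstract (DDG construction, edge-featured graph attention, BiLSTM over token order) can be made concrete with a minimal PyTorch sketch. This is not the authors' implementation: the attention scoring over concatenated source-node, destination-node, and edge embeddings, the message form, and all dimensions and class names are illustrative assumptions.

```python
# Minimal sketch (not the authors' code) of an edge-featured GAT layer over a
# dependency-directed graph, followed by a BiLSTM in token order.
import torch
import torch.nn as nn
import torch.nn.functional as F

class EdgeFeaturedGATLayer(nn.Module):
    """Attention over directed edges, scored on [h_src || h_dst || e_ij]."""
    def __init__(self, node_dim, edge_dim, out_dim):
        super().__init__()
        self.w_node = nn.Linear(node_dim, out_dim, bias=False)
        self.w_edge = nn.Linear(edge_dim, out_dim, bias=False)
        self.attn = nn.Linear(3 * out_dim, 1, bias=False)

    def forward(self, h, e, edge_index):
        # h: (N, node_dim) token embeddings; e: (E, edge_dim) dependency-label
        # embeddings; edge_index: (2, E) head/dependent indices of the DDG.
        src, dst = edge_index
        h = self.w_node(h)                      # (N, out_dim)
        e = self.w_edge(e)                      # (E, out_dim)
        score = F.leaky_relu(
            self.attn(torch.cat([h[src], h[dst], e], dim=-1)).squeeze(-1))
        # Normalise scores over each destination node's incoming edges.
        alpha = torch.zeros_like(score)
        for node in dst.unique():
            mask = dst == node
            alpha[mask] = F.softmax(score[mask], dim=0)
        # Messages carry both neighbour and edge information.
        msg = alpha.unsqueeze(-1) * (h[src] + e)
        out = torch.zeros_like(h).index_add_(0, dst, msg)
        return F.elu(out)

class CausalityDetector(nn.Module):
    """Edge-featured GAT over the DDG, then a BiLSTM in token order."""
    def __init__(self, node_dim, edge_dim, hidden, n_tags):
        super().__init__()
        self.gat = EdgeFeaturedGATLayer(node_dim, edge_dim, hidden)
        self.bilstm = nn.LSTM(hidden, hidden, batch_first=True,
                              bidirectional=True)
        self.classifier = nn.Linear(2 * hidden, n_tags)  # e.g. cause/effect/O

    def forward(self, h, e, edge_index):
        g = self.gat(h, e, edge_index)          # (N, hidden)
        seq, _ = self.bilstm(g.unsqueeze(0))    # tokens as one sentence
        return self.classifier(seq.squeeze(0))  # (N, n_tags) per-token logits
```

In practice the DDG would come from a dependency parser (token nodes, dependency-relation labels embedded as edge features), and the per-token logits would be decoded into cause and effect spans.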
Pages: 17
Related Papers
50 records in total
  • [1] EGAT: Edge-Featured Graph Attention Network
    Wang, Ziming
    Chen, Jun
    Chen, Haopeng
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2021, PT I, 2021, 12891 : 253 - 264
  • [2] Edge-featured multi-hop attention graph neural network for intrusion detection system
    Deng, Ping
    Huang, Yong
    COMPUTERS & SECURITY, 2025, 148
  • [3] Dual-Channel Edge-Featured Graph Attention Networks for Aspect-Based Sentiment Analysis
    Lu, Junwen
    Shi, Lihui
    Liu, Guanfeng
    Zhan, Xinrong
    ELECTRONICS, 2023, 12 (03)
  • [4] Edge Features Enhanced Graph Attention Network for Relation Extraction
    Bai, Xuefeng
    Feng, Chong
    Zhang, Huanhuan
    Wang, Xiaomei
    KNOWLEDGE SCIENCE, ENGINEERING AND MANAGEMENT (KSEM 2020), PT I, 2020, 12274 : 121 - 133
  • [5] Graph attention network based detection of causality for textual emotion-cause pair
    Cao, Qian
    Hao, Xiulan
    Ren, Huajian
    Xu, Wenjing
    Xu, Shiluo
    Asiedu, Charles Jnr.
    WORLD WIDE WEB-INTERNET AND WEB INFORMATION SYSTEMS, 2023, 26 (04): : 1731 - 1745
  • [6] Graph attention network with Granger causality map for fault detection and root cause diagnosis
    Liu, Yingxiang
    Jafarpour, Behnam
    COMPUTERS & CHEMICAL ENGINEERING, 2024, 180
  • [7] Research on Rumor Detection Based on a Graph Attention Network With Temporal Features
    Yang, Xiaohui
    Ma, Hailong
    Wang, Miao
    INTERNATIONAL JOURNAL OF DATA WAREHOUSING AND MINING, 2023, 19 (02)
  • [8] Self-Attention Graph Residual Convolutional Network for Event Detection with dependency relations
    Liu, Haozhe
    Xu, Ning
    Liu, Anan
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EMNLP 2021, 2021, : 302 - 311
  • [9] Latent Graph Induction Networks and Dependency Graph Networks for Events Detection
    Yang, Jing
    Gao, Hu
    Dang, Depeng
    IEEE ACCESS, 2025, 13 : 10713 - 10723