Multi-Turn and Multi-Granularity Reader for Document-Level Event Extraction

Cited by: 3
Authors
Yang, Hang [1 ]
Chen, Yubo [1 ]
Liu, Kang [1 ]
Zhao, Jun [1 ]
Zhao, Zuyu [2 ]
Sun, Weijian [2 ]
Affiliations
[1] Univ Chinese Acad Sci, Sch Artificial Intelligence, 95 Zhongguancun East Rd, Beijing 100190, Peoples R China
[2] Huawei Technol Co Ltd, Shenzhen, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Document-level event extraction; machine reading comprehension; multi-granularity reader
DOI
10.1145/3542925
CLC classification number
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Most existing event extraction works mainly focus on extracting events from a single sentence. However, in real-world applications, the arguments of one event may scatter across sentences, and multiple events may co-occur in one document. These scenarios thus require document-level event extraction (DEE), which aims to extract events and their arguments across sentences from a document. Previous works cast DEE as a two-step paradigm: sentence-level event extraction (SEE) followed by document-level event fusion. However, this paradigm fails to integrate document-level information into SEE and suffers from the inherent limitation of error propagation. In this article, we propose a multi-turn and multi-granularity reader for DEE that can extract events from the document directly, without a preliminary SEE stage. Specifically, we propose a new paradigm of DEE by formulating it as a machine reading comprehension task (extracting an event by answering questions turn by turn, with each answer extracted as a span from the document). Beyond the basic framework of machine reading comprehension, we introduce a multi-turn and multi-granularity reader to explicitly capture the dependencies between arguments and to model long texts effectively. The empirical results demonstrate that our method achieves superior performance on the MUC-4 and ChFinAnn datasets.
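As a rough illustration of the turn-by-turn MRC formulation the abstract describes, the sketch below asks one question per argument role and feeds each answered span back into the context for later turns. All names, the lexicon, and the toy document are invented for this sketch; the actual reader in the paper is a trained neural multi-granularity model, not a keyword lookup.

```python
def toy_reader(role, context):
    """Stand-in for a neural MRC reader: returns a span for the asked role
    if that span actually occurs in the context, else None."""
    lexicon = {
        "company": "ACME Corp",
        "amount": "5 million shares",
        "date": "2020-03-01",
    }
    span = lexicon.get(role)
    return span if span is not None and span in context else None

def extract_event(document, roles):
    """Extract one event turn by turn. Each answered span is appended to
    the context, so later turns can condition on earlier arguments
    (a crude proxy for the argument-dependency modeling in the paper)."""
    context = document
    event = {}
    for role in roles:
        span = toy_reader(role, context)
        if span is not None:
            event[role] = span
            context += f" [{role}: {span}]"  # expose prior answers to later turns
    return event

doc = ("ACME Corp announced on 2020-03-01 that its chairman pledged "
       "5 million shares to a securities firm.")
print(extract_event(doc, ["company", "date", "amount"]))
# → {'company': 'ACME Corp', 'date': '2020-03-01', 'amount': '5 million shares'}
```

Framing each role as a separate question is what lets a single extractive reader handle events whose arguments are scattered across sentences, since every turn searches the whole document rather than one sentence.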
Pages: 16