Generative Event Extraction via Internal Knowledge-Enhanced Prompt Learning

Cited by: 1
Authors
Song, Hetian [1 ,2 ]
Zhu, Qingmeng [2 ]
Yu, Zhipeng [2 ]
Liang, Jian [2 ]
He, Hao [2 ]
Affiliations
[1] Univ Chinese Acad Sci, Beijing, Peoples R China
[2] Chinese Acad Sci, Inst Software, Beijing, Peoples R China
Funding
National Key R&D Program of China; National Natural Science Foundation of China;
Keywords
Event Extraction; Generative; Prompt; Knowledge;
DOI
10.1007/978-3-031-44192-9_8
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
Event extraction is a crucial research task in information extraction. To maximize the performance of pre-trained language models (PLMs), some works formulate event extraction as a conditional generation problem. However, most existing generative methods ignore the prior information between event entities and are usually over-dependent on hand-crafted templates, which introduces subjective intervention. In this paper, we propose a generative event extraction model named KEPGEE based on internal knowledge-enhanced prompt learning. We first use a relational graph convolutional network (RGCN) to encode the entities of event triples and fuse them with word embeddings to obtain knowledge representations. These knowledge representations are then concatenated with task-specific virtual tokens to compose knowledge-enhanced soft prompts, which provide additional event information to adapt a sequence-to-sequence PLM to the generative event extraction task. Besides, in template design, we add related topic words to the prompt templates to enhance the implicit event information. We evaluate our model on the ACE2005 and ERE datasets, and the results show that it matches or outperforms several classification-based and generation-based event extraction models (including state-of-the-art models).
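The prompt-composition step described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the fusion operator (here a simple average), the array shapes, and the function name `compose_soft_prompt` are all assumptions; in KEPGEE the entity vectors would come from an RGCN over event triples and the virtual tokens would be learnable parameters.

```python
import numpy as np

def compose_soft_prompt(knowledge_emb, word_emb, virtual_tokens):
    """Fuse graph-derived entity vectors with word embeddings, then
    prepend task-specific virtual tokens to form a knowledge-enhanced
    soft prompt (hypothetical sketch of the step from the abstract).

    knowledge_emb:  (n_entities, d) entity vectors from a graph encoder
    word_emb:       (n_entities, d) word embeddings of the same entities
    virtual_tokens: (n_virtual, d)  learnable task-specific vectors
    """
    # Additive fusion is an assumption; the paper does not fix the operator here.
    fused = (knowledge_emb + word_emb) / 2.0
    # Virtual tokens come first, followed by the fused knowledge representation;
    # the result would be prepended to the seq2seq encoder's input embeddings.
    return np.concatenate([virtual_tokens, fused], axis=0)

d = 8
prompt = compose_soft_prompt(np.ones((3, d)), np.ones((3, d)), np.zeros((2, d)))
print(prompt.shape)  # (2 virtual + 3 entity) rows of dimension d
```

In an actual model the returned matrix would be concatenated along the sequence axis with the token embeddings before the PLM encoder, so the frozen model attends to the event knowledge as if it were extra input tokens.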
Pages: 90-102 (13 pages)