A comprehensive exploration of semantic relation extraction via pre-trained CNNs

Cited by: 32
|
Authors
Li, Qing [1 ]
Li, Lili [2 ]
Wang, Weinan [3 ]
Li, Qi [4 ]
Zhong, Jiang [1 ,5 ]
Affiliations
[1] Chongqing Univ, Coll Comp Sci, Chongqing, Peoples R China
[2] Chongqing Univ, Sch Civil Engn, Chongqing, Peoples R China
[3] Peking Univ, Sch Math Sci, Beijing, Peoples R China
[4] Shaoxing Univ, Dept Comp Sci & Engn, Shaoxing, Peoples R China
[5] Chongqing Univ, Key Lab Dependable Serv Comp Cyber Phys Soc, Chongqing, Peoples R China
Keywords
Relation extraction; Semantic relation; Natural language processing; Convolutional neural networks;
DOI
10.1016/j.knosys.2020.105488
CLC number
TP18 [Theory of Artificial Intelligence];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Semantic relation extraction between entity pairs is a crucial task in information extraction from text. In this paper, we propose a new pre-trained network architecture for this task, called XM-CNN. XM-CNN utilizes word-embedding and position-embedding information and is designed to reinforce the contextual output of the MT-DNNKD pre-trained model. Our model employs an entity-aware attention mechanism to detect features and also applies relation-specific pooling attention. Experimental results show that XM-CNN achieves state-of-the-art results on SemEval-2010 Task 8, and a thorough evaluation of the method is conducted. (C) 2020 Elsevier B.V. All rights reserved.
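The abstract's combination of word embeddings, relative-position embeddings, and entity-aware attention pooling can be sketched very loosely in NumPy. Everything here is an illustrative assumption rather than the paper's actual architecture: the dimensions, the random stand-in for the MT-DNNKD contextual output, the averaged entity query, and all variable names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: seq_len tokens with hidden size d. H stands in for
# the contextual token representations a pre-trained encoder would produce.
seq_len, d = 10, 16
H = rng.normal(size=(seq_len, d))

# Relative-position embeddings w.r.t. the two entity mentions: each token is
# mapped by its signed distance to each entity head via a small lookup table.
e1_pos, e2_pos = 2, 7
max_dist = seq_len
pos_table = rng.normal(size=(2 * max_dist + 1, 4))

def pos_embed(idx):
    d1 = np.clip(idx - e1_pos, -max_dist, max_dist) + max_dist
    d2 = np.clip(idx - e2_pos, -max_dist, max_dist) + max_dist
    return np.concatenate([pos_table[d1], pos_table[d2]])

P = np.stack([pos_embed(i) for i in range(seq_len)])   # (seq_len, 8)
X = np.concatenate([H, P], axis=1)                     # word + position features

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

# Entity-aware attention: score each token against a crude summary of the
# two entity representations, then pool the features with those weights.
entity_query = (H[e1_pos] + H[e2_pos]) / 2.0
scores = softmax(H @ entity_query / np.sqrt(d))        # (seq_len,)
pooled = scores @ X                                     # attention-pooled vector

print(pooled.shape)  # (24,)
```

The pooled vector would then feed a relation classifier; in the paper the attention is further specialized per relation, which this single-query sketch does not attempt to reproduce.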
Pages: 7
Related papers
50 items in total
  • [21] Protocol for the automatic extraction of epidemiological information via a pre-trained language model
    Wang, Zhizheng
    Liu, Xiao Fan
    Du, Zhanwei
    Wang, Lin
    Wu, Ye
    Holme, Petter
    Lachmann, Michael
    Lin, Hongfei
    Wang, Zhuoyue
    Cao, Yu
    Wong, Zoie S. Y.
    Xu, Xiao-Ke
    Sun, Yuanyuan
    STAR PROTOCOLS, 2023, 4 (03):
  • [22] Biomedical event extraction using pre-trained SciBERT
    Mulya, Dimmas
    Khodra, Masayu Leylia
    JOURNAL OF INTELLIGENT SYSTEMS, 2023, 32 (01)
  • [23] MaskDiffusion: Exploiting Pre-Trained Diffusion Models for Semantic Segmentation
    Kawano, Yasufumi
    Aoki, Yoshimitsu
    IEEE ACCESS, 2024, 12 : 127283 - 127293
  • [24] Fine-tuning Pre-Trained Transformer Language Models to Distantly Supervised Relation Extraction
    Alt, Christoph
    Huebner, Marc
    Hennig, Leonhard
    57TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2019), 2019, : 1388 - 1398
  • [25] Enhancing Skin Diseases Classification Through Dual Ensemble Learning and Pre-trained CNNs
    El Gannour, Oussama
    Hamida, Soufiane
    Lamalem, Yasser
    Cherradi, Bouchaib
    Saleh, Shawki
    Raihani, Abdelhadi
    INTERNATIONAL JOURNAL OF ADVANCED COMPUTER SCIENCE AND APPLICATIONS, 2023, 14 (06) : 436 - 445
  • [26] Efficient fusion of handcrafted and pre-trained CNNs features to classify melanoma skin cancer
    Filali, Youssef
    EL Khoukhi, Hasnae
    Sabri, My Abdelouahed
    Aarab, Abdellah
    MULTIMEDIA TOOLS AND APPLICATIONS, 2020, 79 (41-42) : 31219 - 31238
  • [28] Tuning Pre-trained Model via Moment Probing
    Gao, Mingze
    Wang, Qilong
    Lin, Zhenyi
    Zhu, Pengfei
    Hu, Qinghua
    Zhou, Jingbo
    2023 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2023), 2023, : 11769 - 11779
  • [29] ERICA: Improving Entity and Relation Understanding for Pre-trained Language Models via Contrastive Learning
    Qin, Yujia
    Lin, Yankai
    Takanobu, Ryuichi
    Liu, Zhiyuan
    Li, Peng
    Ji, Heng
    Huang, Minlie
    Sun, Maosong
    Zhou, Jie
    59TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS AND THE 11TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (ACL-IJCNLP 2021), VOL 1, 2021, : 3350 - 3363
  • [30] Exploring Pre-trained Language Models for Event Extraction and Generation
    Yang, Sen
    Feng, Dawei
    Qiao, Linbo
    Kan, Zhigang
    Li, Dongsheng
    57TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2019), 2019, : 5284 - 5294