An entity and relation extraction model based on context query and axial attention towards patent texts

Cited by: 0
Authors
Wang, Tengke [1 ,2 ]
Zhao, Yushan [2 ,3 ]
Zhu, Guangli [1 ,2 ]
Liu, Yunduo [1 ,2 ]
Li, Hanchen [1 ,2 ]
Zhang, Shunxiang [1 ,2 ]
Hsieh, Mengyen [4 ]
Affiliations
[1] Anhui Univ Sci & Technol, Sch Comp Sci & Engn, Huainan, Peoples R China
[2] Hefei Comprehens Natl Sci Ctr, Inst Artificial Intelligence, Hefei, Peoples R China
[3] Anhui Univ Sci & Technol, Sch Math & Big Data, Huainan, Peoples R China
[4] Providence Univ, Dept Comp Sci & Informat Engn, Taichung, Taiwan
Funding
National Natural Science Foundation of China
Keywords
Axial attention; context query; entity-relation triples; multi-head attention; patent entity and relation extraction
DOI
10.1080/09540091.2024.2426816
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Patent Entity and Relation Extraction (PERE) aims to extract entities and entity-relation triples from unstructured patent texts. PERE is a fundamental task in patent text mining, providing crucial technical support for patent retrieval and technology opportunity discovery. Previous works struggle to capture the implicit semantic information hidden within overlapping triples, which are especially numerous in patent texts. This paper proposes a Patent Entity and Relation Extraction model based on Context query and Axial attention, named PERE-CA. For entity recognition, text segments are regarded as candidate entity spans, and entity types are obtained by span classification. Subsequently, the semantic context related to each entity pair is computed by a context query method and integrated into the entity pair representation. For relation extraction, axial attention captures the implicit semantic information among overlapping entity pairs, and the model then outputs all valid entity-relation triples. Experimental results on the patent dataset TFH-2020 and the public dataset SciERC demonstrate that context query and axial attention effectively improve extraction performance.
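The abstract's axial attention step can be illustrated with a minimal sketch. The paper's exact architecture is not given here, so the module below is a generic, hypothetical illustration: candidate entity pairs are arranged in an n × n table, and multi-head self-attention is applied first along rows and then along columns, so that each pair exchanges information with the overlapping pairs that share its head or tail entity.

```python
import torch
import torch.nn as nn

class AxialAttention(nn.Module):
    """Illustrative axial attention over an (n, n, dim) entity-pair table.

    Row attention lets pair (i, j) attend to all pairs (i, *) sharing the
    head entity i; column attention then lets it attend to all pairs (*, j)
    sharing the tail entity j. Details are an assumption, not PERE-CA's code.
    """
    def __init__(self, dim: int, heads: int = 4):
        super().__init__()
        self.row_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.col_attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, pair_table: torch.Tensor) -> torch.Tensor:
        # pair_table: (n, n, dim); treat each row as one attention sequence.
        rows, _ = self.row_attn(pair_table, pair_table, pair_table)
        # Transpose so columns become sequences, attend, transpose back.
        cols = rows.transpose(0, 1)
        cols, _ = self.col_attn(cols, cols, cols)
        return cols.transpose(0, 1)          # (n, n, dim)

# 5 candidate entities -> a 5x5 table of 32-dim entity-pair features.
table = torch.randn(5, 5, 32)
out = AxialAttention(32)(table)
print(out.shape)  # torch.Size([5, 5, 32])
```

Compared with full attention over all n² pairs (cost O(n⁴) in pair interactions), the axial factorization restricts each pair to its row and column, which matches the overlapping-triple structure the abstract targets.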
Pages: 23