Information Extraction Network Based on Multi-Granularity Attention and Multi-Scale Self-Learning

Cited by: 1
Authors
Sun, Weiwei [1 ,2 ]
Liu, Shengquan [1 ,2 ]
Liu, Yan [1 ,2 ]
Kong, Lingqi [1 ,2 ]
Jian, Zhaorui [1 ,2 ]
Affiliations
[1] Xinjiang Univ, Coll Informat Sci & Engn, Urumqi 830046, Peoples R China
[2] Xinjiang Univ, Coll Informat Sci & Engn, Xinjiang Multilingual Informat Technol Lab, Urumqi 830046, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
nested named entity identification; entity relationship extraction; machine reading comprehension; multi-grained attention mechanism; multi-scale self-learning mechanism;
DOI
10.3390/s23094250
CLC number
O65 [Analytical Chemistry]
Discipline codes
070302; 081704
Abstract
Transforming information extraction tasks into a machine reading comprehension (MRC) framework has shown promising results. The MRC model takes the context and a query as inputs to the encoder, and the decoder extracts one or more text spans from the text as answers (entities and relationships). Existing approaches typically use multi-layer encoders, such as Transformers, to generate hidden features of the source sequence. However, increasing the number of encoder layers can make the granularity of the representation coarser and the hidden features of different words more similar, which may lead the model to incorrect predictions. To address this issue, a new method called the multi-granularity attention multi-scale self-learning network (MAML-NET) is proposed, which enhances the model's understanding by utilizing representations of the source sequence at different granularities. In addition, through the proposed multi-scale self-learning attention mechanism, MAML-NET can independently learn task-related information in both global and local dimensions from the learned multi-granularity features. Experimental results on two information extraction tasks, named entity recognition and entity relationship extraction, demonstrated that the method outperformed existing MRC-based methods and achieved the best performance on all five benchmarks.
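To make the MRC-style span extraction framing described in the abstract concrete, the following is a minimal sketch in PyTorch: a query and context are concatenated into one token sequence, encoded by a Transformer encoder, and two pointer heads score each position as a span start or end. The class name, layer sizes, and greedy decoding step are illustrative assumptions, not the authors' implementation, which additionally uses multi-granularity attention and multi-scale self-learning.

```python
# Minimal sketch of MRC-style span extraction: encode [query; context],
# then predict start/end positions of answer spans with two pointer heads.
# All names and hyperparameters here are illustrative assumptions.
import torch
import torch.nn as nn


class MRCSpanExtractor(nn.Module):
    def __init__(self, vocab_size, d_model=256, n_heads=4, n_layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.start_head = nn.Linear(d_model, 1)  # start-position logits
        self.end_head = nn.Linear(d_model, 1)    # end-position logits

    def forward(self, token_ids):
        # token_ids: (batch, seq_len), the concatenated [query; context] tokens
        h = self.encoder(self.embed(token_ids))        # (batch, seq, d_model)
        start_logits = self.start_head(h).squeeze(-1)  # (batch, seq)
        end_logits = self.end_head(h).squeeze(-1)      # (batch, seq)
        return start_logits, end_logits


if __name__ == "__main__":
    model = MRCSpanExtractor(vocab_size=1000)
    ids = torch.randint(0, 1000, (2, 32))  # toy [query; context] batch
    start, end = model(ids)
    # Greedy decoding: most likely start and end position per example.
    print(start.argmax(-1), end.argmax(-1))
```

In this framing, the query encodes the entity or relation type being asked for, so nested entities and multiple relations can be recovered by issuing different queries over the same context.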
Pages: 18