An Idiom Reading Comprehension Model Based on Multi-Granularity Reasoning and Paraphrase Expansion

Cited by: 1
Authors
Dai, Yu [1 ]
Liu, Yuqiao [1 ]
Yang, Lei [2 ]
Fu, Yufan [1 ]
Affiliations
[1] Northeastern Univ, Software Coll, Shenyang 110819, Peoples R China
[2] Northeastern Univ, Coll Comp Sci & Engn, Key Lab Intelligent Comp Med Image, Minist Educ, Shenyang 110169, Peoples R China
Source
APPLIED SCIENCES-BASEL | 2023, Vol. 13, Iss. 09
Keywords
machine reading comprehension; pre-trained model; cloze-style reading comprehension; Chinese idioms; BERT
DOI
10.3390/app13095777
Chinese Library Classification (CLC) number
O6 [Chemistry]
Discipline classification code
0703
Abstract
Idioms are a unique class of words in the Chinese language that can be challenging for Chinese machine reading comprehension due to their formal simplicity and the potential mismatch between their literal and figurative meanings. To address this issue, this paper adopts the "2 + 2" structure as the representation model for extracting idiom structural features. Drawing on the linguistic theory of idioms, and to strengthen the model's ability to learn idiom semantics, we propose a two-stage semantic expansion method that leverages semantic knowledge during the pre-training stage and incorporates idiom interpretation (paraphrase) information during the fine-tuning stage, improving the model's understanding of idioms. Moreover, considering the inferential interaction between global and local information during context fusion, which existing works neglect, we propose a method that uses a multi-head attention mechanism with three selected attention patterns in the fine-tuning stage to fully extract global and local information. This enables the fusion of semantic information extracted by the model at different granularities, leading to improved accuracy. Experimental results demonstrate that the proposed BERT-IDM model outperforms the baseline BERT model, achieving a 4.1% accuracy improvement.
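The multi-granularity fusion idea described in the abstract can be illustrated with a small sketch. The following Python/PyTorch snippet is a minimal, hypothetical illustration, not the authors' released BERT-IDM code: it applies three attention patterns (global, local window, and idiom span) as masks inside scaled dot-product attention, one head per pattern, and concatenates the results. The window size, idiom span, and all function names are illustrative assumptions.

# Minimal sketch, NOT the authors' BERT-IDM implementation: three attention
# patterns (global, local window, idiom span) are applied as boolean masks in
# scaled dot-product attention, one head per pattern; outputs are concatenated
# to fuse information at different granularities. All names, the window size,
# and the idiom span below are illustrative assumptions.
import torch
import torch.nn.functional as F


def pattern_masks(seq_len, idiom_span, window=3):
    """Return boolean masks for the three assumed attention patterns."""
    idx = torch.arange(seq_len)
    global_mask = torch.ones(seq_len, seq_len, dtype=torch.bool)    # attend to every token
    local_mask = (idx[:, None] - idx[None, :]).abs() <= window      # attend within a fixed window
    span_mask = torch.zeros(seq_len, seq_len, dtype=torch.bool)
    span_mask[:, idiom_span[0]:idiom_span[1]] = True                # attend only to idiom tokens
    return [global_mask, local_mask, span_mask]


def multi_pattern_attention(x, idiom_span=(4, 8), window=3):
    """x: (seq_len, d_model) token representations, e.g. BERT hidden states."""
    seq_len, d_model = x.shape
    heads = []
    for mask in pattern_masks(seq_len, idiom_span, window):
        scores = x @ x.T / d_model ** 0.5                           # learned Q/K/V projections omitted
        scores = scores.masked_fill(~mask, float("-inf"))
        heads.append(F.softmax(scores, dim=-1) @ x)
    return torch.cat(heads, dim=-1)                                  # fuse global and local granularities


if __name__ == "__main__":
    hidden = torch.randn(16, 32)                                     # toy sequence of 16 tokens
    print(multi_pattern_attention(hidden).shape)                     # torch.Size([16, 96])

In a BERT-based setup, the concatenated per-pattern outputs would typically be projected back to the hidden size before the downstream idiom-candidate scoring; that projection is omitted here for brevity.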
Pages: 19
Related Papers
50 records in total
  • [21] Multi-Granularity Based Feature Interaction Pruning Model for CTR Prediction
    Bai T.
    Liu X.
    Wu B.
    Zhang Z.
    Xu Z.
    Lin K.
Jisuanji Yanjiu yu Fazhan/Computer Research and Development, 2024, 61 (05): 1290 - 1298
  • [22] Multi-granularity switching structure based on lambda-group model
    Wang, YY
    Zeng, QJ
    Jiang, C
    Xiao, SL
    Lu, LH
    ETRI JOURNAL, 2006, 28 (01) : 119 - 122
  • [23] Image classification based on multi-granularity convolutional Neural network model
    Wu, Xiaogang
    Tanprasert, Thitipong
    Jing, Wang
    2022 19TH INTERNATIONAL JOINT CONFERENCE ON COMPUTER SCIENCE AND SOFTWARE ENGINEERING (JCSSE 2022), 2022,
  • [24] A multi-granularity grid-based graph model for indoor space
1600, Science and Engineering Research Support Society (09)
  • [25] A multi-granularity knowledge association model of geological text based on hypernetwork
    Zhuang, Can
    Li, Wenjia
    Xie, Zhong
    Wu, Liang
    EARTH SCIENCE INFORMATICS, 2021, 14 (01) : 227 - 246
  • [26] Multi-granularity Histories Merging Network for Temporal Knowledge Graph Reasoning
    Liu, Shihao
    Zhou, Xiaofei
    Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 2024, 14874 LNCS : 158 - 169
  • [27] A multi-granularity knowledge association model of geological text based on hypernetwork
    Can Zhuang
    Wenjia Li
    Zhong Xie
    Liang Wu
    Earth Science Informatics, 2021, 14 : 227 - 246
  • [28] Chinese Sentence Semantic Matching Based on Multi-Granularity Fusion Model
    Zhang, Xu
    Lu, Wenpeng
    Zhang, Guoqiang
    Li, Fangfang
    Wang, Shoujin
    ADVANCES IN KNOWLEDGE DISCOVERY AND DATA MINING, PAKDD 2020, PT II, 2020, 12085 : 246 - 257
  • [29] MGMSN: Multi-Granularity Matching Model Based on Siamese Neural Network
    Wang, Xin
    Yang, Huimin
    FRONTIERS IN BIOENGINEERING AND BIOTECHNOLOGY, 2022, 10
  • [30] Story Generation Based on Multi-granularity Constraints
    Guo, Zhenpeng
    Wan, Jiaqiang
    Tang, Hongan
    Lu, Yunhua
    ARTIFICIAL INTELLIGENCE, CICAI 2022, PT III, 2022, 13606 : 223 - 235