QRNN-Transformer: Recognizing Textual Entailment

Cited by: 0
Authors:
Zhu, Xiaogang [1 ]
Yan, Zhihan [2 ]
Wang, Wenzhan [3 ]
Hu, Shu [4 ]
Wang, Xin [5 ]
Liu, Chunnian [1 ]
Affiliations:
[1] Nanchang Univ, Sch Publ Policy & Adm, Nanchang, Jiangxi, Peoples R China
[2] Nanchang Univ, Sch Software, Nanchang, Jiangxi, Peoples R China
[3] Jingdezhen Univ, Jingdezhen, Peoples R China
[4] Purdue Univ, Dept Comp & Informat Technol, W Lafayette, IN USA
[5] SUNY Albany, Sch Publ Hlth, Albany, NY USA
DOI: 10.1109/AVSS61716.2024.10672593
Chinese Library Classification (CLC): TP18 [Artificial Intelligence Theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract:
In recent years, the Transformer model, built on the self-attention mechanism, has made significant progress in natural language processing and has been applied to the textual entailment recognition task with excellent results. However, the Transformer still falls short in modeling local information in text. To address this, the QRNN-Transformer is proposed: a QRNN network divides the input text sequence into local short sequences to capture local information, and the self-attention mechanism is combined with a gating mechanism so that the model selects task-related words and features. Extensive experiments demonstrate that the QRNN-Transformer effectively improves the accuracy of entailment recognition on both English and Chinese datasets.
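The abstract names two ingredients: a QRNN layer that extracts local features from short windows of the input, and self-attention filtered by a gate that keeps task-relevant information. The paper's code is not part of this record, so the following is only a minimal PyTorch sketch of that general idea; the module names, the concrete gating formulation, and all hyperparameters are assumptions, not the authors' implementation.

# Illustrative sketch (not the authors' code): a QRNN layer for local
# features, followed by self-attention whose output is modulated by a
# learned sigmoid gate. Shapes and gating design are assumptions.
import torch
import torch.nn as nn


class QRNNLayer(nn.Module):
    """Quasi-recurrent layer: causal convolution + fo-pooling (Bradbury et al., 2016)."""

    def __init__(self, d_model: int, kernel_size: int = 2):
        super().__init__()
        # One convolution produces candidate (Z), forget gate (F), and output gate (O).
        self.conv = nn.Conv1d(d_model, 3 * d_model, kernel_size,
                              padding=kernel_size - 1)
        self.d_model = d_model

    def forward(self, x):                                  # x: (batch, seq_len, d_model)
        seq_len = x.size(1)
        conv_out = self.conv(x.transpose(1, 2))[:, :, :seq_len]  # trim to keep causality
        z, f, o = conv_out.chunk(3, dim=1)
        z, f, o = torch.tanh(z), torch.sigmoid(f), torch.sigmoid(o)
        # fo-pooling: c_t = f_t * c_{t-1} + (1 - f_t) * z_t ;  h_t = o_t * c_t
        c = torch.zeros(x.size(0), self.d_model, device=x.device)
        hs = []
        for t in range(seq_len):
            c = f[:, :, t] * c + (1 - f[:, :, t]) * z[:, :, t]
            hs.append(o[:, :, t] * c)
        return torch.stack(hs, dim=1)                      # (batch, seq_len, d_model)


class GatedAttentionBlock(nn.Module):
    """Self-attention over QRNN features, blended with them through a sigmoid gate."""

    def __init__(self, d_model: int = 256, n_heads: int = 4):
        super().__init__()
        self.qrnn = QRNNLayer(d_model)
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.gate = nn.Linear(2 * d_model, d_model)
        self.norm = nn.LayerNorm(d_model)

    def forward(self, x):
        local = self.qrnn(x)                               # local, n-gram-like features
        attended, _ = self.attn(local, local, local)       # global dependencies
        # Gate decides, per position and feature, how much attended context to keep.
        g = torch.sigmoid(self.gate(torch.cat([local, attended], dim=-1)))
        return self.norm(x + g * attended + (1 - g) * local)


if __name__ == "__main__":
    block = GatedAttentionBlock(d_model=256, n_heads=4)
    tokens = torch.randn(8, 40, 256)                       # (batch, seq_len, embed_dim)
    print(block(tokens).shape)                             # torch.Size([8, 40, 256])

In this sketch the gate interpolates between local QRNN features and attended context, which is one common way to let a model emphasize task-related features; the paper may realize the gating differently.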
Pages: 7
Related Papers (50 total):
  • [1] AORTE for Recognizing Textual Entailment
    Siblini, Reda
    Kosseim, Leila
    COMPUTATIONAL LINGUISTICS AND INTELLIGENT TEXT PROCESSING, 2009, 5449 : 245 - 255
  • [2] Recognizing Textual Entailment with Statistical Methods
    Gaona, Miguel Angel Rios
    Gelbukh, Alexander
    Bandyopadhyay, Sivaji
    ADVANCES IN PATTERN RECOGNITION, 2010, 6256 : 372 - +
  • [3] Figurative Language in Recognizing Textual Entailment
    Chakrabarty, Tuhin
    Ghosh, Debanjan
    Poliak, Adam
    Muresan, Smaranda
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL-IJCNLP 2021, 2021, : 3354 - 3361
  • [4] Paraphrase substitution for recognizing textual entailment
    Bosma, Wauter
    Callison-Burch, Chris
    EVALUATION OF MULTILINGUAL AND MULTI-MODAL INFORMATION RETRIEVAL, 2007, 4730 : 502 - +
  • [5] Recognizing Textual Entailment: Models and Applications
    Dagan, Ido
    Roth, Dan
    Sammons, Mark
    Zanzotto, Fabio
    Synthesis Lectures on Human Language Technologies, 2013, 6 (04): : 1 - 222
  • [6] Recognizing Textual Entailment: Models and Applications
    Grabar, Natalia
    TRAITEMENT AUTOMATIQUE DES LANGUES, 2013, 54 (02): : 142 - 146
  • [7] Recognizing Textual Entailment based on WordNet
    Feng, Jin
    Zhou, Yiming
    Martin, Trevor
    2008 INTERNATIONAL SYMPOSIUM ON INTELLIGENT INFORMATION TECHNOLOGY APPLICATION, VOL II, PROCEEDINGS, 2008, : 27 - +
  • [8] Recognizing Textual Entailment: Models and Applications
    Magnini, Bernardo
    COMPUTATIONAL LINGUISTICS, 2015, 41 (01) : 157 - 160
  • [9] Recognizing Textual Entailment and Paraphrases in Portuguese
    Rocha, Gil
    Cardoso, Henrique Lopes
    PROGRESS IN ARTIFICIAL INTELLIGENCE (EPIA 2017), 2017, 10423 : 868 - 879
  • [10] Recognizing textual entailment via atomic propositions
    Akhmatova, Elena
    Molla, Diego
    MACHINE LEARNING CHALLENGES: EVALUATING PREDICTIVE UNCERTAINTY, VISUAL OBJECT CLASSIFICATION AND RECOGNIZING TEXTUAL ENTAILMENT, 2006, 3944 : 385 - 403