Deep Learning-based Sentence Embeddings using BERT for Textual Entailment

Cited by: 0
Authors
Alsuhaibani, Mohammed [1]
Affiliation
[1] Qassim Univ, Coll Comp, Dept Comp Sci, Buraydah 52571, Saudi Arabia
Keywords
Textual entailment; deep learning; entailment detection; BERT; text processing; natural language processing systems
DOI
10.14569/IJACSA.2023.01408108
Chinese Library Classification
TP301 [Theory, Methods]
Discipline Code
081202
Abstract
This study investigates the practical use of sentence embeddings, derived from deep learning, for textual entailment recognition, with a specific emphasis on the BERT model. Our experiments are grounded in the Stanford Natural Language Inference (SNLI) dataset. The study centers on a careful analysis of BERT's individual layers to determine the optimal layer for generating sentence embeddings that effectively identify entailment. Our approach deviates from traditional methodologies in that we base the evaluation of entailment on a direct and simple comparison of sentence norms, thereby highlighting the geometric properties of the embeddings. Experimental results revealed that the L2 norm of sentence embeddings drawn from BERT's 7th layer outperformed the other setups in entailment detection.
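The abstract describes the core pipeline at a high level: take token-level hidden states from a chosen BERT layer (layer 7 performing best in the paper), pool them into a sentence embedding, and compare the L2 norms of the premise and hypothesis embeddings. The sketch below illustrates that comparison on precomputed hidden states; the mean-pooling step, the direction of the norm comparison, and the `margin` parameter are assumptions for illustration, since the abstract does not specify the exact decision rule.

```python
import numpy as np

def sentence_norm(token_embeddings: np.ndarray) -> float:
    """L2 norm of a pooled sentence embedding.

    token_embeddings: (num_tokens, hidden_size) hidden states taken from
    one BERT layer (the paper reports layer 7 as the best choice).
    Mean pooling is an assumption here; the abstract does not state
    which pooling strategy was used.
    """
    sentence_vec = token_embeddings.mean(axis=0)
    return float(np.linalg.norm(sentence_vec))

def norms_suggest_entailment(premise_emb: np.ndarray,
                             hypothesis_emb: np.ndarray,
                             margin: float = 0.0) -> bool:
    """Hypothetical decision rule based on comparing sentence norms.

    The abstract only says entailment is judged by a "direct and simple
    comparison of sentence norms"; the inequality direction and the
    margin below are illustrative assumptions, not the paper's rule.
    """
    return sentence_norm(hypothesis_emb) <= sentence_norm(premise_emb) + margin
```

In a full setup, `token_embeddings` would come from a BERT forward pass with hidden states enabled for all layers, indexing the 7th; the snippet above isolates only the geometric comparison the abstract emphasizes.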
Pages: 997-1004
Page count: 8
Related Papers
50 entries in total
  • [31] Machine Learning-Based Model Categorization Using Textual and Structural Features
    Khalilipour, Alireza
    Bozyigit, Fatma
    Utku, Can
    Challenger, Moharram
    NEW TRENDS IN DATABASE AND INFORMATION SYSTEMS, ADBIS 2022, 2022, 1652 : 425 - 436
  • [32] Learning Word and Sentence Embeddings Using a Generative Convolutional Network
    Vargas-Ocampo, Edgar
    Roman-Rangel, Edgar
    Hermosillo-Valadez, Jorge
    PATTERN RECOGNITION, 2018, 10880 : 135 - 144
  • [33] DialogueCSE: Dialogue-based Contrastive Learning of Sentence Embeddings
    Liu, Che
    Wang, Rui
    Liu, Jinghua
    Sun, Jian
    Huang, Fei
    Si, Luo
    2021 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2021), 2021, : 2396 - 2406
  • [34] DiffCSE: Difference-based Contrastive Learning for Sentence Embeddings
    Chuang, Yung-Sung
    Dangovski, Rumen
    Luo, Hongyin
    Zhang, Yang
    Chang, Shiyu
    Soljacic, Marin
    Li, Shang-Wen
    Yih, Wen-tau
    Kim, Yoon
    Glass, James
    NAACL 2022: THE 2022 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES, 2022, : 4207 - 4218
  • [35] WhitenedCSE: Whitening-based Contrastive Learning of Sentence Embeddings
    Zhuo, Wenjie
    Sun, Yifan
    Wang, Xiaohan
    Zhu, Linchao
    Yang, Yi
    PROCEEDINGS OF THE 61ST ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2023): LONG PAPERS, VOL 1, 2023, : 12135 - 12148
  • [36] Textual Entailment in Legal Bar Exam Question Answering Using Deep Siamese Networks
    Kim, Mi-Young
    Lu, Yao
    Goebel, Randy
    NEW FRONTIERS IN ARTIFICIAL INTELLIGENCE (JSAI-ISAI 2017), 2018, 10838 : 35 - 48
  • [37] A Novel Approach Towards Fake News Detection: Deep Learning Augmented with Textual Entailment Features
    Saikh, Tanik
    Anand, Amit
    Ekbal, Asif
    Bhattacharyya, Pushpak
    NATURAL LANGUAGE PROCESSING AND INFORMATION SYSTEMS (NLDB 2019), 2019, 11608 : 345 - 358
  • [38] Simple Flow-Based Contrastive Learning for BERT Sentence Representations
    Tian, Ziyi
    Liu, Qun
    Liu, Maotao
    Deng, Wei
    ADVANCES IN SWARM INTELLIGENCE, ICSI 2022, PT II, 2022, : 265 - 275
  • [39] BERT6mA: prediction of DNA N6-methyladenine site using deep learning-based approaches
    Tsukiyama, Sho
    Hasan, Md Mehedi
    Deng, Hong-Wen
    Kurata, Hiroyuki
    BRIEFINGS IN BIOINFORMATICS, 2022, 23 (02)
  • [40] Data-Augmentation Method for BERT-based Legal Textual Entailment Systems in COLIEE Statute Law Task
    Aoki, Yasuhiro
    Yoshioka, Masaharu
    Suzuki, Youta
    REVIEW OF SOCIONETWORK STRATEGIES, 2022, 16 (01): : 175 - 196