Semantic Vector Learning Using Pretrained Transformers in Natural Language Understanding

Cited by: 0
Authors
Jung S. [1 ]
Affiliation
[1] Chungnam National University, Daejeon
Source
Jung, Sangkeun (hugman@cnu.ac.kr) | Journal of Computing Science and Engineering, 2020, Vol. 14, No. 4 / Korean Institute of Information Scientists and Engineers
Keywords
Natural language understanding; Semantic vector; Semantic vector learning; Transformer;
DOI
10.5626/JCSE.2020.14.4.154
Abstract
Natural language understanding (NLU) is a core technology for implementing natural interfaces. To support robust NLU, previous studies introduced a neural network approach that learns semantic vector representations by exploiting the correspondence between a text and its semantic frame, treating the frame as extracted semantic knowledge. In that work, long short-term memory (LSTM)-based readers were used to encode both the text and the semantic frame. However, significant room for performance improvement remains when recent pretrained transformer encoders are used. In the present work, as a key contribution, we extend Jung’s framework to work with pretrained transformers for both the text and semantic frame readers. In particular, a novel semantic frame processing method is proposed to feed the structural form of the semantic frame directly to the transformers. We conducted extensive experiments combining various LSTM- and transformer-based text and semantic frame readers on the ATIS, SNIPS, Sim-M, Sim-R, and Weather datasets to find the most suitable configurations for learning effective semantic vector representations. From these experiments, we conclude that transformer-based text and semantic frame readers show a stable and rapid learning curve as well as the best performance in similarity-based intent classification and semantic search tasks. Copyright 2020. The Korean Institute of Information Scientists and Engineers
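The abstract describes a dual-reader setup: a text reader and a semantic frame reader, both built on pretrained transformers, trained through the text-frame correspondence, with the semantic frame fed to the transformer in a linearized structural form. The sketch below is a rough illustration only (not the authors' released code); the BERT backbone, the frame linearization format, and the in-batch contrastive loss are all assumptions made for the example.

```python
# Minimal dual-encoder sketch: encode text and a linearized semantic frame with
# pretrained transformers and pull matching pairs together via a contrastive loss.
# Model name, linearization, and loss are illustrative assumptions, not the paper's exact setup.
import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
text_encoder = AutoModel.from_pretrained("bert-base-uncased")
frame_encoder = AutoModel.from_pretrained("bert-base-uncased")

def linearize_frame(intent, slots):
    # Hypothetical linearization: flatten the structural frame into a token
    # sequence, e.g. "intent=atis_flight ; fromloc=denver ; toloc=boston".
    return " ; ".join([f"intent={intent}"] + [f"{k}={v}" for k, v in slots.items()])

def encode(encoder, sentences):
    batch = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
    hidden = encoder(**batch).last_hidden_state   # (batch, tokens, hidden)
    return hidden[:, 0]                           # [CLS] vector as the semantic vector

texts = ["show me flights from denver to boston"]
frames = [linearize_frame("atis_flight", {"fromloc": "denver", "toloc": "boston"})]

text_vec = F.normalize(encode(text_encoder, texts), dim=-1)
frame_vec = F.normalize(encode(frame_encoder, frames), dim=-1)

# In-batch contrastive loss over the text-frame correspondence: each text should
# be most similar to its own frame. Trained vectors can then be reused for
# similarity-based intent classification and semantic search.
logits = text_vec @ frame_vec.T / 0.07
loss = F.cross_entropy(logits, torch.arange(len(texts)))
loss.backward()
```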
Pages: 154-162
Page count: 8
Related Papers
50 items total
  • [2] Semantic vector learning for natural language understanding
    Jung, Sangkeun
    [J]. COMPUTER SPEECH AND LANGUAGE, 2019, 56 : 130 - 145
  • [3] Cluster-aware Semantic Vector Learning using BERT in Natural Language Understanding
    Jung, Sangkeun
    Lim, Sungsu
    [J]. 2021 IEEE INTERNATIONAL CONFERENCE ON BIG DATA AND SMART COMPUTING (BIGCOMP 2021), 2021, : 91 - 98
  • [4] Syntax Vector Learning Using Correspondence for Natural Language Understanding
    Seo, Hyein
    Jung, Sangkeun
    Hwang, Taewook
    Kim, Hyunji
    Roh, Yoon-Hyung
    [J]. IEEE ACCESS, 2021, 9 : 84067 - 84078
  • [5] Learning Executable Semantic Parsers for Natural Language Understanding
    Liang, Percy
    [J]. COMMUNICATIONS OF THE ACM, 2016, 59 (09) : 68 - 76
  • [6] Survey of transformers and towards ensemble learning using transformers for natural language processing
    Zhang, Hongzhi
    Shafiq, M. Omair
    [J]. JOURNAL OF BIG DATA, 2024, 11 (01)
  • [8] CLINE: Contrastive Learning with Semantic Negative Examples for Natural Language Understanding
    Wang, Dong
    Ding, Ning
    Li, Piji
    Zheng, Hai-Tao
    [J]. 59TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS AND THE 11TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING, VOL 1 (ACL-IJCNLP 2021), 2021, : 2332 - 2342
  • [9] Transfer Learning of Transformers for Spoken Language Understanding
    Svec, Jan
    Fremund, Adam
    Bulin, Martin
    Lehecka, Jan
    [J]. TEXT, SPEECH, AND DIALOGUE (TSD 2022), 2022, 13502 : 489 - 500
  • [10] Efficient Transformers for on-robot Natural Language Understanding
    Greco, Antonio
    Roberto, Antonio
    Saggese, Alessia
    Vento, Mario
    [J]. 2022 IEEE-RAS 21ST INTERNATIONAL CONFERENCE ON HUMANOID ROBOTS (HUMANOIDS), 2022, : 823 - 828