Semantic vector learning for natural language understanding

Cited by: 20
Authors
Jung, Sangkeun [1 ]
Affiliation
[1] Chungnam Natl Univ, Dept Comp Sci & Engn, Daejeon, South Korea
Source
COMPUTER SPEECH AND LANGUAGE
Keywords
Natural language understanding; Semantic frame learning; Deep learning; Distributed representation; Semantic vector; Semantic Corpus Visualization; MODELS;
DOI
10.1016/j.csl.2018.12.008
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Natural language understanding (NLU) is a core technology for implementing natural interfaces and has received much attention in recent years. While learning embedding models has yielded fruitful results in several NLP subfields, most notably Word2Vec, embedding correspondence remains relatively underexplored, especially in the context of NLU, a task that typically extracts structured semantic knowledge from text. An NLU embedding model can facilitate analyzing and understanding the relationships between unstructured texts and their corresponding structured semantic knowledge, which is essential for both researchers and practitioners of NLU. Toward this end, we propose a framework that learns to embed the semantic correspondence between a text and its extracted semantic knowledge, called a semantic frame. One key contributed technique is semantic frame reconstruction, used to derive a one-to-one mapping between embedded vectors and their corresponding semantic frames. Embedding texts into semantically meaningful vectors and computing their distances in vector space provides a simple but effective way to measure semantic similarity. With the proposed framework, we demonstrate three key areas where the embedding model can be effective: visualization, distance-based semantic search, and similarity-based intent classification and re-ranking. (C) 2019 Elsevier Ltd. All rights reserved.
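The distance-based similarity idea at the heart of the abstract can be illustrated with a minimal sketch: utterances are embedded into semantic vectors, and the nearest stored vector by cosine similarity determines the predicted intent. The toy vectors and intent labels below are hypothetical placeholders, not the paper's data or implementation; a real system would obtain the vectors from the learned semantic vector encoder.

```python
# Minimal sketch (not the paper's code): similarity-based intent
# classification by nearest neighbor in a semantic vector space.
# The 3-d toy vectors stand in for outputs of a learned encoder.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two semantic vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical labeled corpus: one semantic vector per intent label.
corpus = {
    "find_flight":  np.array([0.9, 0.1, 0.0]),
    "book_hotel":   np.array([0.1, 0.8, 0.2]),
    "weather_info": np.array([0.0, 0.2, 0.9]),
}

def classify(query_vec: np.ndarray) -> str:
    """Return the intent whose stored vector is closest to the query."""
    return max(corpus, key=lambda intent: cosine_similarity(query_vec, corpus[intent]))

query = np.array([0.85, 0.15, 0.05])  # vector for an unseen utterance
print(classify(query))                # -> "find_flight"
```

Distance-based semantic search and re-ranking follow the same pattern: score candidates by cosine similarity to the query vector and sort.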
Pages: 130-145
Page count: 16
Related papers
50 records in total
  • [1] Semantic Vector Learning Using Pretrained Transformers in Natural Language Understanding
    Jung, Sangkeun
    [J]. Journal of Computing Science and Engineering, 2020, 16 (04) : 154 - 162
  • [2] Semantic Vector Learning and Visualization with Semantic Cluster Using Transformers in Natural Language Understanding
    Jung, Sangkeun
    [J]. Journal of Computing Science and Engineering, 2022, 16 (02) : 63 - 78
  • [3] Cluster-aware Semantic Vector Learning using BERT in Natural Language Understanding
    Jung, Sangkeun
    Lim, Sungsu
    [J]. 2021 IEEE INTERNATIONAL CONFERENCE ON BIG DATA AND SMART COMPUTING (BIGCOMP 2021), 2021, : 91 - 98
  • [4] Learning Executable Semantic Parsers for Natural Language Understanding
    Liang, Percy
    [J]. COMMUNICATIONS OF THE ACM, 2016, 59 (09) : 68 - 76
  • [5] Syntax Vector Learning Using Correspondence for Natural Language Understanding
    Seo, Hyein
    Jung, Sangkeun
    Hwang, Taewook
    Kim, Hyunji
    Roh, Yoon-Hyung
    [J]. IEEE ACCESS, 2021, 9 : 84067 - 84078
  • [6] CLINE: Contrastive Learning with Semantic Negative Examples for Natural Language Understanding
    Wang, Dong
    Ding, Ning
    Li, Piji
    Zheng, Hai-Tao
    [J]. 59TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS AND THE 11TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING, VOL 1 (ACL-IJCNLP 2021), 2021, : 2332 - 2342
  • [7] Hybrid heuristic semantic networks in natural language understanding
    Corbett, D
    [J]. INTERNATIONAL JOURNAL OF UNCERTAINTY FUZZINESS AND KNOWLEDGE-BASED SYSTEMS, 1995, 3 (04) : 389 - 400
  • [8] THE APPLICATION OF SEMANTIC CLASSIFICATION TREES TO NATURAL-LANGUAGE UNDERSTANDING
    KUHN, R
    DEMORI, R
    [J]. IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 1995, 17 (05) : 449 - 460
  • [9] QUASI NATURAL-LANGUAGE UNDERSTANDING IN THE SEMANTIC DOMAIN OF ROBOTICS
    BERNORIO, M
    BERTONI, M
    DABBENE, A
    SOMALVICO, M
    [J]. CYBERNETICA, 1979, 22 (02) : 159 - 172
  • [10] Learning Structured Natural Language Representations for Semantic Parsing
    Cheng, Jianpeng
    Reddy, Siva
    Saraswat, Vijay
    Lapata, Mirella
    [J]. PROCEEDINGS OF THE 55TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2017), VOL 1, 2017, : 44 - 55