Cluster-aware Semantic Vector Learning using BERT in Natural Language Understanding

Cited by: 1
Authors
Jung, Sangkeun [1]
Lim, Sungsu [1]
Affiliations
[1] Chungnam Natl Univ, Dept Comp Sci & Engn, Daejeon, South Korea
Funding
National Research Foundation of Singapore;
Keywords
natural language understanding; semantic vector learning; cluster-aware modeling;
DOI
10.1109/BigComp51126.2021.00026
Chinese Library Classification
TP301 [Theory and Methods];
Subject Classification Code
081202;
Abstract
Natural language understanding (NLU) is a core technology for implementing natural interfaces. Recent work on embedding sentences and the correspondence between texts and extracted semantic knowledge, called semantic frames, has shown that semantic vector representation is key to implementing or supporting robust NLU systems. However, existing studies consider only the relations between sentences or only the correspondence between sentences and semantic frames, and do not capture the many-to-one relationship among texts, semantic frames, and semantic clusters. Herein, we propose a novel framework that learns semantic cluster-aware vector representations using bidirectional encoder representations from transformers (BERT). One key technique is cohesion modeling, which pulls paraphrase texts toward their semantic centroids. Another is separation modeling, which pushes different clusters apart by employing a triplet margin loss. Additionally, we propose a novel semantic frame-encoding method using BERT. Using the proposed framework, we demonstrate that the model learns meaningful semantic vector representations.
Pages: 91 - 98
Page count: 8
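For orientation, the following is a minimal, hypothetical PyTorch sketch of the kind of cluster-aware objective the abstract describes: a BERT encoder produces sentence vectors, a cohesion term pulls each vector toward its own cluster centroid, and a separation term applies a triplet margin loss against a centroid from a different cluster. It is not the authors' implementation; all names, weights, and the margin value are illustrative assumptions.

```python
# Illustrative sketch only; not the paper's code.
# Assumes the `torch` and Hugging Face `transformers` packages are installed.
import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
bert = AutoModel.from_pretrained("bert-base-uncased")

def encode(texts):
    """Encode a batch of texts into semantic vectors via the [CLS] token."""
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    out = bert(**batch)
    return out.last_hidden_state[:, 0]            # shape: (batch, hidden)

def cluster_aware_loss(vectors, cluster_ids, margin=1.0, alpha=1.0, beta=1.0):
    """Cohesion + separation objective (hypothetical weighting scheme).

    cohesion:   mean distance of each vector to its cluster centroid
    separation: triplet margin loss with the own centroid as positive
                and another cluster's centroid as negative
    Assumes each batch contains at least two distinct clusters.
    """
    unique_ids = cluster_ids.unique()
    centroids = torch.stack(
        [vectors[cluster_ids == c].mean(dim=0) for c in unique_ids]
    )
    # Index of each sample's own centroid.
    idx = torch.stack(
        [(unique_ids == c).nonzero(as_tuple=True)[0][0] for c in cluster_ids]
    )
    pos = centroids[idx]                           # own-cluster centroid
    cohesion = F.pairwise_distance(vectors, pos).mean()

    # Pick a centroid from a different cluster as the negative (simple roll).
    neg = centroids[(idx + 1) % len(unique_ids)]
    separation = F.triplet_margin_loss(vectors, pos, neg, margin=margin)
    return alpha * cohesion + beta * separation

# Example usage with two tiny assumed paraphrase clusters:
texts = ["book a flight to Seoul", "reserve a plane ticket to Seoul",
         "what's the weather today", "tell me today's weather"]
labels = torch.tensor([0, 0, 1, 1])
loss = cluster_aware_loss(encode(texts), labels)
```

The negative-centroid choice here is a naive placeholder; a faithful reproduction would follow the paper's own sampling and frame-encoding details.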