Gaussian hierarchical latent Dirichlet allocation: Bringing polysemy back

Cited by: 4
|
Authors
Yoshida, Takahiro [1 ]
Hisano, Ryohei [2 ,4 ]
Ohnishi, Takaaki [3 ]
Affiliations
[1] Canon Inst Global Studies, Tokyo, Japan
[2] Grad Sch Informat Sci & Technol, Tokyo, Japan
[3] Rikkyo Univ, Grad Sch Artificial Intelligence & Sci, Tokyo, Japan
[4] 7-3-1 Hongo,Bunkyo Ku, Tokyo, Japan
Source
PLOS ONE | 2023 / Vol. 18 / Issue 07
Keywords
MODELS;
DOI
10.1371/journal.pone.0288274
Chinese Library Classification (CLC)
O [Mathematical Sciences and Chemistry]; P [Astronomy, Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences];
Discipline classification codes
07 ; 0710 ; 09 ;
Abstract
Topic models are widely used to discover the latent representations of a set of documents. The two canonical models are latent Dirichlet allocation (LDA) and Gaussian latent Dirichlet allocation (GLDA): the former represents topics as multinomial distributions over words, while the latter represents them as multivariate Gaussian distributions over pre-trained word embedding vectors. Compared with LDA, GLDA is limited in that it does not capture the polysemy of a word such as "bank." In this paper, we show that GLDA can recover the ability to capture polysemy by introducing a hierarchical structure over the set of topics that the model can use to represent a given document. Our Gaussian hierarchical latent Dirichlet allocation significantly improves polysemy detection compared with Gaussian-based models and yields more parsimonious topic representations than hierarchical latent Dirichlet allocation. Our extensive quantitative experiments show that our model also achieves better topic coherence and held-out document predictive accuracy across a wide range of corpora and word embedding vectors, and significantly improves the capture of polysemy compared with GLDA and CGTM. Our model learns the underlying topic distribution and the hierarchical structure among topics simultaneously, which can further be used to understand the correlation among topics. Moreover, the added flexibility of our model does not necessarily increase the time complexity compared with GLDA and CGTM, making our model a strong competitor to GLDA.
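The abstract's core distinction — a multinomial topic over words versus a Gaussian topic over embedding vectors — can be illustrated with a minimal toy sketch. This is not the paper's implementation; the vocabulary, 3-dimensional embeddings, and diagonal-covariance topic below are invented for illustration only:

```python
import math

# LDA view: a topic is a multinomial distribution over the vocabulary.
# (Toy probabilities, not learned from any corpus.)
lda_topic = {"bank": 0.05, "money": 0.20, "river": 0.01}

def lda_log_likelihood(word, topic):
    # log p(word | topic) under the multinomial topic
    return math.log(topic[word])

# GLDA view: a topic is a multivariate Gaussian over word embedding
# vectors (diagonal covariance here for simplicity; toy 3-d embeddings).
embedding = {"bank": [0.9, 0.1, 0.0], "river": [0.0, 0.8, 0.1]}
glda_topic = {"mean": [1.0, 0.0, 0.0], "var": [0.5, 0.5, 0.5]}

def glda_log_likelihood(word, topic):
    # log N(embedding(word) | mean, diag(var)), summed over dimensions
    ll = 0.0
    for x, m, s2 in zip(embedding[word], topic["mean"], topic["var"]):
        ll += -0.5 * (math.log(2 * math.pi * s2) + (x - m) ** 2 / s2)
    return ll

print(lda_log_likelihood("bank", lda_topic))
print(glda_log_likelihood("bank", glda_topic))
print(glda_log_likelihood("river", glda_topic))
```

Because each word has a single fixed embedding, plain GLDA assigns "bank" one point in space regardless of context — the limitation that motivates the hierarchical structure proposed in the paper.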
Pages: 18