Cycling topic graph learning for neural topic modeling

Cited by: 0
Authors
Liu, Yanyan [1 ,2 ,3 ]
Gong, Zhiguo [1 ,2 ,3 ]
Affiliations
[1] Univ Macau, State Key Lab Internet Things Smart City, Macau 999078, Peoples R China
[2] Univ Macau, Guangdong Macau Joint Lab Adv & Intelligent Comp, Macau 999078, Peoples R China
[3] Univ Macau, Dept Comp & Informat Sci, Macau 999078, Peoples R China
Keywords
Neural topic model; Graph neural networks; Wasserstein autoencoder; Graph attention networks;
DOI
10.1016/j.knosys.2024.112905
CLC classification
TP18 [Theory of Artificial Intelligence];
Discipline classification codes
081104; 0812; 0835; 1405;
Abstract
Topic models aim to discover a set of latent topics in a textual corpus. Graph Neural Networks (GNNs) have recently been adopted in Neural Topic Models (NTMs) because of their strong capacity to model document representations over a text graph. Most previous works construct the text graph by treating documents and words as nodes and learn document embeddings from the topology of this graph. However, when graph learning is applied to topic modeling, considering only document-word propagation loses the guidance of topic relevance, and the graph propagation cannot reflect the true relationships at the topic level, which results in inaccurate topic extraction. To address this issue, we propose a novel neural topic model based on Cycling Topic Graph Learning (CyTGL). Specifically, we design a novel three-party document-topic-word graph to incorporate topic propagation into graph-based topic models. In this three-party topic graph, the topic layer is latent, and we recursively extract it during the learning process. Leveraging the topic graph, we employ topic attention message passing to propagate topical information and enhance the document representations. Moreover, the topic layer in the three-party graph can be regarded as prior knowledge that guides topic extraction. Crucially, the hierarchical relationships in the three-party graph are maintained throughout learning. Experiments on several widely used datasets show that our approach outperforms state-of-the-art topic models.
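The abstract describes attention-based message passing over a three-party document-topic-word graph with a latent topic layer. The following is a minimal, illustrative PyTorch sketch of that general idea only, not the authors' implementation: the class name TopicGraphAttention, the single-head dot-product attention, the dense bag-of-words adjacency, and all dimensions are assumptions made here for illustration.

# Illustrative sketch only (assumed names and shapes; not the paper's code):
# one round of attention message passing on a document-topic-word graph.
import torch
import torch.nn as nn


class TopicGraphAttention(nn.Module):
    """Document representations enhanced via a latent topic layer."""

    def __init__(self, dim: int, n_topics: int):
        super().__init__()
        # Latent topic layer; the paper refines it recursively during training,
        # here it is simply a learnable embedding table.
        self.topic_emb = nn.Parameter(torch.randn(n_topics, dim))
        self.q = nn.Linear(dim, dim, bias=False)
        self.k = nn.Linear(dim, dim, bias=False)

    def attend(self, queries, keys, mask=None):
        # Scaled dot-product attention; `mask` restricts attention to graph edges.
        scores = self.q(queries) @ self.k(keys).t() / keys.size(-1) ** 0.5
        if mask is not None:
            scores = scores.masked_fill(mask == 0, float("-inf"))
        return torch.softmax(scores, dim=-1) @ keys

    def forward(self, doc_x, word_x, doc_word_adj):
        # words -> topics: each latent topic aggregates word features.
        topic_x = self.topic_emb + self.attend(self.topic_emb, word_x)
        # topics -> documents: documents absorb topical information.
        doc_from_topic = self.attend(doc_x, topic_x)
        # words -> documents along observed document-word edges.
        doc_from_word = self.attend(doc_x, word_x, mask=doc_word_adj)
        return doc_x + doc_from_topic + doc_from_word, topic_x


if __name__ == "__main__":
    D, V, K, H = 8, 100, 10, 32                       # docs, vocab, topics, hidden dim
    doc_x, word_x = torch.randn(D, H), torch.randn(V, H)
    doc_word_adj = (torch.rand(D, V) > 0.7).float()   # toy bag-of-words adjacency
    enhanced_docs, topics = TopicGraphAttention(H, K)(doc_x, word_x, doc_word_adj)
    print(enhanced_docs.shape, topics.shape)          # (8, 32) and (10, 32)

The two propagation paths in the forward pass mirror the abstract's claim that purely document-word propagation is insufficient: the topic-mediated path injects topic-level relevance into the document representations, while the latent topic embeddings double as prior guidance for topic extraction.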
Pages: 10
Related Papers
50 records in total
  • [21] A Joint Learning Approach for Semi-supervised Neural Topic Modeling
    Chiu, Jeffrey
    Mittal, Rajat
    Tumma, Neehal
    Sharma, Abhishek
    Doshi-Velez, Finale
    PROCEEDINGS OF THE SIXTH WORKSHOP ON STRUCTURED PREDICTION FOR NLP (SPNLP 2022), 2022, : 40 - 51
  • [22] A Topic Modeling Based on Prompt Learning
    Qiu, Mingjie
    Yang, Wenzhong
    Wei, Fuyuan
    Chen, Mingliang
    ELECTRONICS, 2024, 13 (16)
  • [23] A Local Explainability Technique for Graph Neural Topic Models
    Bharathwajan Rajendran
    Chandran G. Vidya
    J. Sanil
    S. Asharaf
    Human-Centric Intelligent Systems, 2024, 4 (1): : 53 - 76
  • [24] Mining technology trends in scientific publications: a graph propagated neural topic modeling approach
    Du, Chenguang
    Yao, Kaichun
    Zhu, Hengshu
    Wang, Deqing
    Zhuang, Fuzhen
    Xiong, Hui
    KNOWLEDGE AND INFORMATION SYSTEMS, 2024, 66 (05) : 3085 - 3114
  • [26] Graph Neural Collaborative Topic Model for Citation Recommendation
    Xie, Qianqian
    Zhu, Yutao
    Huang, Jimin
    Du, Pan
    Nie, Jian-Yun
    ACM TRANSACTIONS ON INFORMATION SYSTEMS, 2022, 40 (03)
  • [27] Hierarchical neural topic modeling with manifold regularization
    Ziye Chen
    Cheng Ding
    Yanghui Rao
    Haoran Xie
    Xiaohui Tao
    Gary Cheng
    Fu Lee Wang
    World Wide Web, 2021, 24 : 2139 - 2160
  • [28] Coherence-Aware Neural Topic Modeling
    Ding, Ran
    Nallapati, Ramesh
    Xiang, Bing
    2018 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2018), 2018, : 830 - 836
  • [29] Leveraging spiking neural networks for topic modeling
    Bialas, Marcin
    Mironczuk, Marcin Michal
    Mandziuk, Jacek
    NEURAL NETWORKS, 2024, 178
  • [30] Neural Topic Modeling with Bidirectional Adversarial Training
    Wang, Rui
    Hu, Xuemeng
    Zhou, Deyu
    He, Yulan
    Xiong, Yuxuan
    Ye, Chenchen
    Xu, Haiyang
    58TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2020), 2020, : 340 - 350