Dynamic Joint Sentiment-Topic Model

Cited by: 36
Authors
He, Yulan [1 ]
Lin, Chenghua [2 ]
Gao, Wei [3 ]
Wong, Kam-Fai [4 ]
Affiliations
[1] Aston Univ, Sch Engn & Appl Sci, Birmingham B4 7ET, W Midlands, England
[2] Open Univ, Knowledge Media Inst, St Andrews, Fife, Scotland
[3] Qatar Fdn, Qatar Comp Res Inst, Doha, Qatar
[4] Chinese Univ Hong Kong, Dept Syst Engn & Engn Management, Hong Kong, Hong Kong, Peoples R China
Funding
UK Engineering and Physical Sciences Research Council (EPSRC)
Keywords
Algorithms; Experimentation; Dynamic joint sentiment-topic model; sentiment analysis; opinion mining; topic model;
DOI
10.1145/2542182.2542188
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Social media data are produced continuously by a large and uncontrolled number of users. The dynamic nature of such data requires the sentiment and topic analysis model to be dynamically updated as well, capturing the most recent language use of sentiments and topics in text. We propose a dynamic Joint Sentiment-Topic model (dJST) which allows the detection and tracking of views of current and recurrent interests and shifts in topic and sentiment. Both topic and sentiment dynamics are captured by assuming that the current sentiment-topic-specific word distributions are generated according to the word distributions at previous epochs. We study three different ways of accounting for such dependency information: (1) a sliding window where the current sentiment-topic word distributions depend on the sentiment-topic-specific word distributions of the last S epochs; (2) a skip model where historical sentiment-topic word distributions are considered by skipping some epochs in between; and (3) a multiscale model where previous long- and short-timescale distributions are taken into consideration. We derive efficient online inference procedures to sequentially update the model with newly arrived data and show the effectiveness of our proposed model on Mozilla add-on reviews crawled between 2007 and 2011.
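The epoch dependency described in the abstract can be sketched as an evolutionary Dirichlet prior over the sentiment-topic word distributions; the notation below (window size S, mixture weights, precision parameter) is illustrative rather than the paper's exact formulation:

% Illustrative sketch of the sliding-window variant: the epoch-t word distribution for
% sentiment label l and topic z is given a Dirichlet prior mixed from the last S epochs.
\begin{align}
\varphi^{t}_{l,z} &\sim \mathrm{Dirichlet}\!\left(\beta^{t}_{l,z}\,\mu^{t}_{l,z}\right), &
\mu^{t}_{l,z,w} &= \sum_{s=1}^{S} \omega^{t}_{l,z,s}\,\hat{\varphi}^{\,t-s}_{l,z,w},
\end{align}

where \hat{\varphi}^{t-s}_{l,z} is the sentiment-topic word distribution estimated at epoch t-s and the weights \omega^{t}_{l,z,s} sum to one over the window. Read this way, the sliding window mixes the last S epochs, the skip model mixes a subsampled set of past epochs, and the multiscale model first averages the estimated distributions over windows of several lengths before mixing them into the prior.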
Pages: 21