Improving Document-Level Sentiment Classification with User-Product Gated Network

Cited by: 0
Authors
Tian, Bing [1]
Zhang, Yong [2]
Xing, Chunxiao [2]
Affiliations
[1] Tsinghua Univ, Inst Internet Ind, RIIT, BNRist, DCST, Beijing, Peoples R China
[2] Tsinghua Univ, Inst Internet Ind, Dept Comp Sci & Technol, RIIT, BNRist, Beijing, Peoples R China
Funding
National Key Research and Development Program of China;
DOI
10.1007/978-3-030-85896-4_31
CLC number
TP18 [Theory of Artificial Intelligence];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
Document-level sentiment classification is a fundamental task in Natural Language Processing (NLP). Previous studies have demonstrated the importance of personalized sentiment classification, which takes user preferences and product characteristics into account when predicting sentiment ratings. State-of-the-art approaches incorporate such information via an attention mechanism, where the attention weights are calculated after the texts have been encoded into low-dimensional vectors with LSTM-based models. However, user and product information may be discarded in the process of generating these semantic representations. In this paper, we propose a novel User-Product gated LSTM network (UP-LSTM), which incorporates user and product information into the LSTM cells while the text representations are being generated. UP-LSTM can therefore dynamically produce user- and product-aware contextual representations of texts. Moreover, we devise a variant of the model that improves training efficiency. We conduct a comprehensive evaluation on three real-world datasets, and the experimental results show that our model outperforms previous approaches by a clear margin.
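The abstract describes gating user and product information directly into the LSTM cell rather than applying it only through post-hoc attention. Below is a minimal, hypothetical PyTorch sketch of one way such a user-product gated cell could look; the class name UPLSTMCell, the additive conditioning of the gates on the user and product embeddings, and all dimensions are illustrative assumptions, not the authors' published implementation.

```python
# Hypothetical sketch of a user-product gated LSTM cell (illustrative only,
# not the authors' released code). The user embedding u and product embedding p
# enter the gate computations alongside the word input x_t, so the cell state
# is conditioned on user/product information at every time step.
import torch
import torch.nn as nn


class UPLSTMCell(nn.Module):
    def __init__(self, input_size, hidden_size, up_size):
        super().__init__()
        # Four gates (input, forget, candidate, output), each conditioned on
        # the word input, the previous hidden state, and the user/product vectors.
        self.W_x = nn.Linear(input_size, 4 * hidden_size, bias=False)
        self.W_h = nn.Linear(hidden_size, 4 * hidden_size, bias=False)
        self.W_u = nn.Linear(up_size, 4 * hidden_size, bias=False)
        self.W_p = nn.Linear(up_size, 4 * hidden_size, bias=True)

    def forward(self, x_t, h_prev, c_prev, u, p):
        gates = self.W_x(x_t) + self.W_h(h_prev) + self.W_u(u) + self.W_p(p)
        i, f, g, o = gates.chunk(4, dim=-1)
        i, f, o = torch.sigmoid(i), torch.sigmoid(f), torch.sigmoid(o)
        g = torch.tanh(g)
        c_t = f * c_prev + i * g          # cell state carries user/product-aware context
        h_t = o * torch.tanh(c_t)         # hidden state serves as the token representation
        return h_t, c_t


# Usage: run the cell over a batch of reviews, conditioning on user/product embeddings.
cell = UPLSTMCell(input_size=300, hidden_size=256, up_size=64)
x = torch.randn(8, 20, 300)               # 8 reviews, 20 tokens, 300-d word vectors
u = torch.randn(8, 64)                    # user embeddings (looked up by user ID in practice)
p = torch.randn(8, 64)                    # product embeddings
h = torch.zeros(8, 256)
c = torch.zeros(8, 256)
for t in range(x.size(1)):
    h, c = cell(x[:, t], h, c, u, p)      # h is a user- and product-aware representation
```

The point of the sketch is that u and p enter every gate computation, so the recurrent state is user- and product-aware from the first token onward instead of only at the attention stage.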
Pages: 397-412
Number of pages: 16
Related articles (50 in total)
  • [41] Bao, Siya; Togawa, Nozomu. Document-level sentiment classification in Japanese by stem-based segmentation with category and data-source information. 2020 IEEE 14TH INTERNATIONAL CONFERENCE ON SEMANTIC COMPUTING (ICSC 2020), 2020: 311-314.
  • [42] Wang, Jingjing; Sun, Changlong; Li, Shoushan; Wang, Jiancheng; Si, Luo; Zhang, Min; Liu, Xiaozhong; Zhou, Guodong. Human-Like Decision Making: Document-level Aspect Sentiment Classification via Hierarchical Reinforcement Learning. 2019 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING AND THE 9TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (EMNLP-IJCNLP 2019): PROCEEDINGS OF THE CONFERENCE, 2019: 5581-5590.
  • [43] Kong, Lingxing; Wang, Jiuliang; Ma, Zheng; Zhou, Qifeng; Zhang, Jianbing; He, Liang; Chen, Jiajun. A Hierarchical Network for Multimodal Document-Level Relation Extraction. THIRTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 38 NO 16, 2024: 18408-18416.
  • [44] Huang, Faliang; Yuan, Changan; Bi, Yingzhou; Lu, Jianbo; Lu, Liqiong; Wang, Xing. Multi-granular document-level sentiment topic analysis for online reviews. APPLIED INTELLIGENCE, 2022, 52 (07): 7723-7733.
  • [45] Truica, Ciprian-Octavian; Apostol, Elena-Simona; Serban, Maria-Luiza; Paschke, Adrian. Topic-Based Document-Level Sentiment Analysis Using Contextual Cues. MATHEMATICS, 2021, 9 (21).
  • [46] Zhang, You; Wang, Jin; Zhang, Xuejie. Conciseness is better: Recurrent attention LSTM model for document-level sentiment analysis. NEUROCOMPUTING, 2021, 462: 101-112.
  • [48] Yang, Su; Deravi, Farzin. Re-Engineered Word Embeddings for Improved Document-Level Sentiment Analysis. APPLIED SCIENCES-BASEL, 2022, 12 (18).
  • [49] Zhou, Ji; Shuang, Kai; An, Zhenzhou; Guo, Jinyu; Loo, Jonathan. Improving document-level event detection with event relation graph. INFORMATION SCIENCES, 2023, 645.
  • [50] Ul Haq, Sami; Rauf, Sadaf Abdul; Shoukat, Arslan; Noor-e-Hira. Improving Document-Level Neural Machine Translation with Domain Adaptation. NEURAL GENERATION AND TRANSLATION, 2020: 225-231.