Multi-Task Multi-Head Attention Memory Network for Fine-Grained Sentiment Analysis

Cited by: 5
Authors
Dai, Zehui [1 ]
Dai, Wei [1 ]
Liu, Zhenhua [1 ]
Rao, Fengyun [1 ]
Chen, Huajie [1 ]
Zhang, Guangpeng [1 ]
Ding, Yadong [1 ]
Liu, Jiyang [1 ]
Affiliation
[1] Gridsum, NLP Group, Beijing, People's Republic of China
Keywords
Fine-grained sentiment analysis; Multi-head Attention Memory; Multi-task learning
DOI
10.1007/978-3-030-32233-5_47
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory];
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Sentiment analysis is widely applied in personalized recommendation, business reputation monitoring, and consumer-driven product design and quality improvement. Fine-grained sentiment analysis, which directly predicts sentiment polarity for multiple pre-defined fine-grained categories in an end-to-end way without having to identify aspect words, is more flexible and effective for real-world applications. Building high-performance fine-grained sentiment analysis models requires the effective use of both shared document-level features and category-specific features, which most existing multi-task models fail to accomplish. In this paper, we propose an effective multi-task neural network for fine-grained sentiment analysis, the Multi-Task Multi-Head Attention Memory Network (MMAM). To make full use of shared document-level features and category-specific features, our framework adopts a multi-head document attention mechanism as a memory that encodes shared document features, and a multi-task attention mechanism that extracts category-specific features. Experiments on two Chinese-language fine-grained sentiment analysis datasets in the restaurant and automotive domains demonstrate that our model consistently outperforms the compared fine-grained sentiment analysis models. We believe that extracting and fully utilizing document-level features to build category-specific features is an effective approach to fine-grained sentiment analysis.
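The abstract describes a two-stage attention architecture: a shared multi-head attention over the document acts as a memory, and per-category attention plus per-category classifiers produce the fine-grained polarity predictions. The following is a minimal, illustrative PyTorch sketch of such an architecture, not the authors' implementation; the class name MMAMSketch, all dimensions, the number of categories and polarities, and the choice of learned per-category query vectors are assumptions made here for illustration only.

```python
# Hypothetical sketch of an MMAM-style model. Names and hyperparameters
# are illustrative assumptions, not the paper's released code.
import torch
import torch.nn as nn


class MMAMSketch(nn.Module):
    """Multi-task multi-head attention memory network (illustrative)."""

    def __init__(self, vocab_size, embed_dim=300, num_heads=6,
                 num_categories=20, num_polarities=4):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # Shared multi-head self-attention over the document serves as the
        # memory encoding document-level features shared across categories.
        self.doc_attention = nn.MultiheadAttention(
            embed_dim, num_heads, batch_first=True)
        # One learned query vector per fine-grained category extracts
        # category-specific features from the shared memory
        # (the multi-task attention part).
        self.category_queries = nn.Parameter(
            torch.randn(num_categories, embed_dim))
        # One small classifier head per category predicts its polarity.
        self.classifiers = nn.ModuleList(
            [nn.Linear(embed_dim, num_polarities)
             for _ in range(num_categories)])

    def forward(self, token_ids):
        # token_ids: (batch, seq_len)
        x = self.embedding(token_ids)            # (B, T, D)
        memory, _ = self.doc_attention(x, x, x)  # shared document memory
        logits = []
        for k, head in enumerate(self.classifiers):
            query = self.category_queries[k].expand(x.size(0), 1, -1)  # (B, 1, D)
            # Attention weights of the k-th category over the shared memory.
            scores = torch.bmm(query, memory.transpose(1, 2))          # (B, 1, T)
            weights = torch.softmax(scores, dim=-1)
            category_repr = torch.bmm(weights, memory).squeeze(1)      # (B, D)
            logits.append(head(category_repr))
        # (B, num_categories, num_polarities); a typical multi-task training
        # objective would sum the cross-entropy losses over all categories.
        return torch.stack(logits, dim=1)
```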
Pages: 609-620
Number of pages: 12