Multi-Head Self-Attention Transformation Networks for Aspect-Based Sentiment Analysis

Cited by: 25
Authors
Lin, Yuming [1 ]
Wang, Chaoqiang [1 ]
Song, Hao [1 ]
Li, You [1 ]
Affiliation
[1] Guilin Univ Elect Technol, Guangxi Key Lab Trusted Software, Guilin 541004, Peoples R China
Source
IEEE ACCESS | 2021, Vol. 9
Funding
National Natural Science Foundation of China
Keywords
Sentiment analysis; Task analysis; Context modeling; Licenses; Feature extraction; Analytical models; Standards; Aspect-based sentiment analysis; self-attention; transformation networks; target-specific transformation;
DOI
10.1109/ACCESS.2021.3049294
CLC number
TP [Automation Technology, Computer Technology]
Subject classification code
0812
Abstract
Aspect-based sentiment analysis (ABSA) aims to analyze the sentiment polarity of an input sentence with respect to a certain aspect. Many existing ABSA methods employ long short-term memory (LSTM) networks and an attention mechanism. However, the attention mechanism models only certain local dependencies of the input information and fails to capture the global dependence of the inputs. Simply improving the attention mechanism also fails to solve the issue of target-sensitive sentiment expression, which has been shown to degrade prediction effectiveness. In this work, we propose multi-head self-attention transformation (MSAT) networks for ABSA tasks, which conduct more effective sentiment analysis with target-specific self-attention and dynamic target representation. Given a set of review sentences, MSAT first applies multi-head target-specific self-attention to better capture global dependence, and introduces a target-sensitive transformation to effectively tackle the problem of target-sensitive sentiment. Second, part-of-speech (POS) features are integrated into MSAT to capture the grammatical features of sentences. A series of experiments carried out on the SemEval 2014 and Twitter datasets shows that the proposed model achieves better effectiveness than several state-of-the-art methods.
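The multi-head self-attention that MSAT builds on can be illustrated with a minimal NumPy sketch of the standard formulation (scaled dot-product attention split across heads). This is a generic illustration, not the paper's exact MSAT architecture: the weight matrices `Wq`, `Wk`, `Wv`, `Wo` and the head count are assumed parameters, and the paper's target-specific and target-sensitive components are omitted.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_self_attention(X, Wq, Wk, Wv, Wo, num_heads):
    """Standard multi-head self-attention.

    X: input sequence of shape (seq_len, d_model)
    Wq, Wk, Wv, Wo: projection matrices of shape (d_model, d_model)
    """
    seq_len, d_model = X.shape
    d_head = d_model // num_heads

    # Linear projections to queries, keys, values.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv

    # Split into heads: (num_heads, seq_len, d_head).
    def split(M):
        return M.reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    Qh, Kh, Vh = split(Q), split(K), split(V)

    # Scaled dot-product attention per head: (num_heads, seq_len, seq_len).
    scores = Qh @ Kh.transpose(0, 2, 1) / np.sqrt(d_head)
    attn = softmax(scores, axis=-1)
    heads = attn @ Vh  # (num_heads, seq_len, d_head)

    # Concatenate heads and apply the output projection.
    concat = heads.transpose(1, 0, 2).reshape(seq_len, d_model)
    return concat @ Wo
```

Because each head attends over the full sequence, every output position mixes information from all input positions, which is the "global dependence" property the abstract contrasts with purely local attention.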
Pages: 8762 - 8770
Page count: 9
Related Papers
50 in total (items 21-30 shown)
  • [21] Adaptive Pruning for Multi-Head Self-Attention
    Messaoud, Walid
    Trabelsi, Rim
    Cabani, Adnane
    Abdelkefi, Fatma
    [J]. ARTIFICIAL INTELLIGENCE AND SOFT COMPUTING, ICAISC 2023, PT II, 2023, 14126 : 48 - 57
  • [22] Epilepsy detection based on multi-head self-attention mechanism
    Ru, Yandong
    An, Gaoyang
    Wei, Zheng
    Chen, Hongming
    [J]. PLOS ONE, 2024, 19 (06):
  • [23] Microblog Sentiment Analysis with Multi-Head Self-Attention Pooling and Multi-Granularity Feature Interaction Fusion
    Yan, Shangyi
    Wang, Jingya
    Liu, Xiaowen
    Cui, Yumeng
    Tao, Zhizhong
    Zhang, Xiaofan
    [J]. Data Analysis and Knowledge Discovery, 2023, 7 (04) : 32 - 45
  • [24] Position-Enhanced Multi-Head Self-Attention Based Bidirectional Gated Recurrent Unit for Aspect-Level Sentiment Classification
    Li, Xianyong
    Ding, Li
    Du, Yajun
    Fan, Yongquan
    Shen, Fashan
    [J]. FRONTIERS IN PSYCHOLOGY, 2022, 12
  • [25] Aspect-based sentiment analysis for online reviews with hybrid attention networks
    Yuming Lin
    Yu Fu
    You Li
    Guoyong Cai
    Aoying Zhou
    [J]. World Wide Web, 2021, 24 : 1215 - 1233
  • [26] Multimodal sentiment analysis based on multi-head attention mechanism
    Xi, Chen
    Lu, Guanming
    Yan, Jingjie
    [J]. ICMLSC 2020: PROCEEDINGS OF THE 4TH INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND SOFT COMPUTING, 2020, : 34 - 39
  • [27] Aspect-based sentiment analysis for online reviews with hybrid attention networks
    Lin, Yuming
    Fu, Yu
    Li, You
    Cai, Guoyong
    Zhou, Aoying
    [J]. WORLD WIDE WEB-INTERNET AND WEB INFORMATION SYSTEMS, 2021, 24 (04): 1215 - 1233
  • [28] Arrhythmia classification algorithm based on multi-head self-attention mechanism
    Wang, Yue
    Yang, Guanci
    Li, Shaobo
    Li, Yang
    He, Ling
    Liu, Dan
    [J]. BIOMEDICAL SIGNAL PROCESSING AND CONTROL, 2023, 79
  • [29] Neural News Recommendation with Multi-Head Self-Attention
    Wu, Chuhan
    Wu, Fangzhao
    Ge, Suyu
    Qi, Tao
    Huang, Yongfeng
    Xie, Xing
    [J]. 2019 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING AND THE 9TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (EMNLP-IJCNLP 2019): PROCEEDINGS OF THE CONFERENCE, 2019, : 6389 - 6394
  • [30] Lane Detection Method Based on Improved Multi-Head Self-Attention
    Ge, Zekun
    Tao, Fazhan
    Fu, Zhumu
    Song, Shuzhong
    [J]. Computer Engineering and Applications, 2024, 60 (02) : 264 - 271