Multi-Head Self-Attention Transformation Networks for Aspect-Based Sentiment Analysis

Cited by: 25
Authors
Lin, Yuming [1 ]
Wang, Chaoqiang [1 ]
Song, Hao [1 ]
Li, You [1 ]
Affiliations
[1] Guilin Univ Elect Technol, Guangxi Key Lab Trusted Software, Guilin 541004, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Sentiment analysis; Task analysis; Context modeling; Licenses; Feature extraction; Analytical models; Standards; Aspect-based sentiment analysis; self-attention; transformation networks; target-specific transformation;
DOI
10.1109/ACCESS.2021.3049294
CLC Number
TP [automation technology; computer technology]
Subject Classification Code
0812
Abstract
Aspect-based sentiment analysis (ABSA) aims to determine the sentiment polarity of an input sentence with respect to a given aspect. Many existing ABSA methods employ long short-term memory (LSTM) networks with an attention mechanism. However, the attention mechanism models only certain local dependencies of the input and fails to capture its global dependencies. Moreover, simply improving the attention mechanism does not solve the issue of target-sensitive sentiment expression, which has been shown to degrade prediction effectiveness. In this work, we propose multi-head self-attention transformation (MSAT) networks for ABSA, which conduct more effective sentiment analysis with target-specific self-attention and dynamic target representation. First, given a set of review sentences, MSAT applies multi-head target-specific self-attention to better capture global dependencies and introduces a target-sensitive transformation to tackle target-sensitive sentiment. Second, part-of-speech (POS) features are integrated into MSAT to capture the grammatical features of sentences. Experiments on the SemEval 2014 and Twitter datasets show that the proposed model achieves better effectiveness than several state-of-the-art methods.
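The multi-head self-attention at the core of MSAT follows the standard scaled dot-product formulation, Attention(Q, K, V) = softmax(QK^T / sqrt(d_k))V, computed per head and concatenated. A minimal NumPy sketch of that standard mechanism is given below; the dimensions and random weights are purely illustrative and do not reflect the paper's actual MSAT configuration or its target-specific extensions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_self_attention(X, W_q, W_k, W_v, W_o, num_heads):
    """Standard multi-head self-attention over a sequence X of shape (seq_len, d_model)."""
    seq_len, d_model = X.shape
    d_head = d_model // num_heads
    Q, K, V = X @ W_q, X @ W_k, X @ W_v          # each (seq_len, d_model)

    # Split projections into heads: (num_heads, seq_len, d_head).
    def split(M):
        return M.reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    Qh, Kh, Vh = split(Q), split(K), split(V)

    # Scaled dot-product attention per head: (num_heads, seq_len, seq_len).
    scores = Qh @ Kh.transpose(0, 2, 1) / np.sqrt(d_head)
    weights = softmax(scores, axis=-1)
    heads = weights @ Vh                          # (num_heads, seq_len, d_head)

    # Concatenate heads and apply the output projection.
    concat = heads.transpose(1, 0, 2).reshape(seq_len, d_model)
    return concat @ W_o

# Illustrative dimensions only (not the paper's settings).
rng = np.random.default_rng(0)
d_model, seq_len, num_heads = 8, 5, 2
X = rng.standard_normal((seq_len, d_model))
W_q, W_k, W_v, W_o = (rng.standard_normal((d_model, d_model)) * 0.1 for _ in range(4))
out = multi_head_self_attention(X, W_q, W_k, W_v, W_o, num_heads)
print(out.shape)  # (5, 8)
```

Because every position attends to every other position, each output row mixes information from the whole sentence — this is the "global dependence" that the abstract contrasts with purely local attention over an LSTM encoding.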
Pages: 8762-8770
Page count: 9
Related Papers
50 records
[31] Parrott, Corey M.; Abueidda, Diab W.; James, Kai A. Multi-Head Self-Attention Generative Adversarial Networks for Multiphysics Topology Optimization. AIAA Journal, 2023, 61(02): 726-738.
[32] Chang, Xinxu; Zhang, Yang; Yang, Lin; Kou, Jinqiao; Wang, Xin; Xu, Dongdong. Speech enhancement method based on the multi-head self-attention mechanism. Xi'an Dianzi Keji Daxue Xuebao/Journal of Xidian University, 2020, 47(01): 104-110.
[33] Pan, Wenhao; Yang, Kai. Enhanced Multi-Head Self-Attention Graph Neural Networks for Session-based Recommendation. Engineering Letters, 2022, 30(01): 37-44.
[34] Sun, Xiaowan; Wang, Ying; Wang, Xin; Sun, Yudong. Aspect-Based Sentiment Analysis Model Based on Dual-Attention Networks. Jisuanji Yanjiu yu Fazhan/Computer Research and Development, 2019, 56(11): 2384-2395.
[35] Zhang, Xiangsen; Wu, Zhongqiang; Liu, Ke; Zhao, Zengshun; Wang, Jinhao; Wu, Chengqin. Text Sentiment Classification Based on BERT Embedding and Sliced Multi-Head Self-Attention Bi-GRU. Sensors, 2023, 23(03).
[36] Liu, Ning; Shen, Bo; Zhang, Zhenjiang; Zhang, Zhiyuan; Mi, Kun. Attention-based Sentiment Reasoner for aspect-based sentiment analysis. Human-centric Computing and Information Sciences, 2019, 9(01).
[37] Li, Xiaowen; Lu, Ran; Liu, Peiyu; Zhu, Zhenfang. Graph convolutional networks with hierarchical multi-head attention for aspect-level sentiment classification. Journal of Supercomputing, 2022, 78(13): 14846-14865.
[39] Long, Fei; Zhou, Kai; Ou, Weihua. Sentiment Analysis of Text Based on Bidirectional LSTM With Multi-Head Attention. IEEE Access, 2019, 7: 141960-141969.
[40] Zhang, Dianyuan; Zhu, Zhenfang; Lu, Qiang; Pei, Hongli; Wu, Wenqing; Guo, Qiangqiang. Multiple Interactive Attention Networks for Aspect-Based Sentiment Classification. Applied Sciences-Basel, 2020, 10(06): 1-15.