Convolutional Multi-Head Self-Attention on Memory for Aspect Sentiment Classification

Cited by: 0
Authors
Yaojie Zhang [1]
Bing Xu [1]
Tiejun Zhao [1]
Affiliation
[1] Laboratory of Machine Intelligence and Translation, Department of Computer Science, Harbin Institute of Technology
Keywords
Aspect sentiment classification; deep learning; memory network; sentiment analysis (SA)
DOI
Not available
CLC Number
TP391.1 [Text Information Processing]
Subject Classification Codes
081203; 0835
Abstract
This paper presents a method for aspect-based sentiment classification, named the convolutional multi-head self-attention memory network (CMA-MemNet). It improves on memory networks, making it possible to extract richer and more complex semantic information from sequences and aspects. To address the memory network's inability to capture word-level contextual information, we propose using convolution to capture n-gram grammatical information. We use multi-head self-attention to compensate for the memory network's neglect of the semantic information of the sequence itself. Meanwhile, unlike most recurrent neural network (RNN) models such as the long short-term memory (LSTM) and gated recurrent unit (GRU), our network retains parallelism. We experiment on the open datasets SemEval-2014 Task 4 and SemEval-2016 Task 6. Compared with popular baseline methods, our model achieves excellent performance.
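To make the architecture described in the abstract concrete, the following is a minimal sketch of how a convolutional memory combined with multi-head self-attention could be wired up in PyTorch. It is not the authors' implementation: the class name ConvMultiHeadMemory, the hyperparameters (300-dimensional embeddings, 6 heads, kernel size 3), and the simple dot-product aspect attention at the end are illustrative assumptions.

# Illustrative sketch only, not the paper's released code. It shows the two ideas the
# abstract combines: a 1-D convolution that turns word embeddings into n-gram features,
# followed by multi-head self-attention over that convolutional "memory", and finally an
# aspect-conditioned weighting of the memory for classification.
import torch
import torch.nn as nn


class ConvMultiHeadMemory(nn.Module):
    def __init__(self, embed_dim=300, n_heads=6, kernel_size=3, n_classes=3):
        super().__init__()
        # Convolution over the token dimension captures n-gram (word-level context) information.
        self.conv = nn.Conv1d(embed_dim, embed_dim, kernel_size, padding=kernel_size // 2)
        # Multi-head self-attention lets every memory slot attend to the whole sequence in parallel.
        self.self_attn = nn.MultiheadAttention(embed_dim, n_heads, batch_first=True)
        self.classifier = nn.Linear(embed_dim, n_classes)

    def forward(self, tokens, aspect):
        # tokens: (batch, seq_len, embed_dim) word embeddings of the sentence
        # aspect: (batch, embed_dim) pooled embedding of the aspect term
        memory = self.conv(tokens.transpose(1, 2)).transpose(1, 2)   # n-gram memory
        memory, _ = self.self_attn(memory, memory, memory)           # contextualised memory
        # Simple dot-product attention from the aspect to the memory (an assumption of this sketch).
        scores = torch.softmax(torch.bmm(memory, aspect.unsqueeze(2)), dim=1)
        summary = (scores * memory).sum(dim=1)                       # aspect-aware sentence summary
        return self.classifier(summary)                              # sentiment logits


if __name__ == "__main__":
    model = ConvMultiHeadMemory()
    sentence = torch.randn(2, 20, 300)    # toy batch: 2 sentences, 20 tokens, 300-d embeddings
    aspect = torch.randn(2, 300)          # toy aspect representations
    print(model(sentence, aspect).shape)  # -> torch.Size([2, 3])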
Pages: 1038-1044
Page count: 7