Convolutional Multi-Head Self-Attention on Memory for Aspect Sentiment Classification

Cited by: 0
Authors
Yaojie Zhang [1 ]
Bing Xu [1 ]
Tiejun Zhao [1 ]
Affiliations
[1] the Laboratory of Machine Intelligence and Translation, Department of Computer Science, Harbin Institute of Technology
Keywords
Aspect sentiment classification; deep learning; memory network; sentiment analysis (SA)
DOI
Not available
CLC number
TP391.1 [Text Information Processing]
Subject classification
081203; 0835
Abstract
This paper presents a method for aspect-based sentiment classification, named the convolutional multi-head self-attention memory network (CMA-MemNet). It is an improved model based on memory networks that makes it possible to extract richer, more complex semantic information from sequences and aspects. To fix the memory network's inability to capture word-level, context-related information, we propose using convolution to capture n-gram grammatical information. We use multi-head self-attention to compensate for the memory network's neglect of the semantic information of the sequence itself. Meanwhile, unlike most recurrent neural network (RNN), long short-term memory (LSTM), and gated recurrent unit (GRU) models, we retain the parallelism of the network. We experiment on the open datasets SemEval-2014 Task 4 and SemEval-2016 Task 6. Compared with some popular baseline methods, our model performs excellently.
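The pipeline the abstract describes (a convolution over word embeddings to extract n-gram features, followed by multi-head self-attention over those features) can be sketched roughly as below. This is a minimal illustration of the two building blocks, not the authors' implementation: all function names, shapes, and the simplified identity Q/K/V projections are assumptions for brevity.

```python
import numpy as np

def conv1d_ngrams(x, w, b):
    """Extract n-gram features with a 1-D (valid) convolution over time.

    x: (seq_len, d_in) word embeddings; w: (k, d_in, d_out) filters; b: (d_out,).
    Each output position t summarizes the k-gram x[t:t+k] (illustrative sketch).
    """
    k = w.shape[0]
    seq_len = x.shape[0]
    out = np.stack([
        np.tensordot(x[t:t + k], w, axes=([0, 1], [0, 1])) + b
        for t in range(seq_len - k + 1)
    ])
    return np.maximum(out, 0.0)  # ReLU non-linearity

def multi_head_self_attention(x, n_heads):
    """Scaled dot-product self-attention with n_heads parallel heads.

    x: (seq_len, d_model), d_model divisible by n_heads. For brevity each head
    uses an identity Q/K/V projection on its slice (a real model learns these).
    """
    seq_len, d_model = x.shape
    d_head = d_model // n_heads
    heads = []
    for h in range(n_heads):
        q = k = v = x[:, h * d_head:(h + 1) * d_head]
        scores = q @ k.T / np.sqrt(d_head)
        # numerically stable row-wise softmax
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights /= weights.sum(axis=-1, keepdims=True)
        heads.append(weights @ v)          # (seq_len, d_head) per head
    return np.concatenate(heads, axis=-1)  # (seq_len, d_model)
```

Because neither block involves recurrence, every time step is computed independently, which is the parallelism the abstract contrasts with RNN/LSTM/GRU models.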
Pages: 1038-1044 (7 pages)