Sentiment Analysis of Text Based on Bidirectional LSTM With Multi-Head Attention

Cited: 47
Authors
Long, Fei [1 ,2 ]
Zhou, Kai [2 ]
Ou, Weihua [3 ]
Affiliations
[1] Guizhou Inst Technol, Sch Elect Engn, Guiyang 550003, Guizhou, Peoples R China
[2] Guizhou Univ, Coll Big Data & Informat Engn, Guiyang 550025, Guizhou, Peoples R China
[3] Guizhou Normal Univ, Sch Big Data & Comp Sci, Guiyang 550025, Guizhou, Peoples R China
Source
IEEE ACCESS | 2019 / Vol. 7
Funding
National Natural Science Foundation of China;
Keywords
Sentiment analysis; Training; Encoding; Task analysis; Dictionaries; Semantics; Social networking (online); Chinese product reviews text; bidirectional LSTM; multi-head attention mechanism; NEURAL-NETWORKS;
DOI
10.1109/ACCESS.2019.2942614
Chinese Library Classification (CLC)
TP [Automation technology, computer technology];
Discipline code
0812;
Abstract
In recent years, many researchers have addressed text sentiment analysis of social media due to the exponential growth of social multimedia content. Natural-language ambiguities and indirect sentiment in social media text make it hard to classify with traditional machine learning approaches such as support vector machines, naive Bayes, and hybrid models. This article investigates sentiment analysis of Chinese social media text by combining Bidirectional Long Short-Term Memory (BiLSTM) networks with a Multi-Head Attention (MHAT) mechanism, in order to overcome the deficiencies of sentiment analysis performed with traditional machine learning. BiLSTM networks not only solve the long-term dependency problem but also capture the actual context of the text. Because the MHAT mechanism can learn relevant information from different representation subspaces through multiple parallel computations, it is used to add influence weights to the constructed text sequence. Numerical experiments show that the proposed model achieves better performance than existing well-established methods.
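The abstract's core idea, multi-head attention applied to a sequence of BiLSTM hidden states to produce per-position influence weights, can be illustrated with a toy NumPy sketch. This is not the authors' code: the dimensions, random (untrained) projection matrices, and function names are assumptions chosen only to show the mechanism of scaled dot-product attention computed independently in several subspaces.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: subtract the row max before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(X, num_heads, rng):
    """Toy multi-head self-attention over X of shape (seq_len, d_model)."""
    seq_len, d_model = X.shape
    assert d_model % num_heads == 0
    d_k = d_model // num_heads
    heads = []
    for _ in range(num_heads):
        # Each head projects the input into its own low-dimensional subspace.
        Wq = rng.standard_normal((d_model, d_k)) / np.sqrt(d_model)
        Wk = rng.standard_normal((d_model, d_k)) / np.sqrt(d_model)
        Wv = rng.standard_normal((d_model, d_k)) / np.sqrt(d_model)
        Q, K, V = X @ Wq, X @ Wk, X @ Wv
        # Scaled dot-product attention: each row is a distribution of
        # "influence weights" over the sequence positions.
        weights = softmax(Q @ K.T / np.sqrt(d_k))
        heads.append(weights @ V)
    # Concatenating the heads restores the model dimension.
    return np.concatenate(heads, axis=-1)

rng = np.random.default_rng(0)
H = rng.standard_normal((5, 8))   # stand-in for 5 BiLSTM hidden states of size 8
out = multi_head_attention(H, num_heads=2, rng=rng)
print(out.shape)                  # (5, 8)
```

In the paper's setting, the projection matrices would be learned jointly with the BiLSTM, and the attended representation would feed a classification layer; here they are random purely to demonstrate the shape and flow of the computation.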
Pages: 141960-141969
Page count: 10
Related Papers
50 records total
  • [1] Emotion-Semantic-Enhanced Bidirectional LSTM with Multi-Head Attention Mechanism for Microblog Sentiment Analysis
    Wang, Shaoxiu
    Zhu, Yonghua
    Gao, Wenjing
    Cao, Meng
    Li, Mengyao
    [J]. INFORMATION, 2020, 11 (05)
  • [2] Sarcasm Detection Using Multi-Head Attention Based Bidirectional LSTM
    Kumar, Avinash
    Narapareddy, Vishnu Teja
    Aditya Srikanth, Veerubhotla
    Malapati, Aruna
    Neti, Lalita Bhanu Murthy
    [J]. IEEE ACCESS, 2020, 8 : 6388 - 6397
  • [3] Short Text Sentiment Analysis Based on Multi-Channel CNN With Multi-Head Attention Mechanism
    Feng, Yue
    Cheng, Yan
    [J]. IEEE ACCESS, 2021, 9 : 19854 - 19863
  • [4] Multimodal sentiment analysis based on multi-head attention mechanism
    Xi, Chen
    Lu, Guanming
    Yan, Jingjie
    [J]. ICMLSC 2020: PROCEEDINGS OF THE 4TH INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND SOFT COMPUTING, 2020, : 34 - 39
  • [5] Sentiment Analysis with An Integrated Model of BERT and Bi-LSTM Based on Multi-Head Attention Mechanism
    Wang, Yahui
    Cheng, Xiaoqing
    Meng, Xuelei
    [J]. IAENG International Journal of Computer Science, 2023, 50 (01)
  • [6] The sentiment analysis model with multi-head self-attention and Tree-LSTM
    Li Lei
    Pei Yijian
    Jin Chenyang
    [J]. SIXTH INTERNATIONAL WORKSHOP ON PATTERN RECOGNITION, 2021, 11913
  • [7] Sentiment Analysis Using Multi-Head Attention Capsules With Multi-Channel CNN and Bidirectional GRU
    Cheng, Yan
    Sun, Huan
    Chen, Haomai
    Li, Meng
    Cai, Yingying
    Cai, Zhuang
    Huang, Jing
    [J]. IEEE ACCESS, 2021, 9 : 60383 - 60395
  • [8] Multi-head attention model for aspect level sentiment analysis
    Zhang, Xinsheng
    Gao, Teng
    [J]. JOURNAL OF INTELLIGENT & FUZZY SYSTEMS, 2020, 38 (01) : 89 - 96
  • [9] Deep Multi-Head Attention Network for Aspect-Based Sentiment Analysis
    Yan, Danfeng
    Chen, Jiyuan
    Cui, Jianfei
    Shan, Ao
    Shi, Wenting
    [J]. 2019 IEEE INTERNATIONAL CONFERENCE ON BIG DATA (BIG DATA), 2019, : 695 - 700
  • [10] Automatic scene generation using sentiment analysis and bidirectional recurrent neural network with multi-head attention
    Dharaniya, R.
    Indumathi, J.
    Uma, G. V.
    [J]. NEURAL COMPUTING & APPLICATIONS, 2022, 34 (19): : 16945 - 16958