Microblog Sentiment Analysis Based on Dynamic Character-Level and Word-Level Features and Multi-Head Self-Attention Pooling

Cited by: 4
Authors
Yan, Shangyi [1 ]
Wang, Jingya [1 ]
Song, Zhiqiang [1 ]
Affiliations
[1] Peoples Publ Secur Univ China, Coll Informat & Cyber Secur, Beijing 100038, Peoples R China
Source
FUTURE INTERNET | 2022, Vol. 14, Issue 08
Keywords
dynamic character and word encoding; multi-head self-attention pooling; multi-granularity feature interactive fusion; microblog sentiment analysis; CNN-BIGRU;
DOI
10.3390/fi14080234
CLC number
TP [automation technology, computer technology];
Subject classification code
0812 ;
Abstract
To address the shortcomings of existing deep learning models and the characteristics of microblog language, we propose the DCCMM model to improve the effectiveness of microblog sentiment analysis. The model employs WOBERT Plus and ALBERT to dynamically encode character-level and word-level text, respectively. A convolution operation then extracts local key features, while cross-channel feature fusion and multi-head self-attention pooling extract global semantic information and select key information, before a multi-granularity feature interaction fusion operation fuses the character-level and word-level semantics. Finally, the Softmax function outputs the result. On the weibo_senti_100k dataset, the accuracy and F1 score of DCCMM improve by 0.84% and 1.01%, respectively, over the best-performing comparison model. On the SMP2020-EWECT dataset, accuracy and F1 improve by 1.22% and 1.80%, respectively, over the best-performing comparison model. These results show that DCCMM outperforms existing advanced sentiment analysis models.
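The abstract's "multi-head self-attention pooling" step collapses a variable-length token sequence into one fixed-size vector before fusion. The sketch below is one plausible NumPy reading of that operation, not the paper's exact parameterization: the scoring vector `W`, the head split, and all dimensions are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_self_attention_pool(H, W, num_heads):
    """Pool token vectors into one fixed-size vector.

    H: (seq_len, d_model) token representations from an encoder.
    W: (d_model,) learned scoring weights (illustrative assumption;
       the paper's parameterization may differ).
    """
    seq_len, d_model = H.shape
    d_head = d_model // num_heads
    heads = H.reshape(seq_len, num_heads, d_head)   # split dims into heads
    w = W.reshape(num_heads, d_head)
    scores = np.einsum('snd,nd->sn', heads, w)      # score per token per head
    alpha = softmax(scores, axis=0)                 # attention over tokens
    pooled = np.einsum('sn,snd->nd', alpha, heads)  # weighted sum per head
    return pooled.reshape(num_heads * d_head)       # concatenate heads

rng = np.random.default_rng(0)
H = rng.normal(size=(12, 8))   # 12 tokens, model dim 8
W = rng.normal(size=8)
v = multi_head_self_attention_pool(H, W, num_heads=2)
print(v.shape)  # (8,)
```

Because each head attends over the sequence independently, different heads can emphasize different tokens before concatenation, which is the usual motivation for the multi-head variant over single-vector attention pooling.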
Pages: 19
Related Papers
50 records in total
  • [31] Emotion-semantic-enhanced bidirectional LSTM with multi-head attention mechanism for microblog sentiment analysis
    Wang S.; Zhu Y.; Gao W.; Cao M.; Li M.
    [J]. Zhu, Yonghua (zyh@shu.edu.cn), 1600, MDPI AG (11):
  • [32] Joint extraction of entities and relations based on character graph convolutional network and Multi-Head Self-Attention Mechanism
    Meng, Zhao; Tian, Shengwei; Yu, Long; Lv, Yalong
    [J]. JOURNAL OF EXPERIMENTAL & THEORETICAL ARTIFICIAL INTELLIGENCE, 2021, 33 (02) : 349 - 362
  • [33] Aspect-Level Sentiment Analysis Based on Self-Attention and Graph Convolutional Network
    Chen K.; Huang C.; Lin H.
    [J]. Beijing Youdian Daxue Xuebao/Journal of Beijing University of Posts and Telecommunications, 2024, 47 (01) : 127 - 132
  • [34] Sentiment Analysis of Text Based on Bidirectional LSTM With Multi-Head Attention
    Long, Fei; Zhou, Kai; Ou, Weihua
    [J]. IEEE ACCESS, 2019, 7 : 141960 - 141969
  • [35] A Multi-tab Webpage Fingerprinting Method Based on Multi-head Self-attention
    Xie, Lixia; Li, Yange; Yang, Hongyu; Hu, Ze; Wang, Peng; Cheng, Xiang; Zhang, Liang
    [J]. FRONTIERS IN CYBER SECURITY, FCS 2023, 2024, 1992 : 131 - 140
  • [36] Text summarization based on multi-head self-attention mechanism and pointer network
    Qiu, Dong; Yang, Bing
    [J]. COMPLEX & INTELLIGENT SYSTEMS, 2022, 8 (01) : 555 - 567
  • [37] Lip Recognition Based on Bi-GRU with Multi-Head Self-Attention
    Ni, Ran; Jiang, Haiyang; Zhou, Lu; Lu, Yuanyao
    [J]. ARTIFICIAL INTELLIGENCE APPLICATIONS AND INNOVATIONS, PT III, AIAI 2024, 2024, 713 : 99 - 110
  • [39] Multi-head Self-attention Recommendation Model based on Feature Interaction Enhancement
    Yin, Yunfei; Huang, Caihao; Sun, Jingqin; Huang, Faliang
    [J]. IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS (ICC 2022), 2022, : 1740 - 1745
  • [40] Deep Bug Triage Model Based on Multi-head Self-attention Mechanism
    Yu, Xu; Wan, Fayang; Tang, Bin; Zhan, Dingjia; Peng, Qinglong; Yu, Miao; Wang, Zhaozhe; Cui, Shuang
    [J]. COMPUTER SUPPORTED COOPERATIVE WORK AND SOCIAL COMPUTING, CHINESECSCW 2021, PT II, 2022, 1492 : 107 - 119