Emoji multimodal microblog sentiment analysis based on mutual attention mechanism

Cited by: 1
Authors
Lou, Yinxia [1 ]
Zhou, Junxiang [2 ]
Zhou, Jun [3 ]
Ji, Donghong [3 ]
Zhang, Qing [4 ]
Affiliations
[1] Jianghan Univ, Sch Artificial Intelligence, Wuhan 430056, Peoples R China
[2] Shangqiu Normal Univ, Sch Informat Technol, Shangqiu 476000, Peoples R China
[3] Wuhan Univ, Sch Cyber Sci & Engn, Key Lab Aerosp Informat Secur & Trusted Comp, Minist Educ, Wuhan 430072, Peoples R China
[4] North China DEAN Power Engn Beijing Co Ltd, Beijing 100120, Peoples R China
Source
SCIENTIFIC REPORTS | 2024, Vol. 14, No. 1
Keywords
Emoji; Mutual attention mechanism; Multimodal sentiment analysis; Multimodal fusion
DOI
10.1038/s41598-024-80167-x
Chinese Library Classification
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences]
Subject Classification
07; 0710; 09
Abstract
Emojis use visual means to mimic human facial expressions and postures, conveying emotions and opinions. They are widely used on social media platforms such as Sina Weibo and have become a crucial feature for sentiment analysis. However, existing approaches often treat emojis as special symbols or convert them into text labels, thereby neglecting their rich visual information. We propose a novel multimodal information integration model for emoji-based microblog sentiment analysis. To effectively leverage the visual information of emojis, the model employs a text-emoji visual mutual attention mechanism. Experiments on a manually annotated microblog dataset show that, compared with baseline models that do not incorporate emoji visual information, the proposed model improves macro F1 score by 1.37% and accuracy by 2.30%. To facilitate related research, our corpus will be made publicly available at https://github.com/yx100/Emojis/blob/main/weibo-emojis-annotation.
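The abstract describes a text-emoji visual mutual attention mechanism that fuses microblog text with the visual appearance of emojis. As a rough illustration only, not the authors' released code, the following PyTorch sketch shows one common way such bidirectional cross-attention fusion can be wired, assuming the text and emoji images have already been encoded into feature sequences of the same hidden size; the class name, parameter names, and pooling choice are hypothetical.

import torch
import torch.nn as nn

class TextEmojiMutualAttention(nn.Module):
    """Hypothetical bidirectional (mutual) cross-attention between text and emoji-image features."""
    def __init__(self, hidden_dim=768, num_heads=8):
        super().__init__()
        # Text queries attend over emoji visual features, and vice versa.
        self.text_to_emoji = nn.MultiheadAttention(hidden_dim, num_heads, batch_first=True)
        self.emoji_to_text = nn.MultiheadAttention(hidden_dim, num_heads, batch_first=True)
        self.text_norm = nn.LayerNorm(hidden_dim)
        self.emoji_norm = nn.LayerNorm(hidden_dim)

    def forward(self, text_feats, emoji_feats):
        # text_feats:  (batch, text_len, hidden_dim), e.g. token embeddings from a text encoder
        # emoji_feats: (batch, emoji_len, hidden_dim), e.g. CNN/ViT features of emoji images
        text_ctx, _ = self.text_to_emoji(text_feats, emoji_feats, emoji_feats)
        emoji_ctx, _ = self.emoji_to_text(emoji_feats, text_feats, text_feats)
        text_out = self.text_norm(text_feats + text_ctx)       # residual + layer norm
        emoji_out = self.emoji_norm(emoji_feats + emoji_ctx)   # residual + layer norm
        # Mean-pool each modality and concatenate as input to a downstream sentiment classifier.
        return torch.cat([text_out.mean(dim=1), emoji_out.mean(dim=1)], dim=-1)

# Example usage with random features (shapes only for illustration):
# fused = TextEmojiMutualAttention()(torch.randn(2, 32, 768), torch.randn(2, 4, 768))
# fused.shape -> torch.Size([2, 1536])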
Pages: 12
Related Papers
50 records in total
  • [21] MULTIMODAL HYPERGRAPH LEARNING FOR MICROBLOG SENTIMENT PREDICTION
    Chen, Fuhai
    Gao, Yue
    Cao, Donglin
    Ji, Rongrong
    2015 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA & EXPO (ICME), 2015,
  • [23] Enhancing aspect-based sentiment analysis with dependency-attention GCN and mutual assistance mechanism
    Feng, Jialin
    Li, Hong
    Yu, Zhiyi
    JOURNAL OF INTELLIGENT INFORMATION SYSTEMS, 2024, 62 (01) : 163 - 189
  • [24] Sentiment tendency analysis based on semantic for microblog
    Tan, C.
    Centre for Environment Social and Economic Research, (51)
  • [25] Multimodal Sentiment Analysis Model Integrating Multi-features and Attention Mechanism
    Lyu X.
    Tian C.
    Zhang L.
    Du Y.
    Zhang X.
    Cai Z.
    Data Analysis and Knowledge Discovery, 2024, 8 (05) : 91 - 101
  • [26] Microblog Sentiment Analysis Model Based on Emoticons
    Pei, Shaojie
    Zhang, Lumin
    Li, Aiping
    WEB TECHNOLOGIES AND APPLICATIONS, APWEB 2014, PT II, 2014, 8710 : 127 - 135
  • [27] Microblog Sentiment Analysis Based on Paragraph Vectors
    Hu, Chengcheng
    Song, Xuliang
    JOURNAL OF COMPUTERS, 2016, 11 (01) : 83 - 90
  • [28] The Weighted Cross-Modal Attention Mechanism With Sentiment Prediction Auxiliary Task for Multimodal Sentiment Analysis
    Chen, Qiupu
    Huang, Guimin
    Wang, Yabing
    IEEE-ACM TRANSACTIONS ON AUDIO SPEECH AND LANGUAGE PROCESSING, 2022, 30 : 2689 - 2695
  • [29] Attention fusion network for multimodal sentiment analysis
    Yuanyi Luo
    Rui Wu
    Jiafeng Liu
    Xianglong Tang
    Multimedia Tools and Applications, 2024, 83 : 8207 - 8217
  • [30] Sentiment Analysis Model Based on Structure Attention Mechanism
    Lin, Kai
    Lin, Dazhen
    Cao, Donglin
    ADVANCES IN COMPUTATIONAL INTELLIGENCE SYSTEMS, 2018, 650 : 17 - 27