Multi-Modal Emotion Recognition for Online Education Using Emoji Prompts

Citations: 0
Authors
Qin, Xingguo [1 ]
Zhou, Ya [1 ]
Li, Jun [1 ,2 ]
Affiliations
[1] Guilin Univ Elect Technol, Sch Comp Sci & Informat Secur, Guilin 541004, Peoples R China
[2] Guangxi Key Lab Image & Graph Intelligent Proc, Guilin 541004, Peoples R China
Source
APPLIED SCIENCES-BASEL | 2024 / Vol. 14 / Iss. 12
Funding
National Natural Science Foundation of China;
Keywords
emotion recognition; emoji prompt; online education; multi-modal;
DOI
10.3390/app14125146
CLC Number
O6 [Chemistry];
Discipline Code
0703;
Abstract
Online education review data carry strong statistical and predictive power, but efficient and accurate methods for analyzing them are lacking. In this paper, we propose a multi-modal emotion analysis method for the online education of college students based on educational data. Specifically, we design a multi-modal emotion analysis method that combines text and emoji data, using pre-trained emotional prompt learning to enhance sentiment polarity. We also analyze whether this fusion model reflects the true emotional polarity. The conducted experiments show that our multi-modal emotion analysis method achieves good performance on several datasets, and that multi-modal emotional prompt methods more accurately reflect emotional expressions in online education data.
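The abstract describes fusing a text channel and an emoji channel under an emotional prompt, but the record does not give the architecture. The following is only an illustrative sketch of that idea; the prompt template, the toy polarity lexicons, and the fusion weight `alpha` are all hypothetical and not taken from the paper:

```python
# Illustrative sketch only: toy multi-modal sentiment fusion of a text
# channel and an emoji channel, loosely following the emotional-prompt
# idea in the abstract. Lexicons and weights are hypothetical.

TEXT_POLARITY = {"great": 1.0, "helpful": 0.8, "boring": -0.7, "confusing": -0.9}
EMOJI_POLARITY = {"😀": 1.0, "👍": 0.8, "😞": -0.8, "😡": -1.0}

def prompt_wrap(review: str) -> str:
    """Emotional prompt template: frames the review as a sentiment query."""
    return f"The student felt [MASK] about the course. Review: {review}"

def channel_score(tokens, lexicon):
    """Average polarity of the tokens found in the channel's lexicon."""
    hits = [lexicon[t] for t in tokens if t in lexicon]
    return sum(hits) / len(hits) if hits else 0.0

def fused_polarity(review: str, alpha: float = 0.6) -> float:
    """Weighted fusion: alpha on the text channel, (1 - alpha) on emojis."""
    text_score = channel_score(review.lower().split(), TEXT_POLARITY)
    emoji_score = channel_score(list(review), EMOJI_POLARITY)
    return alpha * text_score + (1 - alpha) * emoji_score

print(fused_polarity("The lectures were great 👍"))
```

In the actual paper the two channels would presumably be embeddings from a pre-trained model rather than lexicon lookups; this sketch only makes the fusion step concrete.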
Pages: 13
Related Papers
50 records in total
  • [41] DriveSense: A Multi-modal Emotion Recognition and Regulation System for a Car Driver
    Zhu, Lei
    Zhong, Zhinan
    Dai, Wan
    Chen, Yunfei
    Zhang, Yan
    Chen, Mo
    [J]. HCI IN MOBILITY, TRANSPORT, AND AUTOMOTIVE SYSTEMS, MOBITAS 2024, PT I, 2024, 14732 : 82 - 97
  • [42] EXTRACTING AND RECOGNISING MUSIC FEATURES THROUGH MULTI-MODAL EMOTION RECOGNITION
    Xu, Chi
    [J]. MECHATRONIC SYSTEMS AND CONTROL, 2024, 52 (03): : 140 - 146
  • [43] A novel transformer autoencoder for multi-modal emotion recognition with incomplete data
    Cheng, Cheng
    Liu, Wenzhe
    Fan, Zhaoxin
    Feng, Lin
    Jia, Ziyu
    [J]. Neural Networks, 2024, 172
  • [44] A Unified Biosensor–Vision Multi-Modal Transformer network for emotion recognition
    Ali, Kamran
    Hughes, Charles E.
    [J]. Biomedical Signal Processing and Control, 2025, 102
  • [46] Advancements in EEG Emotion Recognition: Leveraging Multi-Modal Database Integration
    Roshdy, Ahmed
    Karar, Abdullah
    Al Kork, Samer
    Beyrouthy, Taha
    Nait-ali, Amine
    [J]. APPLIED SCIENCES-BASEL, 2024, 14 (06):
  • [47] Research of Multi-modal Emotion Recognition Based on Voice and Video Images
    Wang, Chuanyu
    Li, Weixiang
    Chen, Zhenhuan
    [J]. Computer Engineering and Applications, 2024, 57 (23) : 163 - 170
  • [48] MULTI-MODAL EMOTION RECOGNITION WITH SELF-GUIDED MODALITY CALIBRATION
    Hou, Mixiao
    Zhang, Zheng
    Lu, Guangming
    [J]. 2022 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2022, : 4688 - 4692
  • [49] Emotion recognition based on multi-modal physiological signals and transfer learning
    Fu, Zhongzheng
    Zhang, Boning
    He, Xinrun
    Li, Yixuan
    Wang, Haoyuan
    Huang, Jian
    [J]. FRONTIERS IN NEUROSCIENCE, 2022, 16
  • [50] Using Big Data for Emotionally Intelligent Mobile Services Through Multi-modal Emotion Recognition
    Baimbetov, Yerzhan
    Khalil, Ismail
    Steinbauer, Matthias
    Anderst-Kotsis, Gabriele
    [J]. INCLUSIVE SMART CITIES AND E-HEALTH, 2015, 9102 : 127 - 138