Knowledge-based BERT word embedding fine-tuning for emotion recognition

Cited by: 7
|
Authors
Zhu, Zixiao [1 ]
Mao, Kezhi [2 ]
Affiliations
[1] Nanyang Technol Univ, Inst Catastrophe Risk Management, Interdisciplinary Grad Programme, Singapore 639798, Singapore
[2] Nanyang Technol Univ, Sch Elect & Elect Engn, Singapore 639798, Singapore
Funding
National Research Foundation, Singapore;
Keywords
Emotion recognition; Word embedding fine-tuning; BERT;
DOI
10.1016/j.neucom.2023.126488
CLC classification number
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Emotion recognition has received considerable attention in recent years with the popularity of social media. It is noted, however, that state-of-the-art language models such as Bidirectional Encoder Representations from Transformers (BERT) may not produce the best performance in emotion recognition. We found that the main cause of the problem is that the embeddings of emotional words from the pre-trained BERT model may not exhibit high between-class difference and within-class similarity. While BERT model fine-tuning is a common practice when the model is applied to specific tasks, this may not be practical in emotion recognition because most datasets are small and some texts are short and noisy, without containing much useful contextual information. In this paper, we propose to use the knowledge of an emotion vocabulary to fine-tune the embeddings of emotional words. As a separate module independent of the embedding learning model, the fine-tuning model aims to produce emotional word embeddings with improved within-class similarity and between-class difference. By combining the emotionally discriminative fine-tuned embeddings with the contextual information-rich embeddings from the pre-trained BERT model, the emotional features underlying the texts can be more effectively captured in the subsequent feature learning module, which in turn leads to improved emotion recognition performance. The knowledge-based word embedding fine-tuning model is tested on five emotion recognition datasets, and the results and analysis demonstrate the effectiveness of the proposed method. © 2023 Elsevier B.V. All rights reserved.
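The abstract describes a two-part representation: a knowledge-based module maps emotional words to class-discriminative vectors, which are then combined with context-rich embeddings from a pre-trained model. A minimal sketch of that idea is given below; the toy lexicon, vector dimensions, and concatenation-based fusion are illustrative assumptions, not the paper's actual implementation.

```python
# Toy emotion lexicon mapping words to emotion classes (illustrative assumption).
EMOTION_LEXICON = {"joyful": "joy", "delighted": "joy",
                   "furious": "anger", "irritated": "anger"}
CLASSES = ["joy", "anger"]

def lexicon_embedding(word, dim_per_class=2):
    """Knowledge-based embedding: words of the same emotion class share a
    class-centroid vector, giving high within-class similarity and high
    between-class difference (orthogonal class blocks)."""
    vec = [0.0] * (len(CLASSES) * dim_per_class)
    if word in EMOTION_LEXICON:
        idx = CLASSES.index(EMOTION_LEXICON[word])
        for i in range(idx * dim_per_class, (idx + 1) * dim_per_class):
            vec[i] = 1.0
    return vec

def fuse(contextual_vec, word):
    """Combine a context-rich embedding (e.g. from pre-trained BERT) with the
    emotion-discriminative lexicon embedding by simple concatenation."""
    return list(contextual_vec) + lexicon_embedding(word)

# Stand-in for a pre-trained contextual embedding of the word "joyful".
contextual = [0.1] * 8
fused = fuse(contextual, "joyful")
```

In this sketch, same-class words ("joyful", "delighted") receive identical lexicon vectors, while different-class words ("joyful" vs. "furious") receive orthogonal ones, mimicking the within-class similarity and between-class difference the paper aims for.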
Pages: 11
Related papers
50 records in total
  • [1] BERT-ERC: Fine-Tuning BERT Is Enough for Emotion Recognition in Conversation
    Qin, Xiangyu
    Wu, Zhiyu
    Zhang, Tingting
    Li, Yanran
    Luan, Jian
    Wang, Bin
    Wang, Li
    Cui, Jinshi
    THIRTY-SEVENTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 37 NO 11, 2023, : 13492 - 13500
  • [2] Chinese Medical Named Entity Recognition based on Expert Knowledge and Fine-tuning Bert
    Zhang, Bofeng
    Yao, Xiuhong
    Li, Haiyan
    Aini, Mirensha
    2023 IEEE INTERNATIONAL CONFERENCE ON KNOWLEDGE GRAPH, ICKG, 2023, : 84 - 90
  • [3] KGWE: A Knowledge-guided Word Embedding Fine-tuning Model
    Kun, Kong Wei
    Racharak, Teeradaj
    Yiming, Cao
    Cheng, Peng
    Le Nguyen, Minh
    2021 IEEE 33RD INTERNATIONAL CONFERENCE ON TOOLS WITH ARTIFICIAL INTELLIGENCE (ICTAI 2021), 2021, : 1221 - 1225
  • [4] SPEECH RECOGNITION BY SIMPLY FINE-TUNING BERT
    Huang, Wen-Chin
    Wu, Chia-Hua
    Luo, Shang-Bao
    Chen, Kuan-Yu
    Wang, Hsin-Min
    Toda, Tomoki
    2021 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP 2021), 2021, : 7343 - 7347
  • [5] Fine-Tuning BERT Model for Materials Named Entity Recognition
    Zhao, Xintong
    Greenberg, Jane
    An, Yuan
    Hu, Xiaohua Tony
    2021 IEEE INTERNATIONAL CONFERENCE ON BIG DATA (BIG DATA), 2021, : 3717 - 3720
  • [6] Emotion detection in psychological texts by fine-tuning BERT using emotion–cause pair extraction
    Kumar, A.
    Jain, A. K.
    International Journal of Speech Technology, 2022, 25 (03) : 727 - 743
  • [7] Jointly Fine-Tuning "BERT-like" Self Supervised Models to Improve Multimodal Speech Emotion Recognition
    Siriwardhana, Shamane
    Reis, Andrew
    Weerasekera, Rivindu
    Nanayakkara, Suranga
    INTERSPEECH 2020, 2020, : 3755 - 3759
  • [8] Emotion knowledge-based fine-grained facial expression recognition
    Zhu, Jiacheng
    Ding, Yu
    Liu, Hanwei
    Chen, Keyu
    Lin, Zhanpeng
    Hong, Wenxing
    NEUROCOMPUTING, 2024, 610
  • [9] BITS Pilani at SemEval-2024 Task 10: Fine-tuning BERT and Llama 2 for Emotion Recognition in Conversation
    Venkatesh, Dilip
    Prasanjith, Pasunti
    Sharma, Yashvardhan
    PROCEEDINGS OF THE 18TH INTERNATIONAL WORKSHOP ON SEMANTIC EVALUATION, SEMEVAL-2024, 2024, : 811 - 815
  • [10] Transfer fine-tuning of BERT with phrasal paraphrases
    Arase, Yuki
    Tsujii, Junichi
    COMPUTER SPEECH AND LANGUAGE, 2021, 66