Knowledge-based BERT word embedding fine-tuning for emotion recognition

Cited by: 7
|
Authors
Zhu, Zixiao [1 ]
Mao, Kezhi [2 ]
Affiliations
[1] Nanyang Technol Univ, Inst Catastrophe Risk Management, Interdisciplinary Grad Programme, Singapore 639798, Singapore
[2] Nanyang Technol Univ, Sch Elect & Elect Engn, Singapore 639798, Singapore
Funding
National Research Foundation, Singapore;
Keywords
Emotion recognition; Word embedding fine-tuning; BERT;
DOI
10.1016/j.neucom.2023.126488
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Emotion recognition has received considerable attention in recent years with the growing popularity of social media. It is noted, however, that state-of-the-art language models such as Bidirectional Encoder Representations from Transformers (BERT) may not produce the best performance in emotion recognition. We found that the main cause of the problem is that the embeddings of emotional words from the pre-trained BERT model may not exhibit high between-class difference and within-class similarity. While fine-tuning the BERT model is common practice when applying it to specific tasks, this may not be practical in emotion recognition because most datasets are small and some texts are short and noisy, containing little useful contextual information. In this paper, we propose to use the knowledge of an emotion vocabulary to fine-tune the embeddings of emotional words. As a separate module independent of the embedding learning model, the fine-tuning model aims to produce emotional word embeddings with improved within-class similarity and between-class difference. By combining the emotionally discriminative fine-tuned embeddings with the contextual information-rich embeddings from the pre-trained BERT model, the emotional features underlying the texts can be captured more effectively in the subsequent feature learning module, which in turn leads to improved emotion recognition performance. The knowledge-based word embedding fine-tuning model is tested on five emotion recognition datasets, and the results and analysis demonstrate the effectiveness of the proposed method. © 2023 Elsevier B.V. All rights reserved.
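The abstract's central objective — nudging emotional word embeddings toward higher within-class similarity and between-class difference, then combining them with the context-rich pre-trained embeddings — can be illustrated with a toy sketch. The following is a minimal, hypothetical NumPy illustration only: the random vectors, two-class lexicon labels, and centroid pull/push update are invented for the example and are not the paper's actual model or training objective.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for pre-trained embeddings of an emotion vocabulary
# (hypothetical data; in practice these would come from BERT).
dim = 16
labels = np.array([0, 0, 0, 1, 1, 1])   # e.g. 0 = joy, 1 = anger
emb = rng.normal(size=(6, dim))

def class_sims(Z, labels):
    """Mean cosine similarity within and between emotion classes."""
    Zn = Z / np.linalg.norm(Z, axis=1, keepdims=True)
    S = Zn @ Zn.T
    same = labels[:, None] == labels[None, :]
    off_diag = ~np.eye(len(labels), dtype=bool)
    return S[same & off_diag].mean(), S[~same].mean()

# Crude centroid-based "fine-tuning": pull each word toward its own
# emotion-class centroid and push it away from the other class's centroid.
Z = emb.copy()
lr = 0.05
for _ in range(100):
    for c in np.unique(labels):
        idx = labels == c
        mu_same = Z[idx].mean(axis=0)
        mu_other = Z[~idx].mean(axis=0)
        Z[idx] += lr * (mu_same - Z[idx]) + 0.5 * lr * (Z[idx] - mu_other)

within_before, between_before = class_sims(emb, labels)
within_after, between_after = class_sims(Z, labels)

# Combine the emotionally discriminative fine-tuned embeddings with the
# original context-rich embeddings, here simply by concatenation.
combined = np.concatenate([emb, Z], axis=1)
```

After the update loop, the fine-tuned vectors show higher within-class and lower between-class cosine similarity than the originals, mirroring the property the abstract attributes to the proposed fine-tuning module; the concatenation step stands in for the combination with pre-trained BERT embeddings fed to the downstream feature learner.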
Pages: 11
Related Papers
50 records
  • [31] Robust and Consistent Estimation of Word Embedding for Bangla Language by fine-tuning Word2Vec Model
    Rahman, Rifat
    2020 23RD INTERNATIONAL CONFERENCE ON COMPUTER AND INFORMATION TECHNOLOGY (ICCIT 2020), 2020,
  • [32] Hierarchical BERT with an adaptive fine-tuning strategy for document classification
    Kong, Jun
    Wang, Jin
    Zhang, Xuejie
    KNOWLEDGE-BASED SYSTEMS, 2022, 238
  • [33] Boosting generalization of fine-tuning BERT for fake news detection
    Qin, Simeng
    Zhang, Mingli
    INFORMATION PROCESSING & MANAGEMENT, 2024, 61 (04)
  • [34] Fine-Tuning BERT on Twitter and Reddit Data in Luganda and English
    Kimera, Richard
    Rim, Daniela N.
    Choi, Heeyoul
    PROCEEDINGS OF 2023 7TH INTERNATIONAL CONFERENCE ON NATURAL LANGUAGE PROCESSING AND INFORMATION RETRIEVAL, NLPIR 2023, 2023, : 63 - 70
  • [35] Dual-Objective Fine-Tuning of BERT for Entity Matching
    Peeters, Ralph
    Bizer, Christian
    PROCEEDINGS OF THE VLDB ENDOWMENT, 2021, 14 (10): : 1913 - 1921
  • [36] Universal Image Embedding: Retaining and Expanding Knowledge With Multi-Domain Fine-Tuning
    Gkelios, Socratis
    Kastellos, Anestis
    Boutalis, Yiannis S.
    Chatzichristofis, Savvas A.
    IEEE ACCESS, 2023, 11 : 38208 - 38217
  • [37] Enhancing Multimodal Emotion Recognition through ASR Error Compensation and LLM Fine-Tuning
    Kyung, Jehyun
    Heo, Serin
    Chang, Joon-Hyuk
    INTERSPEECH 2024, 2024, : 4683 - 4687
  • [38] Investigation of BERT Model on Biomedical Relation Extraction Based on Revised Fine-tuning Mechanism
    Su, Peng
    Vijay-Shanker, K.
    2020 IEEE INTERNATIONAL CONFERENCE ON BIOINFORMATICS AND BIOMEDICINE, 2020, : 2522 - 2529
  • [39] Fine-Tuning Word Embeddings for Aspect-Based Sentiment Analysis
    Duc-Hong Pham
    Thi-Thanh-Tan Nguyen
    Anh-Cuong Le
    TEXT, SPEECH, AND DIALOGUE, TSD 2017, 2017, 10415 : 500 - 508
  • [40] Multi-phase Fine-Tuning: A New Fine-Tuning Approach for Sign Language Recognition
    Sarhan, Noha
    Lauri, Mikko
    Frintrop, Simone
    KUNSTLICHE INTELLIGENZ, 2022, 36 (01): : 91 - 98