Knowledge-based BERT word embedding fine-tuning for emotion recognition

Cited by: 7
Authors
Zhu, Zixiao [1]
Mao, Kezhi [2 ]
Affiliations
[1] Nanyang Technol Univ, Inst Catastrophe Risk Management, Interdisciplinary Grad Programme, Singapore 639798, Singapore
[2] Nanyang Technol Univ, Sch Elect & Elect Engn, Singapore 639798, Singapore
Funding
National Research Foundation, Singapore
Keywords
Emotion recognition; Word embedding fine-tuning; BERT
DOI
10.1016/j.neucom.2023.126488
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Emotion recognition has received considerable attention in recent years with the growing popularity of social media. It is noted, however, that state-of-the-art language models such as Bidirectional Encoder Representations from Transformers (BERT) may not produce the best performance in emotion recognition. We found that the main cause of the problem is that the embeddings of emotional words from the pre-trained BERT model may not exhibit high between-class difference and within-class similarity. While fine-tuning the BERT model is common practice when it is applied to specific tasks, this may not be practical in emotion recognition because most datasets are small and some texts are short and noisy, containing little useful contextual information. In this paper, we propose to use the knowledge of an emotion vocabulary to fine-tune the embeddings of emotional words. As a separate module independent of the embedding learning model, the fine-tuning model aims to produce emotional word embeddings with improved within-class similarity and between-class difference. By combining the emotionally discriminative fine-tuned embeddings with the contextual information-rich embeddings from the pre-trained BERT model, the emotional features underlying the texts can be captured more effectively in the subsequent feature learning module, which in turn leads to improved emotion recognition performance. The knowledge-based word embedding fine-tuning model is tested on five emotion recognition datasets, and the results and analysis demonstrate the effectiveness of the proposed method. © 2023 Elsevier B.V. All rights reserved.
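The abstract describes the approach only at a high level. The following PyTorch sketch is an illustrative assumption of how the two stated ideas could be realized, not the authors' code: lexicon words are nudged toward their emotion-class centroid (raising within-class similarity) while class centroids are pushed apart (raising between-class difference), and the refined embedding is then concatenated with a contextual embedding before feature learning. The lexicon, the helper refine_emotion_embeddings, and all hyperparameters are hypothetical stand-ins.

```python
# Minimal sketch (assumptions): knowledge-based refinement of emotional word
# embeddings plus combination with a contextual embedding. Not the paper's code.
import torch
import torch.nn.functional as F

# Hypothetical emotion lexicon: word -> emotion class index.
emotion_lexicon = {"joyful": 0, "delighted": 0, "furious": 1, "irritated": 1,
                   "terrified": 2, "anxious": 2}

def refine_emotion_embeddings(word_emb, lexicon, epochs=100, lr=0.05, margin=2.0):
    """Pull each lexicon word toward its class centroid (within-class similarity)
    and push different class centroids apart (between-class difference)."""
    emb = {w: e.clone().requires_grad_(True) for w, e in word_emb.items()}
    opt = torch.optim.Adam(emb.values(), lr=lr)
    classes = sorted(set(lexicon.values()))
    for _ in range(epochs):
        opt.zero_grad()
        centroids = {c: torch.stack([emb[w] for w, cl in lexicon.items() if cl == c]).mean(0)
                     for c in classes}
        # Within-class term: cosine distance of each word to its own centroid.
        intra = sum(1 - F.cosine_similarity(emb[w], centroids[c], dim=0)
                    for w, c in lexicon.items())
        # Between-class term: hinge loss on pairwise centroid distances.
        inter = sum(F.relu(margin - torch.dist(centroids[a], centroids[b]))
                    for i, a in enumerate(classes) for b in classes[i + 1:])
        (intra + inter).backward()
        opt.step()
    return {w: e.detach() for w, e in emb.items()}

# Toy static vectors standing in for pre-trained BERT input embeddings.
dim = 32
word_emb = {w: torch.randn(dim) for w in emotion_lexicon}
refined = refine_emotion_embeddings(word_emb, emotion_lexicon)

# Combine the emotion-discriminative embedding with a contextual embedding of
# the same token (here a random placeholder) before the feature learning module.
contextual = torch.randn(dim)
combined = torch.cat([refined["joyful"], contextual])
print(combined.shape)  # torch.Size([64])
```

In the paper's setting the contextual vector would come from the pre-trained BERT encoder rather than a random placeholder; the sketch only illustrates how lexicon knowledge can sharpen emotional word embeddings and how the two embedding streams are combined.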
Pages: 11
Related Papers
50 records in total
  • [21] IsoBN: Fine-Tuning BERT with Isotropic Batch Normalization
    Zhou, Wenxuan
    Lin, Bill Yuchen
    Ren, Xiang
    THIRTY-FIFTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THIRTY-THIRD CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE AND THE ELEVENTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2021, 35 : 14621 - 14629
  • [22] A Closer Look at How Fine-tuning Changes BERT
    Zhou, Yichu
    Srikumar, Vivek
    PROCEEDINGS OF THE 60TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2022), VOL 1: (LONG PAPERS), 2022, : 1046 - 1061
  • [23] Automated Intention Mining with Comparatively Fine-tuning BERT
    Sun, Xuan
    Li, Luqun
    Mercaldo, Francesco
    Yang, Yichen
    Santone, Antonella
    Martinelli, Fabio
    2021 5TH INTERNATIONAL CONFERENCE ON NATURAL LANGUAGE PROCESSING AND INFORMATION RETRIEVAL, NLPIR 2021, 2021, : 157 - 162
  • [24] Interdisciplinary knowledge-based implicit emotion recognition
    Jiang, Jialin
    Wang, Xinzhi
    Luo, Xiangfeng
    CONCURRENCY AND COMPUTATION-PRACTICE & EXPERIENCE, 2020, 32 (22):
  • [25] Improving Speech Emotion Recognition via Fine-tuning ASR with Speaker Information
    Ta, Bao Thang
    Nguyen, Tung Lam
    Dang, Dinh Son
    Le, Nhat Minh
    Do, Van Hai
    PROCEEDINGS OF 2022 ASIA-PACIFIC SIGNAL AND INFORMATION PROCESSING ASSOCIATION ANNUAL SUMMIT AND CONFERENCE (APSIPA ASC), 2022, : 1596 - 1601
  • [26] Research Paper Classification and Recommendation System based-on Fine-Tuning BERT
    Biswas, Dipto
    Gil, Joon-Min
    2023 IEEE 24TH INTERNATIONAL CONFERENCE ON INFORMATION REUSE AND INTEGRATION FOR DATA SCIENCE, IRI, 2023, : 295 - 296
  • [28] A GA-Based Approach to Fine-Tuning BERT for Hate Speech Detection
    Madukwe, Kosisochukwu Judith
    Gao, Xiaoying
    Xue, Bing
    2020 IEEE SYMPOSIUM SERIES ON COMPUTATIONAL INTELLIGENCE (SSCI), 2020, : 2821 - 2828
  • [29] Transfer Learning for Sentiment Analysis Using BERT Based Supervised Fine-Tuning
    Prottasha, Nusrat Jahan
    Sami, Abdullah As
    Kowsher, Md
    Murad, Saydul Akbar
    Bairagi, Anupam Kumar
    Masud, Mehedi
    Baz, Mohammed
    SENSORS, 2022, 22 (11)
  • [30] SelfCCL: Curriculum Contrastive Learning by Transferring Self-Taught Knowledge for Fine-Tuning BERT
    Dehghan, Somaiyeh
    Amasyali, Mehmet Fatih
    APPLIED SCIENCES-BASEL, 2023, 13 (03):