eMLM: A New Pre-training Objective for Emotion Related Tasks

Cited by: 0
Authors
Sosea, Tiberiu [1 ]
Caragea, Cornelia [1 ]
Affiliations
[1] Univ Illinois, Comp Sci, Chicago, IL 60680 USA
Keywords
(none listed)
DOI
(not available)
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Bidirectional Encoder Representations from Transformers (BERT) have been shown to be extremely effective on a wide variety of natural language processing tasks, including sentiment analysis and emotion detection. However, the proposed pre-training objectives of BERT do not induce any sentiment- or emotion-specific biases into the model. In this paper, we present Emotion Masked Language Modeling (eMLM), a variation of Masked Language Modeling aimed at improving the BERT language representation model for emotion detection and sentiment analysis tasks. Using the same pre-training corpora as the original BERT model, Wikipedia and BookCorpus, our BERT variation improves downstream performance on four emotion detection and sentiment analysis tasks by an average of 1.2% F1. Moreover, our approach shows increased performance in our task-specific robustness tests. We make our code and pre-trained model available at https://github.com/tsosea2/eMLM.
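The abstract does not spell out the masking procedure, but the name suggests biasing BERT's token masking toward emotion-bearing words. Below is a minimal, hypothetical Python sketch of such a lexicon-biased masking step, under the assumption that an emotion lexicon is used; the lexicon, probabilities, and function name are illustrative, not the authors' released implementation (see the linked repository for that).

# Hypothetical sketch: lexicon-biased masking for emotion-aware MLM pre-training.
# The lexicon, probabilities, and function name are illustrative assumptions,
# not the authors' actual implementation.
import random

# Assumed emotion lexicon (a resource such as NRC EmoLex could be plugged in instead).
EMOTION_LEXICON = {"happy", "sad", "angry", "afraid", "joy", "disgust", "love"}

def emotion_masked_tokens(tokens, mask_token="[MASK]",
                          emotion_mask_prob=0.5, base_mask_prob=0.15):
    """Mask emotion-lexicon tokens with a higher probability than other tokens."""
    masked, labels = [], []
    for tok in tokens:
        p = emotion_mask_prob if tok.lower() in EMOTION_LEXICON else base_mask_prob
        if random.random() < p:
            masked.append(mask_token)
            labels.append(tok)    # original token becomes the MLM prediction target
        else:
            masked.append(tok)
            labels.append(None)   # position ignored by the MLM loss
    return masked, labels

if __name__ == "__main__":
    print(emotion_masked_tokens("i am so happy about this news".split()))

In this sketch, words found in the assumed emotion lexicon are masked more aggressively than ordinary tokens, so the model is pushed to predict emotion-bearing words from context during pre-training.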
Pages: 286-293
Number of pages: 8
Related Papers
50 items in total
  • [31] THE PRE-TRAINING SELECTION OF TEACHERS
    Barr, A. S.
    Douglas, Lois
    JOURNAL OF EDUCATIONAL RESEARCH, 1934, 28 (02): 92 - 117
  • [32] Improving Fractal Pre-training
    Anderson, Connor
    Farrell, Ryan
    2022 IEEE WINTER CONFERENCE ON APPLICATIONS OF COMPUTER VISION (WACV 2022), 2022, : 2412 - 2421
  • [33] Pre-training phenotyping classifiers
    Dligach, Dmitriy
    Afshar, Majid
    Miller, Timothy
    JOURNAL OF BIOMEDICAL INFORMATICS, 2021, 113 (113)
  • [34] Rethinking Pre-training and Self-training
    Zoph, Barret
    Ghiasi, Golnaz
    Lin, Tsung-Yi
    Cui, Yin
    Liu, Hanxiao
    Cubuk, Ekin D.
    Le, Quoc V.
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 33, NEURIPS 2020, 2020, 33
  • [35] Genetic algorithm introducing the estimation of optimal objective function values in subproblems by pre-training
    Iima, Hitoshi
    Hazama, Yohei
    NEURAL COMPUTING & APPLICATIONS, 2023,
  • [36] Radiological Reports Improve Pre-training for Localized Imaging Tasks on Chest X-Rays
    Mueller, Philip
    Kaissis, Georgios
    Zou, Congyu
    Rueckert, Daniel
    MEDICAL IMAGE COMPUTING AND COMPUTER ASSISTED INTERVENTION, MICCAI 2022, PT V, 2022, 13435 : 647 - 657
  • [37] Pre-training Tasks for User Intent Detection and Embedding Retrieval in E-commerce Search
    Qiu, Yiming
    Zhao, Chenyu
    Zhang, Han
    Zhuo, Jingwei
    Li, Tianhao
    Zhang, Xiaowei
    Wang, Songlin
    Xu, Sulong
    Long, Bo
    Yang, Wen-Yun
    PROCEEDINGS OF THE 31ST ACM INTERNATIONAL CONFERENCE ON INFORMATION AND KNOWLEDGE MANAGEMENT, CIKM 2022, 2022, : 4424 - 4428
  • [38] Exploring complementary information of self-supervised pretext tasks for unsupervised video pre-training
    Zhou, Wei
    Hou, Yi
    Ouyang, Kewei
    Zhou, Shilin
    IET COMPUTER VISION, 2022, 16 (03) : 255 - 265
  • [39] Emotion-Aware Multimodal Pre-training for Image-Grounded Emotional Response Generation
    Tian, Zhiliang
    Wen, Zhihua
    Wu, Zhenghao
    Song, Yiping
    Tang, Jintao
    Li, Dongsheng
    Zhang, Nevin L.
    DATABASE SYSTEMS FOR ADVANCED APPLICATIONS, DASFAA 2022, PT III, 2022, : 3 - 19
  • [40] Masked self-supervised pre-training model for EEG-based emotion recognition
    Hu, Xinrong
    Chen, Yu
    Yan, Jinlin
    Wu, Yuan
    Ding, Lei
    Xu, Jin
    Cheng, Jun
    COMPUTATIONAL INTELLIGENCE, 2024, 40 (03)