Exploring transformer models for sentiment classification: A comparison of BERT, RoBERTa, ALBERT, DistilBERT, and XLNet

Cited: 0
Authors
Areshey, Ali [1 ,2 ]
Mathkour, Hassan [2 ]
Affiliations
[1] King Abdulaziz City Sci & Technol, Artificial Intelligence & Robot Inst, Riyadh, Saudi Arabia
[2] King Saud Univ, Coll Comp & Informat Sci, Dept Comp Sci, Riyadh, Saudi Arabia
Keywords
ALBERT; BERT; DistilBERT; pre-training; RoBERTa; sentiment analysis; transfer learning; XLNet
DOI
10.1111/exsy.13701
CLC Number
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Transfer learning models have proven superior to classical machine learning approaches in various text classification tasks, such as sentiment analysis, question answering, news categorization, and natural language inference. Recently, these models have shown exceptional results in natural language understanding (NLU). Advanced attention-based language models like BERT and XLNet excel at handling complex tasks across diverse contexts. However, they encounter difficulties when applied to specific domains. Platforms like Facebook, characterized by continually evolving casual and sophisticated language, demand meticulous context analysis even from human users. The literature has proposed numerous solutions using statistical and machine learning techniques to predict the sentiment (positive or negative) of online customer reviews, but most of them rely on various business, review, and reviewer features, which leads to generalizability issues. Furthermore, there have been very few studies investigating the effectiveness of state-of-the-art pre-trained language models for sentiment classification in reviews. Therefore, this study aims to assess the effectiveness of BERT, RoBERTa, ALBERT, DistilBERT, and XLNet in sentiment classification using the Yelp reviews dataset. The models were fine-tuned, and the results obtained with the same hyperparameters are as follows: 98.30 for RoBERTa, 98.20 for XLNet, 97.40 for BERT, 97.20 for ALBERT, and 96.00 for DistilBERT.
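The abstract describes fine-tuning five pre-trained checkpoints with the same hyperparameters on Yelp reviews. Below is a minimal sketch of that kind of setup using the Hugging Face transformers and datasets libraries; the checkpoint name, the yelp_polarity dataset, and every hyperparameter value are illustrative assumptions, not the authors' reported configuration.

```python
# Minimal sketch (assumed setup, not the paper's exact pipeline): fine-tune one
# of the compared checkpoints on binary Yelp sentiment with Hugging Face.
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

MODEL_NAME = "roberta-base"  # swap in "bert-base-uncased", "albert-base-v2",
                             # "distilbert-base-uncased", or "xlnet-base-cased"

dataset = load_dataset("yelp_polarity")  # positive/negative Yelp reviews
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)

def tokenize(batch):
    # Truncate/pad each review to a fixed length so every model sees comparable inputs.
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=256)

encoded = dataset.map(tokenize, batched=True)

model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=2)

# Holding these values fixed across all five checkpoints mirrors the paper's
# "same hyperparameters" comparison; the specific numbers here are assumed.
args = TrainingArguments(
    output_dir="sentiment-finetune",
    per_device_train_batch_size=16,
    learning_rate=2e-5,
    num_train_epochs=3,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=encoded["train"],
    eval_dataset=encoded["test"],
)
trainer.train()
print(trainer.evaluate())
```

Because all five architectures expose the same AutoModelForSequenceClassification interface, this style of comparison reduces to changing MODEL_NAME while keeping the rest of the script constant.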
Pages: 27
Related Papers
23 records in total
  • [1] Analyzing the Performance of Sentiment Analysis using BERT, DistilBERT, and RoBERTa
    Joshy, Archa
    Sundar, Sumod
    [J]. 2022 IEEE INTERNATIONAL POWER AND RENEWABLE ENERGY CONFERENCE (IPRECON), 2022
  • [2] COMPARATIVE ANALYSES OF BERT, ROBERTA, DISTILBERT, AND XLNET FOR TEXT-BASED EMOTION RECOGNITION
    Adoma, Acheampong Francisca
    Henry, Nunoo-Mensah
    Chen, Wenyu
    [J]. 2020 17TH INTERNATIONAL COMPUTER CONFERENCE ON WAVELET ACTIVE MEDIA TECHNOLOGY AND INFORMATION PROCESSING (ICCWAMTIP), 2020: 117 - 121
  • [3] Beyond BERT: Exploring the Efficacy of RoBERTa and ALBERT in Supervised Multiclass Text Classification
    Sy, Christian Y.
    Maceda, Lany L.
    Canon, Mary Joy P.
    Flores, Nancy M.
    [J]. INTERNATIONAL JOURNAL OF ADVANCED COMPUTER SCIENCE AND APPLICATIONS, 2024, 15 (03) : 223 - 233
  • [4] Leveraging Transfer learning techniques-BERT, RoBERTa, ALBERT and DistilBERT for Fake Review Detection
    Gupta, Priyanka
    Gandhi, Shriya
    Chakravarthi, Bharathi Raja
    [J]. FIRE 2021: PROCEEDINGS OF THE 13TH ANNUAL MEETING OF THE FORUM FOR INFORMATION RETRIEVAL EVALUATION, 2021: 75 - 82
  • [5] A Comparative Sentiment Analysis of Greek Clinical Conversations Using BERT, RoBERTa, GPT-2, and XLNet
    Chatzimina, Maria Evangelia
    Papadaki, Helen A.
    Pontikoglou, Charalampos
    Tsiknakis, Manolis
    [J]. BIOENGINEERING-BASEL, 2024, 11 (06)
  • [6] Enhancing Misinformation Detection in Spanish Language with Deep Learning: BERT and RoBERTa Transformer Models
    Blanco-Fernández, Yolanda
    Otero-Vizoso, Javier
    Gil-Solla, Alberto
    García-Duque, Jorge
    [J]. APPLIED SCIENCES (SWITZERLAND), 2024, 14 (21)
  • [7] Efficient Transformer Based Sentiment Classification Models
    Mathew, L.
    Bindu, V. R.
    [J]. INFORMATICA (SLOVENIA), 2022, 46 (08): 175 - 184
  • [8] Sentiment analysis classification system using hybrid BERT models
    Talaat, Amira Samy
    [J]. JOURNAL OF BIG DATA, 2023, 10 (01)
  • [10] Comparison of SVM and Naive Bayes for Sentiment Classification using BERT data
    Rana, Shivani
    Kanji, Rakesh
    Jain, Shruti
    [J]. 2022 5TH INTERNATIONAL CONFERENCE ON MULTIMEDIA, SIGNAL PROCESSING AND COMMUNICATION TECHNOLOGIES (IMPACT), 2022