A Survey of Sentiment Analysis Based on Pretrained Language Models

Cited by: 5
Authors
Sun, Kaili [1 ]
Luo, Xudong [1 ]
Luo, Michael Y. [2 ]
Affiliations
[1] Guangxi Normal Univ, Sch Comp Sci & Engn, Guangxi Key Lab Multisource Informat Min & Secur, Guilin, Peoples R China
[2] Univ Cambridge, Emmanuel Coll, Cambridge, England
Keywords
Pretrained language model; Sentiment analysis; Cross language; BERT
DOI
10.1109/ICTAI56018.2022.00188
CLC Classification Number
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Pretrained Language Models (PLMs) can be applied to downstream tasks with only fine-tuning, without training a model from scratch. In particular, PLMs have been applied to Sentiment Analysis (SA), which detects, analyses, and extracts the polarity of the sentiment expressed in texts. To help researchers quickly grasp the state-of-the-art in PLM-based SA, this paper surveys PLM-based methods for monolingual and cross-lingual SA. Specifically, we briefly review these methods, compare their performance, and point out challenges for future research.
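The abstract describes applying fine-tuned PLMs such as BERT to sentiment polarity detection. As a minimal sketch of what such a pipeline typically looks like in practice (assuming the Hugging Face transformers library and the public distilbert-base-uncased-finetuned-sst-2-english checkpoint, neither of which is specified by the paper itself):

```python
# Minimal sketch: sentiment polarity inference with a fine-tuned PLM.
# Assumes the Hugging Face `transformers` library; the checkpoint below is an
# illustrative public model, not one evaluated in the surveyed paper.
from transformers import pipeline

classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

texts = [
    "The battery life of this phone is outstanding.",
    "The plot was predictable and the acting felt flat.",
]

for text, result in zip(texts, classifier(texts)):
    # Each result is a dict such as {"label": "POSITIVE", "score": 0.999}.
    print(f"{result['label']:8s} ({result['score']:.3f})  {text}")
```

Cross-lingual SA, as surveyed in the paper, follows the same pattern but swaps in a multilingual checkpoint or a model fine-tuned on target-language data.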
Pages: 1239-1244
Number of pages: 6
Related Papers
50 records in total
  • [1] A Survey of Pretrained Language Models
    Sun, Kaili
    Luo, Xudong
    Luo, Michael Y.
    KNOWLEDGE SCIENCE, ENGINEERING AND MANAGEMENT, PT II, 2022, 13369 : 442 - 456
  • [2] Dense Text Retrieval Based on Pretrained Language Models: A Survey
    Zhao, Wayne Xin
    Liu, Jing
    Ren, Ruiyang
    Wen, Ji-Rong
    ACM TRANSACTIONS ON INFORMATION SYSTEMS, 2024, 42 (04)
  • [3] FinSoSent: Advancing Financial Market Sentiment Analysis through Pretrained Large Language Models
    Delgadillo, Josiel
    Kinyua, Johnson
    Mutigwe, Charles
    BIG DATA AND COGNITIVE COMPUTING, 2024, 8 (08)
  • [4] Pretrained Language Models for Text Generation: A Survey
    Li, Junyi
    Tang, Tianyi
    Zhao, Wayne Xin
    Wen, Ji-Rong
    PROCEEDINGS OF THE THIRTIETH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, IJCAI 2021, 2021, : 4492 - 4499
  • [5] AMMU: A survey of transformer-based biomedical pretrained language models
    Kalyan, Katikapalli Subramanyam
    Rajasekharan, Ajit
    Sangeetha, Sivanesan
    JOURNAL OF BIOMEDICAL INFORMATICS, 2022, 126
  • [6] Vietnamese Sentiment Analysis: An Overview and Comparative Study of Fine-tuning Pretrained Language Models
    Dang Van Thin
    Duong Ngoc Hao
    Ngan Luu-Thuy Nguyen
    ACM TRANSACTIONS ON ASIAN AND LOW-RESOURCE LANGUAGE INFORMATION PROCESSING, 2023, 22 (06)
  • [7] A Survey on Model Compression and Acceleration for Pretrained Language Models
    Xu, Canwen
    McAuley, Julian
    THIRTY-SEVENTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 37 NO 9, 2023, : 10566 - 10575
  • [8] Sentiment and emotion analysis using pretrained deep learning models
    Davidson Kwamivi Aidam
    Ben-Bright Benuwa
    Stephen Opoku Oppong
    Edward Nwiah
    Journal of Data, Information and Management, 2024, 6 (3): 277 - 295
  • [9] Language Recognition Based on Unsupervised Pretrained Models
    Yu, Haibin
    Zhao, Jing
    Yang, Song
    Wu, Zhongqin
    Nie, Yuting
    Zhang, Wei-Qiang
    INTERSPEECH 2021, 2021, : 3271 - 3275
  • [10] A Survey of Abstractive Text Summarization Utilising Pretrained Language Models
    Syed, Ayesha Ayub
    Gaol, Ford Lumban
    Boediman, Alfred
    Matsuo, Tokuro
    Budiharto, Widodo
    INTELLIGENT INFORMATION AND DATABASE SYSTEMS, ACIIDS 2022, PT I, 2022, 13757 : 532 - 544