Pre-Trained Language Model-Based Deep Learning for Sentiment Classification of Vietnamese Feedback

Times Cited: 0
Authors
Loc, Cu Vinh [1 ]
Viet, Truong Xuan [1 ]
Viet, Tran Hoang [1 ]
Thao, Le Hoang [1 ]
Viet, Nguyen Hoang [1 ]
Affiliations
[1] Can Tho Univ, Software Ctr, Can Tho city, Vietnam
Keywords
Sentiment analysis; PhoBERT; deep learning; text classification; Vietnamese feedback;
DOI
10.1142/S1469026823500165
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In recent years, with the rapid development of the Internet, the need to consult the feedback of previous customers when shopping online has been increasing. Websites have therefore been developed that allow users to share experiences, reviews, comments, and feedback about the services and products of businesses and organizations, and organizations also collect user feedback about their products and services to guide improvements. However, given the large volume of user feedback about a service or product, it is difficult for users, businesses, and organizations to consider it all, so an automatic system for analyzing the sentiment of customer feedback is needed. Recently, the well-known pre-trained language model for Vietnamese, PhoBERT, has achieved high performance compared with other approaches. However, it may not capture local information in the text, such as phrases or fragments. In this paper, we propose a Convolutional Neural Network (CNN) model built on PhoBERT for sentiment classification. The contextualized embeddings from PhoBERT's last four layers are fed into the CNN, which enables the network to extract more local information from the text. In addition, the PhoBERT output is passed to transformer encoder layers in order to exploit self-attention, which makes the model focus on the important parts of the sentiment segments. Experimental results demonstrate that the proposed approach achieves competitive performance compared with existing studies on three public datasets of Vietnamese opinions.
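The abstract describes feeding the stacked embeddings of PhoBERT's last four layers into a CNN to capture local n-gram features. A minimal PyTorch sketch of such a classification head is shown below; the class name, filter counts, and kernel sizes are illustrative assumptions (the paper's exact configuration is not given in this record), and dummy tensors stand in for real PhoBERT outputs (hidden size 768).

```python
import torch
import torch.nn as nn

class PhoBERTCNNHead(nn.Module):
    """CNN classification head over the last four hidden layers of a
    PhoBERT-style encoder (hidden size 768). Hyperparameters here are
    illustrative, not the authors' reported settings."""

    def __init__(self, hidden_size=768, num_filters=128,
                 kernel_sizes=(2, 3, 4), num_classes=3):
        super().__init__()
        # Treat the 4 stacked layers as input channels; each kernel spans
        # k tokens and the full hidden dimension, extracting local
        # phrase-level (n-gram) features from the contextual embeddings.
        self.convs = nn.ModuleList(
            nn.Conv2d(4, num_filters, (k, hidden_size)) for k in kernel_sizes
        )
        self.fc = nn.Linear(num_filters * len(kernel_sizes), num_classes)

    def forward(self, last_four_layers):
        # last_four_layers: (batch, 4, seq_len, hidden_size)
        feats = []
        for conv in self.convs:
            c = torch.relu(conv(last_four_layers)).squeeze(3)   # (B, F, L')
            feats.append(torch.max_pool1d(c, c.size(2)).squeeze(2))  # (B, F)
        return self.fc(torch.cat(feats, dim=1))                 # (B, classes)

# Dummy tensors shaped like PhoBERT hidden states (batch=2, seq_len=16).
x = torch.randn(2, 4, 16, 768)
logits = PhoBERTCNNHead()(x)
print(logits.shape)  # torch.Size([2, 3])
```

In practice the input would come from running PhoBERT with `output_hidden_states=True` and stacking its last four hidden states along a new channel dimension; max-over-time pooling then keeps the strongest local feature per filter, which is the standard TextCNN design the abstract alludes to.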
Pages: 14
Related Papers (items [21]-[30] of 50)
  • [21] Incorporating Dynamic Semantics into Pre-Trained Language Model for Aspect-based Sentiment Analysis
    Zhang, Kai
    Zhang, Kun
    Zhang, Mengdi
    Zhao, Hongke
    Liu, Qi
    Wu, Wei
    Chen, Enhong
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2022), 2022, : 3599 - 3610
  • [22] An Entity-Level Sentiment Analysis of Financial Text Based on Pre-Trained Language Model
    Huang, Zhihong
    Fang, Zhijian
    2020 IEEE 18TH INTERNATIONAL CONFERENCE ON INDUSTRIAL INFORMATICS (INDIN), VOL 1, 2020, : 391 - 396
  • [23] Incorporating emoji sentiment information into a pre-trained language model for Chinese and English sentiment analysis
    Huang, Jiaming
    Li, Xianyong
    Li, Qizhi
    Du, Yajun
    Fan, Yongquan
    Chen, Xiaoliang
    Huang, Dong
    Wang, Shumin
    INTELLIGENT DATA ANALYSIS, 2024, 28 (06) : 1601 - 1625
  • [24] Leveraging Vision-Language Pre-Trained Model and Contrastive Learning for Enhanced Multimodal Sentiment Analysis
    An, Jieyu
    Zainon, Wan Mohd Nazmee Wan
    Ding, Binfen
    INTELLIGENT AUTOMATION AND SOFT COMPUTING, 2023, 37 (02): : 1673 - 1689
  • [25] Grammatical Error Correction by Transferring Learning Based on Pre-Trained Language Model
    Han, M.
    Wang, Y.
    Shanghai Jiaotong Daxue Xuebao/Journal of Shanghai Jiaotong University, 2022, 56 (11): : 1554 - 1560
  • [26] Learning and Evaluating a Differentially Private Pre-trained Language Model
    Hoory, Shlomo
    Feder, Amir
    Tendler, Avichai
    Cohen, Alon
    Erell, Sofia
    Laish, Itay
    Nakhost, Hootan
    Stemmer, Uri
    Benjamini, Ayelet
    Hassidim, Avinatan
    Matias, Yossi
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EMNLP 2021, 2021, : 1178 - 1189
  • [27] Improved White Blood Cells Classification Based on Pre-trained Deep Learning Models
    Mohamed, Ensaf H.
    El-Behaidy, Wessam H.
    Khoriba, Ghada
    Li, Jie
    JOURNAL OF COMMUNICATIONS SOFTWARE AND SYSTEMS, 2020, 16 (01) : 37 - 45
  • [28] Hyperbolic Pre-Trained Language Model
    Chen, Weize
    Han, Xu
    Lin, Yankai
    He, Kaichen
    Xie, Ruobing
    Zhou, Jie
    Liu, Zhiyuan
    Sun, Maosong
    IEEE-ACM TRANSACTIONS ON AUDIO SPEECH AND LANGUAGE PROCESSING, 2024, 32 : 3101 - 3112
  • [29] Classification and Analysis of Pistachio Species with Pre-Trained Deep Learning Models
    Singh, Dilbag
    Taspinar, Yavuz Selim
    Kursun, Ramazan
    Cinar, Ilkay
    Koklu, Murat
    Ozkan, Ilker Ali
    Lee, Heung-No
    ELECTRONICS, 2022, 11 (07)
  • [30] Pre-trained deep learning models for brain MRI image classification
    Krishnapriya, Srigiri
    Karuna, Yepuganti
    FRONTIERS IN HUMAN NEUROSCIENCE, 2023, 17