A BERT-ABiLSTM Hybrid Model-Based Sentiment Analysis Method for Book Review

Cited by: 0
|
Authors
Wang, Peng [1 ]
Xiong, Xiong [2 ]
Affiliations
[1] Shanghai Polytech Univ, Lib, Shanghai 200000, Peoples R China
[2] Zhejiang Intl Grp Co ltd, Hangzhou 310000, Peoples R China
Keywords
Sentiment analysis; BiLSTM; BERT; book reviews; Attention mechanism; NEURAL-NETWORKS;
DOI
10.1142/S0218126624500397
Chinese Library Classification (CLC) number
TP3 [Computing technology, computer technology];
Discipline classification code
0812;
Abstract
To address the low accuracy of current sentiment analysis methods on book review texts, a book review sentiment analysis method based on a BERT-ABiLSTM hybrid model is proposed. First, an overall sentiment analysis framework is constructed by integrating a sentiment lexicon with deep learning methods, and fine-grained sentiment analysis is divided into three stages: topic identification, sentiment identification and thematic sentiment identification. Next, dynamic character-level word vectors carrying contextual information are generated with the bidirectional encoder representations from transformers (BERT) pre-trained language model. The contextual information in the text is then fully learned by a bidirectional long short-term memory (BiLSTM) network. Finally, an attention mechanism highlights the important features and improves the efficiency of resource utilization, enabling accurate analysis of book review sentiment. In an experimental comparison with existing advanced algorithms, the proposed method improves precision, recall and F1 by at least 4.2%, 3.9% and 3.79%, respectively. The results show that the proposed BERT-ABiLSTM outperforms existing models across different metrics, indicating good application prospects in book review analysis and book recommendation.
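The classification path described in the abstract (BERT encoder, BiLSTM, attention pooling, classifier) can be sketched roughly as follows. This is a minimal illustrative sketch, not the authors' implementation: the bert-base-chinese checkpoint, the 128-unit BiLSTM, the additive attention scorer and the three-way label set are all assumptions not stated in this record, and the three-stage fine-grained pipeline is not covered here.

import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer

class BertBiLSTMAttention(nn.Module):
    def __init__(self, bert_name="bert-base-chinese", lstm_hidden=128, num_classes=3):
        super().__init__()
        # BERT supplies dynamic, context-dependent character-level vectors.
        self.bert = BertModel.from_pretrained(bert_name)
        # BiLSTM reads the token sequence in both directions to capture context.
        self.bilstm = nn.LSTM(self.bert.config.hidden_size, lstm_hidden,
                              batch_first=True, bidirectional=True)
        # Simple additive attention scorer over BiLSTM states (assumed form, not from the paper).
        self.attn = nn.Linear(2 * lstm_hidden, 1)
        self.classifier = nn.Linear(2 * lstm_hidden, num_classes)

    def forward(self, input_ids, attention_mask):
        bert_out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        hidden, _ = self.bilstm(bert_out.last_hidden_state)         # (batch, seq, 2*lstm_hidden)
        scores = self.attn(torch.tanh(hidden)).squeeze(-1)          # one score per token
        scores = scores.masked_fill(attention_mask == 0, float("-inf"))
        weights = torch.softmax(scores, dim=-1).unsqueeze(-1)       # attention weights over tokens
        context = (weights * hidden).sum(dim=1)                     # weighted sentence representation
        return self.classifier(context)                             # sentiment logits

# Usage sketch: score a single (hypothetical) Chinese book review.
tokenizer = BertTokenizer.from_pretrained("bert-base-chinese")
model = BertBiLSTMAttention()
batch = tokenizer(["这本书的情节引人入胜,结尾却有些仓促。"],
                  return_tensors="pt", padding=True, truncation=True)
with torch.no_grad():
    probs = model(batch["input_ids"], batch["attention_mask"]).softmax(dim=-1)
print(probs)  # assumed 3-way label set, e.g. negative / neutral / positive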
Pages: 14
Related papers
50 records in total
  • [1] An Automatic Sentiment Analysis Method for Short Texts Based on Transformer-BERT Hybrid Model
    Xiao, Haiyan
    Luo, Linghua
    IEEE ACCESS, 2024, 12 : 93305 - 93317
  • [2] A Commodity Review Sentiment Analysis Based on BERT-CNN Model
    Dong, Junchao
    He, Feijuan
    Guo, Yunchuan
    Zhang, Huibing
    2020 5TH INTERNATIONAL CONFERENCE ON COMPUTER AND COMMUNICATION SYSTEMS (ICCCS 2020), 2020, : 143 - 147
  • [3] A novel framework for aspect based sentiment analysis using a hybrid BERT (HybBERT) model
    Goud, Anushree
    Garg, Bindu
    MULTIMEDIA TOOLS AND APPLICATIONS, 2023,
  • [4] Sentiment recognition and analysis method of official document text based on BERT-SVM model
    Hao, Shule
    Zhang, Peng
    Liu, Sen
    Wang, Yuhang
    NEURAL COMPUTING & APPLICATIONS, 2023, 35 (35) : 24621 - 24632
  • [5] Adaptive Thresholding for Sentiment Analysis Across Online Reviews Based on BERT Model
    Lu, Zijie
    PROCEEDINGS OF INTERNATIONAL CONFERENCE ON MODELING, NATURAL LANGUAGE PROCESSING AND MACHINE LEARNING, CMNM 2024, 2024, : 70 - 75
  • [6] Sentiment analysis of Chinese stock reviews based on BERT model
    Li, Mingzheng
    Chen, Lei
    Zhao, Jing
    Li, Qiang
    APPLIED INTELLIGENCE, 2021, 51 (07) : 5016 - 5024
  • [7] Network Public Opinion Sentiment Analysis based on Bert Model
    Dong, Qian
    Sun, Tingting
    Xu, Yan
    Xu, Xuguang
    Zhong, Mei
    Yan, Kai
    2022 IEEE 10TH INTERNATIONAL CONFERENCE ON INFORMATION, COMMUNICATION AND NETWORKS (ICICN 2022), 2022, : 662 - 666
  • [8] Transformer based Contextual Model for Sentiment Analysis of Customer Reviews: A Fine-tuned BERT
    Durairaj, Ashok Kumar
    Chinnalagu, Anandan
    INTERNATIONAL JOURNAL OF ADVANCED COMPUTER SCIENCE AND APPLICATIONS, 2021, 12 (11) : 474 - 480