Deep Recurrent Neural Model for Multi Domain Sentiment Analysis with Attention Mechanism

Cited by: 0
Authors
Khaled Hamed Alyoubi
Akashdeep Sharma
Affiliations
[1] King Abdulaziz University,Faculty of Computing and Information Technology
[2] UIET
[3] Panjab University Chandigarh
Keywords
Sentiment analysis; Domain attention; LSTM; GRU; RNN
Abstract
Multi-domain sentiment analysis is challenging because the same words can be interpreted differently in different domains. This paper proposes a deep bi-directional Recurrent Neural Network (RNN) based sentiment classification system that employs an attention mechanism for multi-domain classification. The approach derives a domain representation by extracting domain-descriptive features from the text with a bidirectional recurrent network and attention, and feeds this representation, together with the processed text, to the sentiment classifier through common hidden layers. We experiment with several types of recurrent networks and find that implementing the recurrent network with gated recurrent units (GRUs) allows domain-specific feature extraction and feature sharing for classification to be performed simultaneously and effectively. The domain and sentiment modules were evaluated separately, with encouraging results: using a bidirectional GRU in both modules trains quickly and yields higher validation accuracy across all domains considered. The proposed model also performs well on other metrics when compared with similar state-of-the-art approaches.
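Since the abstract describes the architecture only at a high level, the following is a minimal sketch of how such a domain-attention sentiment model could look, assuming a PyTorch implementation; the class name `DomainAttentionSentimentModel`, the layer sizes, and the `domain_head`/`sentiment_head` components are illustrative assumptions, not the authors' exact configuration.

```python
import torch
import torch.nn as nn

class DomainAttentionSentimentModel(nn.Module):
    """Sketch of a bi-directional GRU sentiment classifier with a domain-attention
    branch (hypothetical layer sizes and names, not the paper's exact model)."""

    def __init__(self, vocab_size, embed_dim=100, hidden_dim=128,
                 num_domains=4, num_classes=2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        # Shared bi-directional GRU encoder acting as the common hidden layer.
        self.encoder = nn.GRU(embed_dim, hidden_dim, batch_first=True,
                              bidirectional=True)
        # Attention scores over time steps, used to build an attended representation.
        self.attn = nn.Linear(2 * hidden_dim, 1)
        # Domain module: predicts the domain from the attended representation.
        self.domain_head = nn.Linear(2 * hidden_dim, num_domains)
        # Sentiment module: consumes the attended text plus the domain features.
        self.sentiment_head = nn.Linear(2 * hidden_dim + num_domains, num_classes)

    def forward(self, token_ids):
        x = self.embedding(token_ids)                    # (batch, seq, embed)
        h, _ = self.encoder(x)                           # (batch, seq, 2*hidden)
        weights = torch.softmax(self.attn(h), dim=1)     # attention over time steps
        context = (weights * h).sum(dim=1)               # (batch, 2*hidden)
        domain_logits = self.domain_head(context)
        domain_feats = torch.softmax(domain_logits, dim=-1)
        sentiment_logits = self.sentiment_head(
            torch.cat([context, domain_feats], dim=-1))
        return domain_logits, sentiment_logits

# Usage with dummy data: a batch of 8 token sequences of length 40.
model = DomainAttentionSentimentModel(vocab_size=5000)
tokens = torch.randint(1, 5000, (8, 40))
domain_logits, sentiment_logits = model(tokens)
print(domain_logits.shape, sentiment_logits.shape)  # torch.Size([8, 4]) torch.Size([8, 2])
```

In this sketch the softmax over the domain logits is concatenated with the attended text representation, which is one plausible way to realise feeding the domain representation to the sentiment classifier along with the processed text, as the abstract describes.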
Pages: 43-60 (17 pages)
Related papers (50 in total)
  • [31] Deep Sentiment Analysis System with Attention Mechanism for the COVID-19 Vaccine
    Khalefa, Mustafa S.
    Al-Sulami, Zainab Amin
    Khalid, Eman Thabet
    Abduljabbar, Zaid Ameen
    Nyangaresi, Vincent Omollo
    Sibahee, Mustafa A. Al
    Ma, Junchao
    Abduljaleel, Iman Qays
TEM JOURNAL-TECHNOLOGY EDUCATION MANAGEMENT INFORMATICS, 2024, 13 (02): 1470 - 1480
  • [32] Attention-Based Bidirectional Gated Recurrent Unit Neural Networks for Sentiment Analysis
    Yu, Qing
    Zhao, Hui
    Wang, Zuohua
    2019 2ND INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND PATTERN RECOGNITION (AIPR 2019), 2019, : 116 - 119
  • [33] A Text Sentiment Analysis Model Based on Self-Attention Mechanism
    Ji, Likun
    Gong, Ping
    Yao, Zhuyu
    2019 THE 3RD INTERNATIONAL CONFERENCE ON HIGH PERFORMANCE COMPILATION, COMPUTING AND COMMUNICATIONS (HP3C 2019), 2019, : 33 - 37
  • [34] Integrating an Attention Mechanism and Deep Neural Network for Detection of DGA Domain Names
    Ren, Fangli
    Jiang, Zhengwei
    Liu, Jian
    2019 IEEE 31ST INTERNATIONAL CONFERENCE ON TOOLS WITH ARTIFICIAL INTELLIGENCE (ICTAI 2019), 2019, : 848 - 855
  • [35] Sentiment analysis and research based on two-channel parallel hybrid neural network model with attention mechanism
    Chen, Na
    Sun, Yanqiu
    Yan, Yan
IET CONTROL THEORY AND APPLICATIONS, 2023, 17 (17): 2259 - 2267
  • [36] Multi-head attention model for aspect level sentiment analysis
    Zhang, Xinsheng
    Gao, Teng
    JOURNAL OF INTELLIGENT & FUZZY SYSTEMS, 2020, 38 (01) : 89 - 96
  • [37] Recurrent Attention for Deep Neural Object Detection
    Symeonidis, Georgios
    Tefas, Anastasios
    10TH HELLENIC CONFERENCE ON ARTIFICIAL INTELLIGENCE (SETN 2018), 2018,
  • [38] A Multi-Hop Attention Deep Model for Aspect-Level Sentiment Classification
    Deng Y.
    Lei H.
    Li X.-Y.
    Lin Y.-O.
Dianzi Keji Daxue Xuebao/Journal of the University of Electronic Science and Technology of China, 2019, 48 (05): 759 - 766
  • [39] Sentiment Analysis with An Integrated Model of BERT and Bi-LSTM Based on Multi-Head Attention Mechanism
    Wang, Yahui
    Cheng, Xiaoqing
    Meng, Xuelei
    IAENG International Journal of Computer Science, 2023, 50 (01)
  • [40] Exploiting bi-directional deep neural networks for multi-domain sentiment analysis using capsule network
    Alireza Ghorbanali
    Mohammad Karim Sohrabi
    Multimedia Tools and Applications, 2023, 82 : 22943 - 22960