R-Transformer Network Based on Position and Self-Attention Mechanism for Aspect-Level Sentiment Classification

Cited by: 11
Authors
Zhou, Ziyu [1 ]
Liu, Fang'ai [1 ]
Wang, Qianqian [1 ]
Affiliations
[1] Shandong Normal Univ, Sch Informat Sci & Engn, Jinan 250014, Shandong, Peoples R China
Source
IEEE ACCESS | 2019, Vol. 7
Funding
National Natural Science Foundation of China;
Keywords
Aspect-level sentiment classification; Gaussian kernel; R-transformer; self-attention mechanism; NEURAL-NETWORK;
DOI
10.1109/ACCESS.2019.2938854
Chinese Library Classification
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
Aspect-level sentiment classification (ASC) is a research hotspot in natural language processing that aims to infer the sentiment polarity of a particular aspect in an opinion sentence. Three main factors influence aspect-level sentiment classification: the semantic information of the context, the interaction between the context and the aspect, and the positional relationship between the aspect and the context. Researchers have proposed various approaches to aspect-level sentiment classification. However, previous work mainly used the average vector of the aspect to compute attention scores over the context, which introduces noise from irrelevant words. Moreover, these attention-based approaches simply used relative positions to encode positional information for context and aspect terms, which fails to provide richer semantic information. To address these issues, we propose the PSRTN model. First, it obtains the position-aware influence propagated between context words and the aspect with a Gaussian kernel and generates an influence vector for each context word. Second, it captures global and local information of the context with the R-Transformer and uses a self-attention mechanism to identify the keywords within the aspect. Finally, it generates an aspect-specific context representation for classification. To evaluate the model, we conduct experiments on the SemEval 2014 (Restaurant and Laptop) and Twitter datasets. The results show that the PSRTN model achieves accuracies of 83.8%, 80.9%, and 75.1% on the three datasets, respectively.
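To make the Gaussian-kernel position weighting mentioned in the abstract concrete, the following Python sketch is offered as a minimal illustration, not the authors' implementation: the function name gaussian_position_weights, the bandwidth parameter sigma, and the exact token-distance definition are assumptions. Each context token receives a weight that decays with its distance to the aspect span, and the weights scale the token embeddings before they would be passed to an encoder such as the R-Transformer.

import numpy as np

def gaussian_position_weights(seq_len, aspect_start, aspect_end, sigma=3.0):
    # Weight of each context token w.r.t. the aspect span [aspect_start, aspect_end].
    # Tokens inside the span get distance 0 (weight 1); outside tokens decay as
    # exp(-d^2 / (2 * sigma^2)), where d is the distance to the nearest aspect token.
    weights = np.empty(seq_len)
    for i in range(seq_len):
        if i < aspect_start:
            d = aspect_start - i
        elif i > aspect_end:
            d = i - aspect_end
        else:
            d = 0
        weights[i] = np.exp(-(d ** 2) / (2.0 * sigma ** 2))
    return weights

# Toy usage: a 10-token sentence whose aspect occupies tokens 4-5.
w = gaussian_position_weights(10, 4, 5)
embeddings = np.random.randn(10, 8)        # hypothetical 8-dim word embeddings
position_aware = embeddings * w[:, None]   # scale each token by its positional influence
print(np.round(w, 3))

A larger sigma flattens the decay so that distant context words retain more influence, which is the kind of position-aware behavior the abstract describes; here sigma is simply a free parameter of the sketch.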
Pages: 127754 - 127764
Number of pages: 11
Related Papers
50 records in total
  • [11] Capsule Network with Interactive Attention for Aspect-Level Sentiment Classification
    Du, Chunning
    Sun, Haifeng
    Wang, Jingyu
    Qi, Qi
    Liao, Jianxin
    Xu, Tong
    Liu, Ming
    [J]. 2019 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING AND THE 9TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (EMNLP-IJCNLP 2019): PROCEEDINGS OF THE CONFERENCE, 2019, : 5489 - 5498
  • [12] Aspect-level sentiment classification based on location and hybrid multi attention mechanism
    Yuchen Wu
    Weijiang Li
    [J]. Applied Intelligence, 2022, 52 : 11539 - 11554
  • [13] Dependency Parsing and Attention Network for Aspect-Level Sentiment Classification
    Ouyang, Zhifan
    Su, Jindian
    [J]. NATURAL LANGUAGE PROCESSING AND CHINESE COMPUTING, PT I, 2018, 11108 : 391 - 403
  • [14] Aspect-level sentiment classification based on location and hybrid multi attention mechanism
    Wu, Yuchen
    Li, Weijiang
    [J]. APPLIED INTELLIGENCE, 2022, 52 (10) : 11539 - 11554
  • [15] Position-Enhanced Multi-Head Self-Attention Based Bidirectional Gated Recurrent Unit for Aspect-Level Sentiment Classification
    Li, Xianyong
    Ding, Li
    Du, Yajun
    Fan, Yongquan
    Shen, Fashan
    [J]. FRONTIERS IN PSYCHOLOGY, 2022, 12
  • [16] Transformer-Based BiLSTM for Aspect-Level Sentiment Classification
    Cai, Tao
    Yu, Baocheng
    Xu, Wenxia
    [J]. 2021 4TH INTERNATIONAL CONFERENCE ON ROBOTICS, CONTROL AND AUTOMATION ENGINEERING (RCAE 2021), 2021, : 138 - 142
  • [17] Enhancing Attention-Based LSTM With Position Context for Aspect-Level Sentiment Classification
    Zeng, Jiangfeng
    Ma, Xiao
    Zhou, Ke
    [J]. IEEE ACCESS, 2019, 7 : 20462 - 20471
  • [18] A Position-aware Transformation Network for Aspect-level Sentiment Classification
    Jiang, Tao
    Wang, Jiahai
    Song, Youwei
    Rao, Yanghui
    [J]. 2019 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2019,
  • [19] Structurally Enhanced Interactive Attention Network for Aspect-Level Sentiment Classification
    Liang, Chunfeng
    Fu, Yumeng
    Lv, Chengguo
    [J]. 2020 INTERNATIONAL CONFERENCE ON ASIAN LANGUAGE PROCESSING (IALP 2020), 2020, : 282 - 287
  • [20] Chinese text dual attention network for aspect-level sentiment classification
    Sun, Xinjie
    Liu, Zhifang
    Li, Hui
    Ying, Feng
    Tao, Yu
    [J]. PLOS ONE, 2024, 19 (03):