A Self-attention Based LSTM Network for Text Classification

Cited: 16
Authors: Jing, Ran [1]
Affiliation: [1] Shandong Univ, Sch Control Sci & Engn, Jinan, Peoples R China
DOI: 10.1088/1742-6596/1207/1/012008
Chinese Library Classification (CLC): TP [Automation and Computer Technology]
Discipline Code: 0812
Abstract:
Neural networks have been used to achieve impressive performance in Natural Language Processing (NLP). Among these algorithms, the RNN is a widely used architecture for text classification tasks. The main challenge in sentiment classification is quantifying the connections between context words in a sentence. Even though various model types and structures have been proposed, they suffer from vanishing gradients and are unlikely to show the full potential of the network. In this work, we present a new RNN model based on the self-attention mechanism to improve performance when dealing with long sentences and whole documents. Empirical results show that our model outperforms state-of-the-art algorithms.
Pages: 5
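For illustration only: the record does not give the paper's exact layers or attention formulation, so the following is a minimal PyTorch sketch of the general idea described in the abstract, a bidirectional LSTM whose hidden states are pooled by an additive self-attention layer before a linear classifier. All names and hyperparameters (embedding size, hidden size, number of classes) are assumptions, not the authors' configuration.

```python
# Sketch of a self-attentive BiLSTM text classifier (assumed design, not the paper's exact model).
import torch
import torch.nn as nn
import torch.nn.functional as F


class SelfAttentiveLSTM(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=128, num_classes=2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                            bidirectional=True)
        # Additive self-attention: score each time step against a learned query.
        self.attn_proj = nn.Linear(2 * hidden_dim, 2 * hidden_dim)
        self.attn_query = nn.Linear(2 * hidden_dim, 1, bias=False)
        self.classifier = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) integer tensor; 0 is the padding index.
        mask = (token_ids != 0)                      # (batch, seq_len)
        h, _ = self.lstm(self.embedding(token_ids))  # (batch, seq_len, 2*hidden)
        scores = self.attn_query(torch.tanh(self.attn_proj(h))).squeeze(-1)
        scores = scores.masked_fill(~mask, float('-inf'))
        weights = F.softmax(scores, dim=-1)          # attention over time steps
        context = torch.bmm(weights.unsqueeze(1), h).squeeze(1)  # weighted sum of states
        return self.classifier(context)              # unnormalised class logits


if __name__ == "__main__":
    model = SelfAttentiveLSTM(vocab_size=10000)
    batch = torch.randint(1, 10000, (4, 32))         # 4 sentences, 32 tokens each
    print(model(batch).shape)                        # torch.Size([4, 2])
```

Pooling over all time steps with attention, rather than keeping only the final hidden state, is one common way to retain information from distant tokens in long sentences and documents, which is the setting the abstract targets.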
Related Papers
50 results total
  • [31] Adaptive aggregation with self-attention network for gastrointestinal image classification
    Li, Sheng
    Cao, Jing
    Yao, Jiafeng
    Zhu, Jinhui
    He, Xiongxiong
    Jiang, Qianru
    IET IMAGE PROCESSING, 2022, 16 (09) : 2384 - 2397
  • [32] Web service classification based on self-attention mechanism
    Jia, Zhichun
    Zhang, Zhiying
    Dong, Rui
    Yang, Zhongxuan
    Xing, Xing
    2023 35TH CHINESE CONTROL AND DECISION CONFERENCE, CCDC, 2023, : 2164 - 2169
  • [33] MEMRISTOR-BASED LSTM NETWORK FOR TEXT CLASSIFICATION
    Dou, Gang
    Zhao, Kaixuan
    Guo, Mei
    Mou, Jun
    FRACTALS-COMPLEX GEOMETRY PATTERNS AND SCALING IN NATURE AND SOCIETY, 2023, 31 (06)
  • [34] A Text Sentiment Analysis Model Based on Self-Attention Mechanism
    Ji, Likun
    Gong, Ping
    Yao, Zhuyu
    2019 THE 3RD INTERNATIONAL CONFERENCE ON HIGH PERFORMANCE COMPILATION, COMPUTING AND COMMUNICATIONS (HP3C 2019), 2019, : 33 - 37
  • [35] Electrocardiogram signal classification based on fusion method of residual network and self-attention mechanism
    Yuan C.
    Liu Z.
    Wang C.
    Yang F.
    Shengwu Yixue Gongchengxue Zazhi/Journal of Biomedical Engineering, 2023, 40 (03): : 474 - 481
  • [36] Bidirectional LSTM with self-attention mechanism and multi-channel features for sentiment classification
    Li, Weijiang
    Qi, Fang
    Tang, Ming
    Yu, Zhengtao
    NEUROCOMPUTING, 2020, 387 : 63 - 77
  • [37] A Self-attention Based Model for Offline Handwritten Text Recognition
    Nam Tuan Ly
    Trung Tan Ngo
    Nakagawa, Masaki
    PATTERN RECOGNITION, ACPR 2021, PT II, 2022, 13189 : 356 - 369
  • [38] Improved self-attention generative adversarial adaptation network-based melanoma classification
    Gowthami, S.
    Harikumar, R.
    JOURNAL OF INTELLIGENT & FUZZY SYSTEMS, 2023, 44 (03) : 4113 - 4122
  • [39] The function of the self-attention network
    Cunningham, Sheila J.
    COGNITIVE NEUROSCIENCE, 2016, 7 (1-4) : 21 - 22
  • [40] Multi-label Text Classification Based on BiGRU and Multi-Head Self-Attention Mechanism
    Luo, Tongtong
    Shi, Nan
    Jin, Meilin
    Qin, Aolong
    Tang, Jiacheng
    Wang, Xihan
    Gao, Quanli
    Shao, Lianhe
    2024 3RD INTERNATIONAL CONFERENCE ON IMAGE PROCESSING AND MEDIA COMPUTING, ICIPMC 2024, 2024, : 204 - 210