A Self-attention Based LSTM Network for Text Classification

Cited: 16
Author
Jing, Ran [1 ]
Affiliation
[1] Shandong Univ, Sch Control Sci & Engn, Jinan, Peoples R China
DOI
10.1088/1742-6596/1207/1/012008
CLC (Chinese Library Classification)
TP [automation technology; computer technology]
Discipline code
0812
Abstract
Neural networks have achieved impressive performance in Natural Language Processing (NLP). Among these architectures, the RNN is widely used for text classification tasks. The main challenge in sentiment classification is quantifying the connections between context words in a sentence. Although various model types and structures have been proposed, they suffer from vanishing gradients and are therefore unlikely to realize the full potential of the network. In this work, we present a new RNN model based on the self-attention mechanism that improves performance on long sentences and whole documents. Empirical results show that our model outperforms state-of-the-art algorithms.
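The record itself contains no implementation details beyond the abstract. Purely as an illustration of the approach the abstract describes (an LSTM whose hidden states are pooled by a self-attention layer before classification), a minimal PyTorch sketch is given below; every class name, layer size, and hyperparameter is a hypothetical placeholder, not the paper's actual architecture.

```python
# Hypothetical sketch only -- the record does not specify the paper's model.
# Idea: score each LSTM hidden state with additive self-attention, pool the
# states by their attention weights, and classify the pooled vector.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SelfAttentiveLSTM(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=128, num_classes=2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                            bidirectional=True)
        self.attn = nn.Linear(2 * hidden_dim, 1)         # one score per time step
        self.classifier = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) integer tensor
        h, _ = self.lstm(self.embedding(token_ids))      # (batch, seq_len, 2*hidden)
        weights = F.softmax(self.attn(h).squeeze(-1), dim=-1)    # (batch, seq_len)
        context = torch.bmm(weights.unsqueeze(1), h).squeeze(1)  # weighted sum of states
        return self.classifier(context)                  # (batch, num_classes) logits

model = SelfAttentiveLSTM(vocab_size=10000)
logits = model(torch.randint(1, 10000, (4, 32)))         # 4 sentences of 32 tokens
print(logits.shape)                                      # torch.Size([4, 2])
```

Pooling by attention weights, rather than keeping only the final hidden state, is one plausible way to address the long-sentence and whole-document setting the abstract mentions, since every time step can contribute directly to the classification.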
Pages: 5
Related papers
50 records in total
  • [1] Multiple Positional Self-Attention Network for Text Classification
    Dai, Biyun
    Li, Jinlong
    Xu, Ruoyi
    [J]. THIRTY-FOURTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THE THIRTY-SECOND INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE AND THE TENTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2020, 34 : 7610 - 7617
  • [2] Dual-axial self-attention network for text classification
    Zhang, Xiaochuan
    Qiu, Xipeng
    Pang, Jianmin
    Liu, Fudong
    Li, Xingwei
    [J]. SCIENCE CHINA-INFORMATION SCIENCES, 2021, 64 (12): 80 - 90
  • [3] Deformable Self-Attention for Text Classification
    Ma, Qianli
    Yan, Jiangyue
    Lin, Zhenxi
    Yu, Liuhong
    Chen, Zipeng
    [J]. IEEE-ACM TRANSACTIONS ON AUDIO SPEECH AND LANGUAGE PROCESSING, 2021, 29 : 1570 - 1581
  • [4] Research on a Capsule Network Text Classification Method with a Self-Attention Mechanism
    Yu, Xiaodong
    Luo, Shun-Nain
    Wu, Yujia
    Cai, Zhufei
    Kuan, Ta-Wen
    Tseng, Shih-Pang
    [J]. SYMMETRY-BASEL, 2024, 16 (05)
  • [5] A text classification method based on LSTM and graph attention network
    Wang, Haitao
    Li, Fangbing
    [J]. CONNECTION SCIENCE, 2022, 34 (01) : 2466 - 2480
  • [6] Point cloud classification network based on self-attention mechanism
    Li, Yujie
    Cai, Jintong
    [J]. COMPUTERS & ELECTRICAL ENGINEERING, 2022, 104
  • [7] Multi-Scale Self-Attention for Text Classification
    Guo, Qipeng
    Qiu, Xipeng
    Liu, Pengfei
    Xue, Xiangyang
    Zhang, Zheng
    [J]. THIRTY-FOURTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THE THIRTY-SECOND INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE AND THE TENTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2020, 34 : 7847 - 7854
  • [8] Image Classification based on Self-attention Convolutional Neural Network
    Cai, Xiaohong
    Li, Ming
    Cao, Hui
    Ma, Jingang
    Wang, Xiaoyan
    Zhuang, Xuqiang
    [J]. SIXTH INTERNATIONAL WORKSHOP ON PATTERN RECOGNITION, 2021, 11913