Deep Pyramid Convolutional Neural Network Integrated with Self-attention Mechanism and Highway Network for Text Classification

Cited by: 5
Authors
Li, Xuewei [1]
Ning, Hongyun [1]
Affiliation
[1] Tianjin Univ Technol, Tianjin Key Lab Intelligence Comp & New Software, Tianjin 300384, Peoples R China
Keywords
DOI
10.1088/1742-6596/1642/1/012008
CLC Number
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Text classification is one of the fundamental tasks of natural language processing, and in recent years deep learning, most notably the convolutional neural network, has been widely applied to it. DPCNN is a deep convolutional text classification model that captures long-distance information in text, but it concentrates on extracting global features and neglects local features, which are often important and play a significant role in classification. To address this gap, this paper introduces a self-attention mechanism to extract local features of the text. In addition, although a deep convolutional network can extract deeper features, it is prone to vanishing gradients during training; a highway network is therefore introduced to prevent vanishing gradients and improve model performance. Experimental results show that the proposed model outperforms a plain DPCNN model and further improves text classification accuracy.
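A minimal sketch of how the three components described in the abstract could fit together, not the authors' released implementation: a DPCNN-style region embedding with repeated convolution/pooling blocks for long-distance features, a self-attention layer over the token embeddings for local features, and a highway layer before the classifier to keep a gated shortcut for gradients. All layer sizes (embed_dim, channels, num_blocks, attn_heads), the additive fusion of attention output with embeddings, and the class/module names are assumptions for illustration.

import torch
import torch.nn as nn
import torch.nn.functional as F


class Highway(nn.Module):
    """Highway layer: y = t * H(x) + (1 - t) * x, with a learned gate t."""

    def __init__(self, dim: int):
        super().__init__()
        self.transform = nn.Linear(dim, dim)
        self.gate = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        t = torch.sigmoid(self.gate(x))      # carry/transform gate in (0, 1)
        h = F.relu(self.transform(x))        # nonlinear transform H(x)
        return t * h + (1.0 - t) * x         # gated mix keeps a direct path for gradients


class DPCNNSelfAttentionHighway(nn.Module):
    """Illustrative DPCNN + self-attention + highway text classifier (assumed design)."""

    def __init__(self, vocab_size: int, num_classes: int, embed_dim: int = 128,
                 channels: int = 250, num_blocks: int = 3, attn_heads: int = 4):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        # Self-attention over token embeddings: each token attends to the others,
        # supplying the local interaction features the plain DPCNN lacks.
        self.self_attn = nn.MultiheadAttention(embed_dim, attn_heads, batch_first=True)
        # DPCNN-style region embedding followed by repeated residual conv blocks.
        self.region = nn.Conv1d(embed_dim, channels, kernel_size=3, padding=1)
        self.blocks = nn.ModuleList([
            nn.ModuleList([nn.Conv1d(channels, channels, kernel_size=3, padding=1),
                           nn.Conv1d(channels, channels, kernel_size=3, padding=1)])
            for _ in range(num_blocks)
        ])
        self.highway = Highway(channels)
        self.classifier = nn.Linear(channels, num_classes)

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        emb = self.embedding(tokens)                   # (batch, seq_len, embed_dim)
        attn_out, _ = self.self_attn(emb, emb, emb)    # local feature mixing
        x = (emb + attn_out).transpose(1, 2)           # (batch, embed_dim, seq_len)
        x = self.region(x)                             # (batch, channels, seq_len)
        for conv1, conv2 in self.blocks:
            shortcut = x                               # pre-activation residual block
            x = conv1(F.relu(x))
            x = conv2(F.relu(x)) + shortcut
            if x.size(-1) > 2:                         # stride-2 pooling halves the length,
                x = F.max_pool1d(x, kernel_size=3, stride=2)  # widening the receptive field
        x = x.max(dim=-1).values                       # global max pool -> (batch, channels)
        x = self.highway(x)                            # gated shortcut eases gradient flow
        return self.classifier(x)


if __name__ == "__main__":
    model = DPCNNSelfAttentionHighway(vocab_size=10000, num_classes=4)
    dummy = torch.randint(1, 10000, (8, 64))           # 8 sequences of 64 token ids
    print(model(dummy).shape)                          # torch.Size([8, 4])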
Pages: 6
Related Papers
50 records in total
  • [1] Cai, Xiaohong; Li, Ming; Cao, Hui; Ma, Jingang; Wang, Xiaoyan; Zhuang, Xuqiang. Image Classification based on Self-attention Convolutional Neural Network. SIXTH INTERNATIONAL WORKSHOP ON PATTERN RECOGNITION, 2021, 11913
  • [2] Yu, Xiaodong; Luo, Shun-Nain; Wu, Yujia; Cai, Zhufei; Kuan, Ta-Wen; Tseng, Shih-Pang. Research on a Capsule Network Text Classification Method with a Self-Attention Mechanism. SYMMETRY-BASEL, 2024, 16 (05)
  • [3] Wu, Xin; Cai, Yi; Li, Qing; Xu, Jingyun; Leung, Ho-fung. Combining Contextual Information by Self-attention Mechanism in Convolutional Neural Networks for Text Classification. WEB INFORMATION SYSTEMS ENGINEERING, WISE 2018, PT I, 2018, 11233: 453-467
  • [4] Jing, Ran. A Self-attention Based LSTM Network for Text Classification. 2019 3RD INTERNATIONAL CONFERENCE ON CONTROL ENGINEERING AND ARTIFICIAL INTELLIGENCE (CCEAI 2019), 2019, 1207
  • [5] Dai, Biyun; Li, Jinlong; Xu, Ruoyi. Multiple Positional Self-Attention Network for Text Classification. THIRTY-FOURTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THE THIRTY-SECOND INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE AND THE TENTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2020, 34: 7610-7617
  • [6] Wu, Linfeng; Wang, Huajun. Global and pyramid convolutional neural network with hybrid attention mechanism for hyperspectral image classification. GEOCARTO INTERNATIONAL, 2023, 38 (01)
  • [7] Wang, Jun; Cao, Junxing; Fu, Jingcheng; Xu, Hanqing. Missing well logs prediction using deep learning integrated neural network with the self-attention mechanism. ENERGY, 2022, 261
  • [8] Tran, Quan; Mitra, Joydeep; Nguyen, Nga. Learning model combining convolutional deep neural network with a self-attention mechanism for AC optimal power flow. ELECTRIC POWER SYSTEMS RESEARCH, 2024, 231
  • [9] Zhang, Xiaochuan; Qiu, Xipeng; Pang, Jianmin; Liu, Fudong; Li, Xingwei. Dual-axial self-attention network for text classification. SCIENCE CHINA-INFORMATION SCIENCES, 2021, 64 (12)
  • [10] Jiang, Pengxu; Xie, Yue; Zou, Cairong; Zhao, Li; Wang, Qingyun. An Integrated Convolutional Neural Network with a Fusion Attention Mechanism for Acoustic Scene Classification. IEICE TRANSACTIONS ON FUNDAMENTALS OF ELECTRONICS COMMUNICATIONS AND COMPUTER SCIENCES, 2023, E106A (08): 1057-1061