DCCL: Dual-channel hybrid neural network combined with self-attention for text classification

Cited by: 0
Authors
Li, Chaofan [1,2]
Liu, Qiong [3]
Ma, Kai [4]
Affiliations
[1] Nanjing Med Univ, Yancheng Sch Clin Med, Nanjing 224008, Jiangsu, Peoples R China
[2] Yancheng Third Peoples Hosp, Qual Management Div, Yancheng 224008, Jiangsu, Peoples R China
[3] Jiangsu Vocat Coll Med, Sch Med Imaging, Yancheng 224005, Jiangsu, Peoples R China
[4] Xuzhou Med Univ, Sch Med Informat & Engn, Xuzhou 221004, Jiangsu, Peoples R China
Keywords
text classification; convolutional neural networks; long short-term memory networks; LSTM
DOI
10.3934/mbe.2023091
Chinese Library Classification
Q [Biological Sciences]
Subject classification codes
07; 0710; 09
Abstract
Text classification is a fundamental task in natural language processing. Chinese text classification in particular suffers from sparse text features, ambiguous word segmentation, and weak classification performance. This paper proposes DCCL, a text classification model that combines a self-attention mechanism with a CNN and an LSTM. The model feeds word vectors into a dual-channel neural network. In one channel, multiple CNNs extract N-gram information over different word windows, and their outputs are concatenated to enrich the local feature representation. In the other channel, a BiLSTM extracts contextual semantic associations to produce a sentence-level high-level feature representation, and its output is weighted by self-attention to reduce the influence of noisy features. The outputs of the two channels are concatenated and fed into a softmax layer for classification. In multiple comparison experiments, DCCL obtained F1-scores of 90.07% and 96.26% on the Sogou and THUNews datasets, respectively, improvements of 3.24% and 2.19% over the baseline model. DCCL alleviates the loss of word-order information in CNNs and the gradient problems of BiLSTM when processing text sequences, effectively integrates local and global text features, and highlights key information. Its classification performance is excellent, making it well suited to text classification tasks.
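For illustration, below is a minimal PyTorch sketch of the dual-channel architecture the abstract describes. It is a reconstruction under stated assumptions, not the authors' released code: the embedding size, filter counts, window sizes, hidden size, and the single-layer attention scorer (standing in for the paper's self-attention) are all hypothetical choices.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DCCLSketch(nn.Module):
    """Dual-channel CNN + BiLSTM-with-attention classifier (illustrative).

    All hyperparameters below are assumptions, not the paper's settings.
    """
    def __init__(self, vocab_size, num_classes, embed_dim=300,
                 num_filters=100, windows=(2, 3, 4), hidden=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Channel 1: parallel CNNs over several N-gram window sizes;
        # max-pooled outputs are concatenated into a local feature vector.
        self.convs = nn.ModuleList(
            nn.Conv1d(embed_dim, num_filters, kernel_size=w) for w in windows)
        # Channel 2: BiLSTM capturing sentence-level context.
        self.bilstm = nn.LSTM(embed_dim, hidden, batch_first=True,
                              bidirectional=True)
        # Learned scorer that re-weights BiLSTM states over time
        # (a simplified stand-in for the paper's self-attention layer).
        self.attn = nn.Linear(2 * hidden, 1)
        self.fc = nn.Linear(len(windows) * num_filters + 2 * hidden,
                            num_classes)

    def forward(self, x):                       # x: (batch, seq_len) token ids
        e = self.embed(x)                       # (batch, seq_len, embed_dim)
        c = e.transpose(1, 2)                   # Conv1d expects (batch, C, L)
        local = torch.cat(
            [F.relu(conv(c)).max(dim=2).values for conv in self.convs],
            dim=1)                              # (batch, |windows|*num_filters)
        h, _ = self.bilstm(e)                   # (batch, seq_len, 2*hidden)
        w = torch.softmax(self.attn(h).squeeze(-1), dim=1)  # (batch, seq_len)
        glob = torch.bmm(w.unsqueeze(1), h).squeeze(1)      # (batch, 2*hidden)
        # Fuse both channels; return logits (softmax is applied by
        # nn.CrossEntropyLoss during training).
        return self.fc(torch.cat([local, glob], dim=1))

# Smoke test with hypothetical sizes: 10-class task, batch of 4, length 50.
logits = DCCLSketch(vocab_size=5000, num_classes=10)(
    torch.randint(0, 5000, (4, 50)))
print(logits.shape)  # torch.Size([4, 10])
```

Returning raw logits and deferring softmax to the loss function is a standard numerical-stability choice; the abstract's final softmax layer corresponds to applying `logits.softmax(dim=1)` at inference time.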
Pages: 1981-1992
Page count: 12
Related papers
50 records in total
  • [1] Dual-axial self-attention network for text classification
    Zhang, Xiaochuan
    Qiu, Xipeng
    Pang, Jianmin
    Liu, Fudong
    Li, Xingwei
    [J]. SCIENCE CHINA-INFORMATION SCIENCES, 2021, 64 (12): 80 - 90
  • [2] Quantum self-attention neural networks for text classification
    Li, Guangxi
    Zhao, Xuanqiang
    Wang, Xin
    [J]. SCIENCE CHINA-INFORMATION SCIENCES, 2024, 67 (04): 301 - 313
  • [3] A Self-attention Based LSTM Network for Text Classification
    Jing, Ran
    [J]. 2019 3RD INTERNATIONAL CONFERENCE ON CONTROL ENGINEERING AND ARTIFICIAL INTELLIGENCE (CCEAI 2019), 2019, 1207
  • [4] Multiple Positional Self-Attention Network for Text Classification
    Dai, Biyun
    Li, Jinlong
    Xu, Ruoyi
    [J]. THIRTY-FOURTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THE THIRTY-SECOND INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE AND THE TENTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2020, 34 : 7610 - 7617
  • [5] DCAT: Combining Multisemantic Dual-Channel Attention Fusion for Text Classification
    Dong, Kaifang
    Liu, Yifan
    Xu, Fuyong
    Liu, Peiyu
    [J]. IEEE INTELLIGENT SYSTEMS, 2023, 38 (04) : 10 - 19
  • [6] Deep Pyramid Convolutional Neural Network Integrated with Self-attention Mechanism and Highway Network for Text Classification
    Li, Xuewei
    Ning, Hongyun
    [J]. 4TH INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE APPLICATIONS AND TECHNOLOGIES (AIAAT 2020), 2020, 1642
  • [7] Deformable Self-Attention for Text Classification
    Ma, Qianli
    Yan, Jiangyue
    Lin, Zhenxi
    Yu, Liuhong
    Chen, Zipeng
    [J]. IEEE-ACM TRANSACTIONS ON AUDIO SPEECH AND LANGUAGE PROCESSING, 2021, 29 : 1570 - 1581