Text Classification Based on Neural Network Fusion

Cited: 1
Authors
Kim, Deageon [1 ]
Affiliation
[1] Dongseo Univ, Architectural Engn, 47 Jurye Ro, Busan 47011, South Korea
Source
TEHNICKI GLASNIK-TECHNICAL JOURNAL | 2023, Vol. 17, Issue 03
Funding
National Research Foundation of Singapore;
Keywords
attention mechanism; deep learning; neural network; presentation; text classification;
DOI
10.31803/tg-20221228154330
Chinese Library Classification (CLC)
T [Industrial Technology];
Discipline Code
08;
Abstract
The goal of text classification is to identify the category to which a text belongs. Text categorization is widely used in email detection, sentiment analysis, topic labeling and other fields. However, good text representation is key to improving the performance of NLP tasks. Traditional text representation adopts a bag-of-words or vector space model, which discards the contextual information of the text and suffers from high dimensionality and sparsity. In recent years, with the growth of data and improvements in computing power, the use of deep learning to represent and classify texts has attracted great attention. Convolutional Neural Networks (CNN), Recurrent Neural Networks (RNN) and RNNs with an attention mechanism have been used to represent text for classification and other NLP tasks, all performing better than traditional methods. In this paper, we design two sentence-level models based on deep networks, as follows: (1) A text representation and classification model based on a bidirectional RNN and a CNN (BRCNN). BRCNN takes as input the word vector of each word in the sentence; the RNN first extracts word-order information, and a CNN then extracts higher-level sentence features. After convolution, max pooling produces the sentence vector, and finally a softmax classifier performs the classification. The RNN captures the word-order information in sentences, while the CNN extracts useful features. Experiments on eight text classification tasks show that the BRCNN model obtains better text feature representations, with classification accuracy equal to or higher than the prior art. (2) An attention-mechanism-and-CNN (ACNN) model uses an RNN with an attention mechanism to obtain context vectors; a CNN then extracts higher-level feature information, max pooling produces the sentence vector, and finally a softmax classifier assigns the label. Experiments on eight text classification benchmark data sets show that ACNN improves the stability of model convergence and converges to an optimal or locally optimal solution better than BRCNN.
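The BRCNN pipeline described above (word vectors → bidirectional RNN → convolution → max-over-time pooling → softmax) can be sketched as follows. This is a minimal PyTorch illustration with hypothetical layer sizes and hyperparameters, not the authors' implementation:

```python
# Sketch of the BRCNN pipeline: BiRNN -> CNN -> max-over-time pool -> classifier.
# All dimensions below are illustrative assumptions, not the paper's settings.
import torch
import torch.nn as nn

class BRCNN(nn.Module):
    def __init__(self, vocab_size=1000, embed_dim=50, hidden=32,
                 n_filters=16, kernel_size=3, n_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Bidirectional RNN captures word-order context in both directions.
        self.rnn = nn.GRU(embed_dim, hidden, bidirectional=True, batch_first=True)
        # CNN over the RNN outputs extracts higher-level n-gram features.
        self.conv = nn.Conv1d(2 * hidden, n_filters, kernel_size)
        self.fc = nn.Linear(n_filters, n_classes)

    def forward(self, token_ids):              # (batch, seq_len)
        x = self.embed(token_ids)              # (batch, seq_len, embed_dim)
        x, _ = self.rnn(x)                     # (batch, seq_len, 2*hidden)
        x = self.conv(x.transpose(1, 2))       # (batch, n_filters, seq_len-k+1)
        x = x.max(dim=2).values                # max-over-time pool -> sentence vector
        return self.fc(x)                      # logits; softmax is applied in the loss

model = BRCNN()
logits = model(torch.randint(0, 1000, (4, 20)))  # 4 sentences of 20 token ids
print(logits.shape)                              # one logit vector per sentence
```

The ACNN variant would differ only in the first stage, replacing the plain bidirectional RNN outputs with attention-weighted context vectors before the convolution.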
Pages: 359-366
Page count: 8