A Novel Method of Text Representation on Hybrid Neural Networks

Cited by: 0
Authors
Guo, Yanbu [1]
Jin, Chen [1]
Li, Weihua [1]
Ji, Chen [1]
Fang, Yuan [1]
Duan, Yunhao [1]
Affiliations
[1] Yunnan Univ, Sch Informat Sci & Engn, Kunming, Yunnan, Peoples R China
Funding
U.S. National Science Foundation;
Keywords
Text analysis; text representation; Bidirectional Long Short-Term Memory Neural Network; Convolutional Neural Network; word embedding;
DOI
Not available
CLC number
R318 [Biomedical Engineering];
Discipline classification code
0831;
Abstract
Text representation is one of the fundamental problems in text analysis tasks. The key to text representation is extracting and expressing the semantic and syntactic features of texts. Order-sensitive sequence models based on neural networks have made great progress in text representation. Bidirectional Long Short-Term Memory (BiLSTM) neural networks, an extension of Recurrent Neural Networks (RNN), can not only handle variable-length texts and capture their long-term dependencies, but also model both forward and backward sequence contexts. Moreover, Convolutional Neural Networks (CNN) can extract additional semantic and structural information from texts through their convolution and pooling operations. This paper proposes a hybrid model that combines a BiLSTM with 2-dimensional convolution and 1-dimensional pooling operations: the model first captures an abstract representation vector of the text with the BiLSTM, and then extracts semantic features with the 2-dimensional convolution and 1-dimensional pooling operations. Experiments on text classification tasks show that our method achieves favorable performance compared with state-of-the-art models on the MR1 sentence polarity dataset.
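The record gives no implementation details, so the following is only a minimal PyTorch sketch of the pipeline the abstract describes (embedding, BiLSTM, 2-dimensional convolution, 1-dimensional pooling, classifier). The class name and all hyperparameters (vocabulary size, embedding and hidden dimensions, filter count, kernel size, number of classes) are illustrative assumptions, not values from the paper.

    # Sketch: BiLSTM followed by 2-D convolution and 1-D pooling for
    # sentence classification. Hyperparameters are assumed, not the paper's.
    import torch
    import torch.nn as nn

    class BiLSTMConvPool(nn.Module):
        def __init__(self, vocab_size=10000, embed_dim=128, hidden_dim=100,
                     num_filters=64, kernel_size=3, num_classes=2):
            super().__init__()
            self.embedding = nn.Embedding(vocab_size, embed_dim)
            # BiLSTM models forward and backward context over the word sequence.
            self.bilstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                                  bidirectional=True)
            # 2-D convolution over the (time, feature) map the BiLSTM produces.
            self.conv = nn.Conv2d(1, num_filters, (kernel_size, kernel_size))
            feat_w = 2 * hidden_dim - kernel_size + 1  # width after the conv
            self.fc = nn.Linear(num_filters * feat_w, num_classes)

        def forward(self, x):                  # x: (batch, seq_len) token ids
            emb = self.embedding(x)            # (batch, seq_len, embed_dim)
            h, _ = self.bilstm(emb)            # (batch, seq_len, 2*hidden_dim)
            c = torch.relu(self.conv(h.unsqueeze(1)))  # (batch, filters, T', W')
            p = c.max(dim=2).values            # 1-D max pooling over time only
            return self.fc(p.flatten(1))       # (batch, num_classes) logits

    model = BiLSTMConvPool()
    tokens = torch.randint(0, 10000, (4, 20))  # a batch of 4 padded sentences
    logits = model(tokens)                     # (4, 2) polarity scores

Pooling over the time axis alone (rather than over both axes) is one plausible reading of "1-dimensional pooling" after a 2-dimensional convolution; it keeps the feature axis intact for the final classifier.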
Pages: 6