Abstractive Text Summarization via Stacked LSTM

Cited by: 0
Authors
Siddhartha, Ireddy [1 ]
Zhan, Huixin [1 ]
Sheng, Victor S. [1 ]
Affiliations
[1] Texas Tech Univ, Dept Comp Sci, Lubbock, TX 79409 USA
Keywords
Summarization; Seq2seq; Stacked LSTM
DOI
10.1109/CSCI54926.2021.00143
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
In the past, many models have been proposed for text summarization via sequence-to-sequence (seq2seq) training, attention mechanisms, and transformers. Although these methods advance performance, they fail to build a sufficiently complex feature representation of the current input, and consequently perform poorly when modeling long, staggered sentences and complex inter-sentence dependencies. To address this issue, we build a more complex feature representation for summarization via stacked LSTMs. The main reason for stacking LSTMs is to allow for greater model complexity: starting from a simple encoder, we stack layers to create a hierarchical feature representation with attention. We generate a summary for any test text by predicting the target sequence. With the proposed method, we achieve better performance than the existing state-of-the-art phrase-based system on the task of text summarization on the Gigaword dataset. Experimental results on this dataset show that our framework performs well in terms of various ROUGE scores.
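The record does not include the paper's code. As a rough sketch of the architecture the abstract describes (a stacked LSTM encoder whose final state initializes a stacked LSTM decoder, with attention over the encoder states), the PyTorch snippet below may help; the class name, dimensions, three-layer depth, and dot-product attention are illustrative assumptions, not details taken from the paper.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class StackedLSTMSummarizer(nn.Module):
        """Sketch of a seq2seq summarizer: stacked LSTM encoder + attentive decoder."""
        def __init__(self, vocab_size=10000, emb_dim=128, hid_dim=256, layers=3):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, emb_dim)
            # Stacking LSTM layers builds the hierarchical feature
            # representation of the source text described in the abstract.
            self.encoder = nn.LSTM(emb_dim, hid_dim, num_layers=layers, batch_first=True)
            self.decoder = nn.LSTM(emb_dim, hid_dim, num_layers=layers, batch_first=True)
            # Decoder state concatenated with attention context -> vocabulary logits.
            self.out = nn.Linear(hid_dim * 2, vocab_size)

        def forward(self, src_ids, tgt_ids):
            enc_out, state = self.encoder(self.embed(src_ids))      # (B, T_src, H)
            dec_out, _ = self.decoder(self.embed(tgt_ids), state)   # (B, T_tgt, H)
            # Dot-product attention over the encoder's top-layer states
            # (the paper does not specify the attention variant; this is assumed).
            scores = torch.bmm(dec_out, enc_out.transpose(1, 2))     # (B, T_tgt, T_src)
            context = torch.bmm(F.softmax(scores, dim=-1), enc_out)  # (B, T_tgt, H)
            return self.out(torch.cat([dec_out, context], dim=-1))   # logits

    # Toy usage with random token ids standing in for an article/headline pair;
    # a real training loop would shift tgt_ids by one step for teacher forcing.
    model = StackedLSTMSummarizer()
    src = torch.randint(0, 10000, (2, 30))
    tgt = torch.randint(0, 10000, (2, 8))
    logits = model(src, tgt)                                   # (2, 8, 10000)
    loss = F.cross_entropy(logits.reshape(-1, 10000), tgt.reshape(-1))

At test time, decoding would proceed token by token (e.g., greedy or beam search) to predict the target sequence, as the abstract describes.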
Pages: 437-442
Number of pages: 6
Related Papers (50 in total)
  • [1] An Optimized Abstractive Text Summarization Model Using Peephole Convolutional LSTM
    Rahman, Md Motiur
    Siddiqui, Fazlul Hasan
    [J]. SYMMETRY-BASEL, 2019, 11 (10):
  • [2] Abstractive text summarization using LSTM-CNN based deep learning
    Song, Shengli
    Huang, Haitao
    Ruan, Tongxiao
    [J]. MULTIMEDIA TOOLS AND APPLICATIONS, 2019, 78 (01) : 857 - 875
  • [3] Multi-layered attentional peephole convolutional LSTM for abstractive text summarization
    Rahman, Md Motiur
    Siddiqui, Fazlul Hasan
    [J]. ETRI JOURNAL, 2021, 43 (02) : 288 - 298
  • [4] An approach to Abstractive Text Summarization
    Huong Thanh Le
    Tien Manh Le
    [J]. 2013 INTERNATIONAL CONFERENCE OF SOFT COMPUTING AND PATTERN RECOGNITION (SOCPAR), 2013, : 371 - 376
  • [5] Abstractive text summarization for Hungarian
    Yang, Zijian Gyozo
    Agocs, Adam
    Kusper, Gabor
    Varadi, Tamas
    [J]. ANNALES MATHEMATICAE ET INFORMATICAE, 2021, 53 : 299 - 316
  • [6] A Survey on Abstractive Text Summarization
    Moratanch, N.
    Chitrakala, S.
    [J]. PROCEEDINGS OF IEEE INTERNATIONAL CONFERENCE ON CIRCUIT, POWER AND COMPUTING TECHNOLOGIES (ICCPCT 2016), 2016,
  • [7] Survey on Abstractive Text Summarization
    Raphal, Nithin
    Duwarah, Hemanta
    Daniel, Philemon
    [J]. PROCEEDINGS OF THE 2018 IEEE INTERNATIONAL CONFERENCE ON COMMUNICATION AND SIGNAL PROCESSING (ICCSP), 2018, : 513 - 517
  • [8] Abstractive Text Summarization Using Hybrid Technique of Summarization
    Liaqat, Muhammad Irfan
    Hamid, Isma
    Nawaz, Qamar
    Shafique, Nida
    [J]. 2022 14TH INTERNATIONAL CONFERENCE ON COMMUNICATION SOFTWARE AND NETWORKS (ICCSN 2022), 2022, : 141 - 144
  • [9] Dual Encoding for Abstractive Text Summarization
    Yao, Kaichun
    Zhang, Libo
    Du, Dawei
    Luo, Tiejian
    Tao, Lili
    Wu, Yanjun
    [J]. IEEE TRANSACTIONS ON CYBERNETICS, 2020, 50 (03) : 985 - 996