Abstract Text Summarization with a Convolutional Seq2seq Model

Cited by: 34
Authors
Zhang, Yong [1 ]
Li, Dan [1 ]
Wang, Yuheng [1 ]
Fang, Yang [1 ]
Xiao, Weidong [1 ]
Affiliations
[1] Natl Univ Def Technol, Sci & Technol Informat Syst Engn Lab, Changsha 410073, Hunan, Peoples R China
Source
APPLIED SCIENCES-BASEL | 2019, Vol. 9, Iss. 8
Keywords
abstract text summarization; convolutional neural network; Seq2seq model;
DOI
10.3390/app9081665
CLC classification number
O6 [Chemistry];
Subject classification code
0703 ;
Abstract
Text summarization aims to produce highly condensed output that conveys the main ideas of a text. Most previous research focuses on extractive models. In this work, we put forward a new generative model based on the convolutional seq2seq architecture. The hierarchical CNN framework is much more efficient than conventional RNN seq2seq models. We also equip our model with a copying mechanism to handle rare or unseen words, and we incorporate a hierarchical attention mechanism to model keywords and key sentences simultaneously. Finally, we evaluate our model on two real-life datasets, the GigaWord and DUC corpora. The experimental results verify the effectiveness of our model, which consistently and statistically significantly outperforms state-of-the-art alternatives.
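The copying mechanism mentioned in the abstract can be illustrated with a minimal sketch. This is not the authors' code; it assumes the common pointer-generator formulation, where the final word distribution mixes the decoder's vocabulary softmax with the attention distribution over source tokens, weighted by a generation gate `p_gen`. All names here are illustrative.

```python
def copy_mechanism_distribution(vocab_probs, source_tokens, attention_weights, p_gen):
    """Combine generation and copy distributions (hypothetical sketch).

    vocab_probs: dict mapping word -> probability from the decoder softmax
    source_tokens: list of words in the source document
    attention_weights: attention scores over source_tokens (sum to 1)
    p_gen: probability of generating from the vocabulary rather than copying
    """
    # Scale the vocabulary distribution by the generation probability.
    final = {w: p_gen * p for w, p in vocab_probs.items()}
    # Add copy probability mass from the attention over the source;
    # rare/unseen source words get mass even if absent from the vocabulary.
    for token, weight in zip(source_tokens, attention_weights):
        final[token] = final.get(token, 0.0) + (1.0 - p_gen) * weight
    return final

# Toy example: "Zylok" is out-of-vocabulary but can still be copied.
vocab = {"the": 0.6, "cat": 0.4}
dist = copy_mechanism_distribution(vocab, ["Zylok", "cat"], [0.7, 0.3], 0.8)
```

In the example, the out-of-vocabulary token "Zylok" receives probability (1 - 0.8) * 0.7 = 0.14 purely through copying, which is how such a mechanism handles rare or unseen words.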
Pages: 13
Related Papers
50 records total
  • [1] A Hierarchical Attention Seq2seq Model with CopyNet for Text Summarization
    Zhang, Yong
    Wang, Yuheng
    Liao, Jinzhi
    Xiao, Weidong
    2018 INTERNATIONAL CONFERENCE ON ROBOTS & INTELLIGENT SYSTEM (ICRIS 2018), 2018, : 316 - 320
  • [2] CFCSS : Based on CF Network Convolutional Seq2Seq Model for Abstractive Summarization
    Liang, Qingmin
    Lu, Ling
    Chang, Tianji
    Yang, Wu
    PROCEEDINGS OF THE 15TH IEEE CONFERENCE ON INDUSTRIAL ELECTRONICS AND APPLICATIONS (ICIEA 2020), 2020, : 1160 - 1164
  • [3] Abstractive social media text summarization using selective reinforced Seq2Seq attention model
    Liang, Zeyu
    Du, Junping
    Li, Chaoyang
    NEUROCOMPUTING, 2020, 410 : 432 - 440
  • [4] Enhanced Seq2Seq Autoencoder via Contrastive Learning for Abstractive Text Summarization
    Zheng, Chujie
    Zhang, Kunpeng
    Wang, Harry Jiannan
    Fan, Ling
    Wang, Zhe
    2021 IEEE INTERNATIONAL CONFERENCE ON BIG DATA (BIG DATA), 2021, : 1764 - 1771
  • [5] A Chinese text corrector based on seq2seq model
    Gu, Sunyan
    Lang, Fei
    2017 INTERNATIONAL CONFERENCE ON CYBER-ENABLED DISTRIBUTED COMPUTING AND KNOWLEDGE DISCOVERY (CYBERC), 2017, : 322 - 325
  • [6] Abstractive Summarization Model with a Feature-Enhanced Seq2Seq Structure
    Hao, Zepeng
    Ji, Jingzhou
    Xie, Tao
    Xue, Bin
    2020 5TH ASIA-PACIFIC CONFERENCE ON INTELLIGENT ROBOT SYSTEMS (ACIRS 2020), 2020, : 163 - 167
  • [7] A Study on Hierarchical Text Classification as a Seq2seq Task
    Torba, Fatos
    Gravier, Christophe
    Laclau, Charlotte
    Kammoun, Abderrhammen
    Subercaze, Julien
    ADVANCES IN INFORMATION RETRIEVAL, ECIR 2024, PT III, 2024, 14610 : 287 - 296
  • [8] Seq2Seq models for recommending short text conversations
    Torres, Johnny
    Vaca, Carmen
    Teran, Luis
    Abad, Cristina L.
    EXPERT SYSTEMS WITH APPLICATIONS, 2020, 150
  • [9] Seq2Seq dynamic planning network for progressive text generation
    Wu, Di
    Cheng, Peng
    Zheng, Yuying
    COMPUTER SPEECH AND LANGUAGE, 2025, 89
  • [10] Residual Seq2Seq model for Building energy management
    Kim, Marie
    Kim, Nae-soo
    Song, YuJin
    Pyo, Cheol Sig
    2019 10TH INTERNATIONAL CONFERENCE ON INFORMATION AND COMMUNICATION TECHNOLOGY CONVERGENCE (ICTC): ICT CONVERGENCE LEADING THE AUTONOMOUS FUTURE, 2019, : 1126 - 1128