Enhancements of Attention-Based Bidirectional LSTM for Hybrid Automatic Text Summarization

Cited by: 10
Authors
Jiang, Jiawen [1 ]
Zhang, Haiyang [2 ]
Dai, Chenxu [1 ]
Zhao, Qingjuan [3 ]
Feng, Hao [1 ]
Ji, Zhanlin [1 ,4 ]
Ganchev, Ivan [4 ,5 ,6 ]
Affiliations
[1] North China Univ Sci & Technol, Coll Artificial Intelligence, Dept Comp Sci, Tangshan 063009, Peoples R China
[2] Univ Sheffield, Dept Comp Sci, Sheffield S10 2TN, S Yorkshire, England
[3] Beihang Univ, Dept Comp Sci & Engn, Beijing 100191, Peoples R China
[4] Univ Limerick, Telecommun Res Ctr TRC, Limerick V94 T9PX, Ireland
[5] Plovdiv Univ Paisii Hilendarski, Dept Comp Syst, Plovdiv 4000, Bulgaria
[6] Bulgarian Acad Sci BAS, Inst Math & Informat IMI, Sofia 1040, Bulgaria
Keywords
Decoding; Natural language processing; Data models; Computational modeling; Task analysis; Semantics; Encoding; Natural language processing (NLP); automatic text summarization (ATS); sequence-to-sequence (Seq2Seq) model; attention mechanism; bidirectional LSTM (Bi-LSTM); pointer network; coverage mechanism; mixed learning objective (MLO) function
DOI
10.1109/ACCESS.2021.3110143
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812;
Abstract
Automatic text summary generation is the task of producing a short summary of a relatively long text document by capturing its key information. In the past, supervised statistical machine learning was widely used for the Automatic Text Summarization (ATS) task, but due to its high dependence on the quality of text features, the generated summaries lacked accuracy and coherence, while the computational power required and the performance achieved could not easily meet current needs. This paper proposes four novel ATS models with a Sequence-to-Sequence (Seq2Seq) structure, utilizing an attention-based bidirectional Long Short-Term Memory (LSTM), with enhancements for increasing the correlation between the generated summary and the source text, solving the problem of out-of-vocabulary (OOV) words, suppressing repeated words, and preventing the propagation of cumulative errors in generated summaries. Experiments conducted on two public datasets confirm that the proposed ATS models indeed achieve better performance than the baselines and some of the state-of-the-art models considered.
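To make the described architecture concrete, the following is a minimal sketch (not the authors' released code) of an attention-based bidirectional LSTM encoder together with one coverage-aware additive-attention step, of the kind used in such Seq2Seq ATS models. The use of PyTorch, all layer sizes, and all class and function names here are illustrative assumptions; the pointer network and the mixed learning objective (MLO) function mentioned in the keywords are omitted for brevity.

# Illustrative sketch only; hyperparameters and names are assumptions,
# not the paper's implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class BiLSTMEncoder(nn.Module):
    """Embeds source token ids and encodes them with a bidirectional LSTM."""
    def __init__(self, vocab_size=10000, emb_dim=128, hid_dim=256):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hid_dim, batch_first=True,
                            bidirectional=True)

    def forward(self, src_ids):
        # src_ids: (batch, src_len) -> outputs: (batch, src_len, 2 * hid_dim)
        emb = self.embedding(src_ids)
        outputs, state = self.lstm(emb)
        return outputs, state

class CoverageAttention(nn.Module):
    """Additive (Bahdanau-style) attention with a coverage term, i.e. the
    running sum of past attention weights, which discourages attending to
    the same source positions repeatedly."""
    def __init__(self, enc_dim=512, dec_dim=256, attn_dim=256):
        super().__init__()
        self.w_enc = nn.Linear(enc_dim, attn_dim, bias=False)
        self.w_dec = nn.Linear(dec_dim, attn_dim, bias=False)
        self.w_cov = nn.Linear(1, attn_dim, bias=False)
        self.v = nn.Linear(attn_dim, 1, bias=False)

    def forward(self, enc_outputs, dec_state, coverage):
        # enc_outputs: (batch, src_len, enc_dim); dec_state: (batch, dec_dim)
        # coverage:    (batch, src_len), sum of previous attention weights
        scores = self.v(torch.tanh(self.w_enc(enc_outputs)
                                   + self.w_dec(dec_state).unsqueeze(1)
                                   + self.w_cov(coverage.unsqueeze(-1))))
        attn = F.softmax(scores.squeeze(-1), dim=-1)           # (batch, src_len)
        context = torch.bmm(attn.unsqueeze(1), enc_outputs).squeeze(1)
        return context, attn, coverage + attn                  # updated coverage

if __name__ == "__main__":
    enc = BiLSTMEncoder()
    attn = CoverageAttention()
    src = torch.randint(0, 10000, (2, 20))                     # toy batch of ids
    enc_out, _ = enc(src)
    ctx, weights, cov = attn(enc_out, torch.zeros(2, 256),     # dummy decoder state
                             torch.zeros(2, 20))               # initial coverage
    print(ctx.shape, weights.shape, cov.shape)                 # (2, 512) (2, 20) (2, 20)

In a full pointer-generator style summarizer, the returned attention weights would additionally feed a copy distribution over source tokens (handling OOV words), and the accumulated coverage vector would enter a coverage loss that penalizes repeated attention, which is one common way the repetition problem mentioned above is addressed.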
Pages: 123660-123671
Number of pages: 12
Related Papers
50 records in total
  • [1] Text Summarization of Articles Using LSTM and Attention-Based LSTM
    Kumar, Harsh
    Kumar, Gaurav
    Singh, Shaivye
    Paul, Sourav
    [J]. MACHINE LEARNING AND AUTONOMOUS SYSTEMS, 2022, 269 : 133 - 145
  • [2] AB-LSTM: Attention-based Bidirectional LSTM Model for Scene Text Detection
    Liu, Zhandong
    Zhou, Wengang
    Li, Houqiang
    [J]. ACM TRANSACTIONS ON MULTIMEDIA COMPUTING COMMUNICATIONS AND APPLICATIONS, 2019, 15 (04)
  • [3] Describing Video With Attention-Based Bidirectional LSTM
    Bin, Yi
    Yang, Yang
    Shen, Fumin
    Xie, Ning
    Shen, Heng Tao
    Li, Xuelong
    [J]. IEEE TRANSACTIONS ON CYBERNETICS, 2019, 49 (07) : 2631 - 2641
  • [4] Attention-based bidirectional LSTM for Chinese punctuation prediction
    Li, Jinliang
    Yin, Chengfeng
    Jia, Zhen
    Li, Tianrui
    Tang, Min
    [J]. DATA SCIENCE AND KNOWLEDGE ENGINEERING FOR SENSING DECISION SUPPORT, 2018, 11 : 485 - 491
  • [5] Attention-based bidirectional LSTM for Chinese punctuation prediction
    Li, Jinliang
    Yin, Chengfeng
    Jia, Zhen
    Li, Tianrui
    Tang, Min
    [J]. DATA SCIENCE AND KNOWLEDGE ENGINEERING FOR SENSING DECISION SUPPORT, 2018, 11 : 708 - 714
  • [6] Automatic Misogyny Detection in Social Media Platforms using Attention-based Bidirectional-LSTM
    Rahali, Abir
    Akhloufi, Moulay A.
    Therien-Daniel, Anne-Marie
    Brassard-Gourdeau, Eloi
    [J]. 2021 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN, AND CYBERNETICS (SMC), 2021, : 2706 - 2711
  • [7] Attention-based LSTM for Automatic Evaluation of Press Conferences
    Yi, Shengzhou
    Mochitomi, Koshiro
    Suzuki, Isao
    Wang, Xueting
    Yamasaki, Toshihiko
    [J]. THIRD INTERNATIONAL CONFERENCE ON MULTIMEDIA INFORMATION PROCESSING AND RETRIEVAL (MIPR 2020), 2020, : 191 - 196
  • [8] Hybrid attention-based temporal convolutional bidirectional LSTM approach for wind speed interval prediction
    Bommidi, Bala Saibabu
    Kosana, Vishalteja
    Teeparthi, Kiran
    Madasthu, Santhosh
    [J]. ENVIRONMENTAL SCIENCE AND POLLUTION RESEARCH, 2023, 30 (14) : 40018 - 40030
  • [9] Outpatient Text Classification Using Attention-Based Bidirectional LSTM for Robot-Assisted Servicing in Hospital
    Chen, Che-Wen
    Tseng, Shih-Pang
    Kuan, Ta-Wen
    Wang, Jhing-Fa
    [J]. INFORMATION, 2020, 11 (02)