Multi-level shared-weight encoding for abstractive sentence summarization

Cited by: 0
Authors
Lal, Daisy Monika [1 ]
Singh, Krishna Pratap [1 ]
Tiwary, Uma Shanker [2 ]
Affiliations
[1] Machine Learning and Optimization Lab, Department of IT, IIIT Allahabad, Prayagraj 211012, Uttar Pradesh, India
[2] Speech Image and Language Processing Lab, Department of IT, IIIT Allahabad, Prayagraj 211012, Uttar Pradesh, India
Keywords
Signal encoding; Encoding (symbols)
DOI
Not available
Abstract
Features in a text are hierarchically structured and may not be optimally learned with one-step encoding. Scrutinizing a text several times facilitates a better understanding of its content and helps frame faithful context representations. The proposed model encapsulates the idea of re-examining a piece of text multiple times to grasp the underlying theme and aspects of English grammar before formulating a summary. We propose a multi-level shared-weight encoder (MSE) that focuses exclusively on the sentence summarization task. MSE uses a weight-sharing mechanism to regulate the multi-level encoding process efficiently. Weight sharing helps recognize patterns left undiscovered by a single-level encoding strategy. We perform experiments with six encoding levels with weight sharing on the well-known Gigaword and DUC2004 Task 1 sentence summarization datasets. The experiments show that MSE generates more readable (fluent) summaries (Rouge-L score) than multiple benchmark models while preserving similar levels of informativeness (Rouge-1 and Rouge-2 scores). Moreover, human evaluation of the generated abstracts corroborates these claims of enhanced readability. © 2021, The Author(s), under exclusive licence to Springer-Verlag London Ltd., part of Springer Nature.
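The core idea in the abstract, re-encoding the same text at several levels while every level reuses one set of encoder weights, can be illustrated with a minimal sketch. The layer shape (a single linear transform with a tanh nonlinearity) and the function names below are illustrative assumptions, not the paper's actual architecture:

```python
import math

def shared_weight_encode(x, weights, bias, levels):
    """Re-apply one encoder layer `levels` times; every level reuses
    the same `weights` and `bias` (the weight-sharing mechanism)."""
    for _ in range(levels):
        # linear transform + tanh as a stand-in for a real encoder layer
        x = [math.tanh(sum(w * xi for w, xi in zip(row, x)) + b)
             for row, b in zip(weights, bias)]
    return x

# Toy 2-dimensional example: the same parameters are reused across
# six levels, mirroring the six encoding levels in the experiments.
W = [[0.5, -0.2], [0.1, 0.3]]
b = [0.0, 0.1]
h = shared_weight_encode([1.0, -1.0], W, b, levels=6)
```

Because every level shares one parameter set, adding levels deepens the encoding without growing the number of parameters, which is the efficiency argument behind weight sharing.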
Pages: 2965-2981
Related papers (50 in total)
  • [21] ALS-MRS: Incorporating aspect-level sentiment for abstractive multi-review summarization
    Zhao, Qingjuan
    Niu, Jianwei
    Liu, Xuefeng
    [J]. KNOWLEDGE-BASED SYSTEMS, 2022, 258
  • [23] Multi-Level Sequential Pattern Mining Based on Prime Encoding
    Sun Lianglei
    Li Yun
    Yin Jiang
    [J]. 2010 INTERNATIONAL COLLOQUIUM ON COMPUTING, COMMUNICATION, CONTROL, AND MANAGEMENT (CCCM2010), VOL I, 2010, : 458 - 461
  • [24] Multi-Level Sequential Pattern Mining Based on Prime Encoding
    Sun Lianglei
    Li Yun
    Yin Jiang
    [J]. INTERNATIONAL CONFERENCE ON APPLIED PHYSICS AND INDUSTRIAL ENGINEERING 2012, PT C, 2012, 24 : 1749 - 1756
  • [25] MSG-ATS: Multi-Level Semantic Graph for Arabic Text Summarization
    Salam, Mustafa Abdul
    Aldawsari, Mohamed
    Gamal, Mostafa
    Hamed, Hesham F. A.
    Sweidan, Sara
    [J]. IEEE ACCESS, 2024, 12 : 118773 - 118784
  • [26] Exploring the multi-level influence of shared leadership on workplace spirituality in teams
    Prabhu, Nandan
    Modem, Roopa
    [J]. INTERNATIONAL JOURNAL OF ORGANIZATIONAL ANALYSIS, 2023, 31 (06) : 2059 - 2080
  • [27] Multi-level Shared Knowledge Guided Learning for Knowledge Graph Completion
    Shan, Yongxue
    Zhou, Jie
    Peng, Jie
    Zhou, Xin
    Yin, Jiaqian
    Wang, Xiaodong
    [J]. TRANSACTIONS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, 2024, 12 : 1027 - 1042
  • [28] Multi-level weighted sequential pattern mining based on prime encoding
    Li Y.
    Sun L.
    Yin J.
    Bao W.
    Gu M.
    [J]. International Journal of Digital Content Technology and its Applications, 2010, 4 (09) : 8 - 16
  • [29] Image Encoding Using Multi-Level DNA Barcodes with Nanopore Readout
    Zhu, Jinbo
    Ermann, Niklas
    Chen, Kaikai
    Keyser, Ulrich F.
    [J]. SMALL, 2021, 17 (28)
  • [30] Neuron inspired data encoding memristive multi-level memory cell
    Aidana Irmanova
    Alex Pappachen James
    [J]. Analog Integrated Circuits and Signal Processing, 2018, 95 : 429 - 434