Guided Neural Language Generation for Abstractive Summarization using Abstract Meaning Representation

Cited by: 0
Authors
Hardy [1 ]
Vlachos, Andreas [1 ]
Affiliations
[1] University of Sheffield, Sheffield, South Yorkshire, England
Funding
UK Engineering and Physical Sciences Research Council (EPSRC); European Union Horizon 2020;
Keywords
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
Recent work on abstractive summarization has made progress with neural encoder-decoder architectures. However, such models are often limited by their lack of explicit semantic modeling of the source document and its summary. In this paper, we extend previous work on abstractive summarization using Abstract Meaning Representation (AMR) with a neural language generation stage which we guide using the source document. We demonstrate that this guidance improves summarization results by 7.4 and 10.5 points in ROUGE-2, using gold-standard AMR parses and parses obtained from an off-the-shelf parser, respectively. We also find that the summarization performance using the latter is 2 ROUGE-2 points higher than that of a well-established neural encoder-decoder approach trained on a larger dataset.
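The abstract does not describe the model in detail, but the core idea of guiding the AMR-to-text generation stage with the source document can be illustrated with a minimal, hypothetical decoding sketch in Python: at each step, a candidate token's decoder log-probability is combined with a bonus for tokens that also occur in the source document. The function name, scoring scheme, and toy data below are assumptions made for illustration, not the authors' actual model.

```python
import math
from collections import Counter

def guided_decode_step(vocab, decoder_logprobs, source_tokens, guidance_weight=1.0):
    """Re-rank the decoder's next-token distribution by rewarding tokens that
    also occur in the source document (a hypothetical scheme for illustration
    only, not the scoring used by Hardy & Vlachos)."""
    source_counts = Counter(source_tokens)
    scores = {}
    for token, logp in zip(vocab, decoder_logprobs):
        # The bonus grows sub-linearly with the token's frequency in the source.
        bonus = guidance_weight * math.log1p(source_counts[token])
        scores[token] = logp + bonus
    # Greedy choice for clarity; a beam search would keep the top-k candidates.
    return max(scores, key=scores.get)

# Toy usage: the decoder alone would pick "won" (highest log-probability),
# but the source-document bonus shifts the choice to "sheffield".
vocab = ["the", "sheffield", "team", "won"]
decoder_logprobs = [-2.5, -2.0, -2.2, -1.8]
source = "sheffield united won against wednesday in sheffield".split()
print(guided_decode_step(vocab, decoder_logprobs, source))  # -> sheffield
```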
Pages: 768-773
Number of pages: 6
Related Papers
50 records in total
  • [21] Topic level summary generation using BERT induced Abstractive Summarization Model
    Ramina, Mayank
    Darnay, Nihar
    Ludbe, Chirag
    Dhruv, Ajay
    [J]. PROCEEDINGS OF THE INTERNATIONAL CONFERENCE ON INTELLIGENT COMPUTING AND CONTROL SYSTEMS (ICICCS 2020), 2020, : 747 - 752
  • [22] Robust Subgraph Generation Improves Abstract Meaning Representation Parsing
    Werling, Keenon
    Angeli, Gabor
    Manning, Christopher D.
    [J]. PROCEEDINGS OF THE 53RD ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS AND THE 7TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING, VOL 1, 2015, : 982 - 991
  • [23] Abstractive Text Summarization Using Recurrent Neural Networks: Systematic Literature Review
    Ngoko, Israel Christian Tchouyaa
    Mukherjee, Amlan
    Kabaso, Boniface
    [J]. PROCEEDINGS OF THE 15TH INTERNATIONAL CONFERENCE ON INTELLECTUAL CAPITAL, KNOWLEDGE MANAGEMENT & ORGANISATIONAL LEARNING (ICICKM 2018), 2018, : 435 - 439
  • [24] Using Pre-Trained Language Models for Abstractive DBPEDIA Summarization: A Comparative Study
    Zahera, Hamada M.
    Vitiugin, Fedor
    Sherif, Mohamed Ahmed
    Castillo, Carlos
    Ngomo, Axel-Cyrille Ngonga
    [J]. KNOWLEDGE GRAPHS: SEMANTICS, MACHINE LEARNING, AND LANGUAGES, 2023, 56 : 19 - 37
  • [25] Semantic Summarization of Reconstructed Abstract Meaning Representation Graph Structure Based on Integer Linear Programming
    Chen H.
    Ming T.
    Liu S.
    Gao C.
    [J]. Dianzi Yu Xinxi Xuebao/Journal of Electronics and Information Technology, 2019, 41 (07): 1674 - 1681
  • [26] Semantic Summarization of Reconstructed Abstract Meaning Representation Graph Structure Based on Integer Linear Programming
    Chen Hongchang
    Ming Tuosiyu
    Liu Shuxin
    Gao Chao
    [J]. JOURNAL OF ELECTRONICS & INFORMATION TECHNOLOGY, 2019, 41 (07) : 1674 - 1681
  • [27] The NLP Techniques for Automatic Multi-article News Summarization Based on Abstract Meaning Representation
    Nagalavi, Deepa
    Hanumanthappa, M.
    [J]. EMERGING TRENDS IN EXPERT APPLICATIONS AND SECURITY, 2019, 841 : 253 - 260
  • [28] Abstractive Summarization of Korean Legal Cases using Pre-trained Language Models
    Yoon, Jiyoung
    Junaid, Muhammad
    Ali, Sajid
    Lee, Jongwuk
    [J]. PROCEEDINGS OF THE 2022 16TH INTERNATIONAL CONFERENCE ON UBIQUITOUS INFORMATION MANAGEMENT AND COMMUNICATION (IMCOM 2022), 2022,
  • [29] Contextualized Formula Search Using Abstract Meaning Representation
    Mansouri, Behrooz
    Oard, Douglas W.
    Zanibbi, Richard
    [J]. PROCEEDINGS OF THE 31ST ACM INTERNATIONAL CONFERENCE ON INFORMATION AND KNOWLEDGE MANAGEMENT, CIKM 2022, 2022, : 4329 - 4333
  • [30] Inductive Model Using Abstract Meaning Representation for Text Classification via Graph Neural Networks
    Ogawa, Takuro
    Saga, Ryosuke
    [J]. HUMAN INTERFACE AND THE MANAGEMENT OF INFORMATION, HIMI 2023, PT I, 2023, 14015 : 258 - 271