Grasping Both Query Relevance and Essential Content for Query-focused Summarization

Cited by: 0
Authors
Xiong, Ye [1]
Kamigaito, Hidetaka [1]
Murakami, Soichiro [2]
Zhang, Peinan [2]
Takamura, Hiroya [1]
Okumura, Manabu [1]
Affiliations
[1] Tokyo Inst Technol, Tokyo, Japan
[2] CyberAgent Inc, Tokyo, Japan
Keywords
Query-focused summarization; Abstractive summarization
DOI
10.1145/3626772.3657958
CLC classification number
TP18 [Artificial intelligence theory]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
Numerous effective methods have been developed to improve query-focused summarization (QFS) performance, e.g., pre-trained model-based and query-answer relevance-based methods. However, these methods still suffer from missing or redundant information because they fail to capture and effectively utilize the interrelationship between the query and the source document, as well as between the source document and its generated summary; as a result, the summary may fail to answer the query or may contain additional, unrequired information. To mitigate this problem, we propose an end-to-end hierarchical two-stage summarization model that first predicts essential content and then generates a summary by emphasizing the predicted important sentences while maintaining separate encodings for the query and the source, so that it comprehends not only the query itself but also the essential information in the source. We evaluated the proposed model on two QFS datasets, and the results indicated the overall effectiveness of the model and of each of its components.
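To make the two-stage idea in the abstract concrete, the following is a minimal illustrative sketch, not the authors' implementation: stage 1 (essential-content prediction) is approximated here by a simple lexical-overlap scorer, whereas the paper trains a neural predictor; stage 2 (summary generation with emphasis on the predicted sentences) is approximated by an off-the-shelf BART summarizer (facebook/bart-large-cnn) with the predicted sentences prepended to the input alongside the query. The model name, the scorer, and the prepending trick are all assumptions made only for illustration.

```python
# Illustrative two-stage query-focused summarization sketch.
# Stage 1: predict "essential" sentences (here: naive query-overlap proxy).
# Stage 2: generate a summary that emphasizes those sentences (here: an
# off-the-shelf BART model with the query and predicted sentences prepended).
# None of these choices reflect the paper's actual architecture.

import re
import torch
from transformers import BartForConditionalGeneration, BartTokenizer

MODEL_NAME = "facebook/bart-large-cnn"  # assumed generic summarizer
tokenizer = BartTokenizer.from_pretrained(MODEL_NAME)
model = BartForConditionalGeneration.from_pretrained(MODEL_NAME)


def split_sentences(text: str) -> list[str]:
    # Naive splitter; a real system would use a proper sentence segmenter.
    return [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]


def score_sentences(query: str, sentences: list[str]) -> list[float]:
    # Stage 1 (assumed): lexical overlap with the query as a stand-in for
    # the trained essential-content predictor described in the paper.
    q_tokens = set(query.lower().split())
    return [len(q_tokens & set(s.lower().split())) / (len(q_tokens) or 1)
            for s in sentences]


def summarize(query: str, document: str, top_k: int = 3) -> str:
    sentences = split_sentences(document)
    scores = score_sentences(query, sentences)
    top = [s for _, s in sorted(zip(scores, sentences), reverse=True)[:top_k]]

    # Stage 2 (assumed): "emphasize" the predicted sentences by prepending
    # them, keeping the query visible to the encoder alongside the source.
    source = f"Query: {query} Important: {' '.join(top)} Document: {document}"
    inputs = tokenizer(source, return_tensors="pt",
                       truncation=True, max_length=1024)
    with torch.no_grad():
        ids = model.generate(**inputs, max_length=128, num_beams=4)
    return tokenizer.decode(ids[0], skip_special_tokens=True)


if __name__ == "__main__":
    doc = ("The committee reviewed the new transit plan on Monday. "
           "Funding for bike lanes was doubled. "
           "A vote on bus routes is scheduled for next month.")
    print(summarize("What was decided about bike lanes?", doc))
```

The sketch only mirrors the pipeline shape: a selection step feeding a generation step that sees both the query and the selected content; the paper's contribution lies in training these two stages jointly with separate query and source encodings.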
Pages: 2452-2456
Number of pages: 5
相关论文
共 50 条
  • [41] Improve Query Focused Abstractive Summarization by Incorporating Answer Relevance
    Su, Dan
    Yu, Tiezheng
    Fung, Pascale
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL-IJCNLP 2021, 2021, : 3124 - 3131
  • [42] Mutually Reinforced Manifold-Ranking Based Relevance Propagation Model for Query-Focused Multi-Document Summarization
    Cai, Xiaoyan
    Li, Wenjie
    IEEE TRANSACTIONS ON AUDIO SPEECH AND LANGUAGE PROCESSING, 2012, 20 (05): : 1597 - 1607
  • [43] Exploring actor-object relationships for query-focused multi-document summarization
    Valizadeh, Mohammadreza
    Brazdil, Pavel
    SOFT COMPUTING, 2015, 19 (11) : 3109 - 3121
  • [44] Domain Adaptation with Pre-trained Transformers for Query-Focused Abstractive Text Summarization
    Laskar, Md Tahmid Rahman
    Hoque, Enamul
    Huang, Jimmy Xiangji
    COMPUTATIONAL LINGUISTICS, 2022, 48 (02) : 279 - 320
  • [45] Query-focused multi-document summarization using hypergraph-based ranking
    Xiong, Shufeng
    Ji, Donghong
    INFORMATION PROCESSING & MANAGEMENT, 2016, 52 (04) : 670 - 681
  • [46] Long-Span Language Models for Query-Focused Unsupervised Extractive Text Summarization
    Singh, Mittul
    Mishra, Arunav
    Oualil, Youssef
    Berberich, Klaus
    Klakow, Dietrich
    ADVANCES IN INFORMATION RETRIEVAL (ECIR 2018), 2018, 10772 : 657 - 664
  • [47] Can Anaphora Resolution Improve Extractive Query-Focused Multi-Document Summarization?
    Lamsiyah, Salima
    El Mahdaouy, Abdelkader
    Schommer, Christoph
    IEEE ACCESS, 2023, 11 : 99961 - 99976
  • [48] Nonfactoid Question Answering as Query-Focused Summarization With Graph-Enhanced Multihop Inference
    Deng, Yang
    Zhang, Wenxuan
    Xu, Weiwen
    Shen, Ying
    Lam, Wai
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (08) : 11231 - 11245
  • [49] Co-HITS-Ranking Based Query-Focused Multi-document Summarization
    Hu, Po
    Ji, Donghong
    Teng, Chong
    INFORMATION RETRIEVAL TECHNOLOGY, 2010, 6458 : 121 - 130
  • [50] Unsupervised Query-Focused Multi-Document Summarization using the Cross Entropy Method
    Feigenblat, Guy
    Roitman, Haggai
    Boni, Odellia
    Konopnicki, David
    SIGIR'17: PROCEEDINGS OF THE 40TH INTERNATIONAL ACM SIGIR CONFERENCE ON RESEARCH AND DEVELOPMENT IN INFORMATION RETRIEVAL, 2017, : 961 - 964