Siamese hierarchical attention networks for extractive summarization

Cited by: 8
Authors:
Gonzalez, Jose-Angel [1 ]
Segarra, Encarna [1 ]
Garcia-Granada, Fernando [1 ]
Sanchis, Emilio [1 ]
Hurtado, Lluis-F [1 ]
Affiliations:
[1] Univ Politecn Valencia, Dept Sistemes Informat & Computacio, Cami Vera Sn, Valencia, Spain
Keywords:
Siamese neural networks; hierarchical attention networks; automatic text summarization; TEXT;
DOI: 10.3233/JIFS-179011
Chinese Library Classification: TP18 [Artificial Intelligence Theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract
In this paper, we present an extractive approach to document summarization based on Siamese Neural Networks. Specifically, we propose the use of Hierarchical Attention Networks to select the most relevant sentences of a text in order to build its summary. We train Siamese Neural Networks on document-summary pairs to determine whether a summary is appropriate for a document. By means of a sentence-level attention mechanism, the most relevant sentences in the document can be identified; hence, once the network is trained, it can be used to generate extractive summaries. The experiments carried out on the CNN/DailyMail summarization corpus show the adequacy of the proposal. In summary, we propose a novel end-to-end neural network that addresses extractive summarization as a binary classification problem and obtains promising results in line with the state-of-the-art on the CNN/DailyMail corpus.
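The core idea in the abstract — score each sentence with a sentence-level attention mechanism, then keep the highest-weighted sentences as the extractive summary — can be sketched in a few lines. This is a minimal pure-Python toy, not the authors' trained Siamese Hierarchical Attention Network: the sentence embeddings and the attention context vector below are made-up values standing in for what the trained encoder would produce.

```python
import math

def softmax(scores):
    """Numerically stable softmax over a list of raw scores."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attention_select(sentence_vecs, context_vec, k):
    """Score each sentence embedding by dot product with an attention
    context vector, normalize the scores with softmax, and return the
    indices of the top-k sentences in document order -- the toy
    equivalent of picking the most-attended sentences as the summary."""
    scores = [sum(s * c for s, c in zip(vec, context_vec))
              for vec in sentence_vecs]
    weights = softmax(scores)
    top = sorted(range(len(weights)), key=lambda i: weights[i],
                 reverse=True)[:k]
    return sorted(top)  # restore document order

# Toy "sentence embeddings" and a hypothetical learned context vector
sents = [[0.1, 0.2], [0.9, 0.8], [0.4, 0.1], [0.7, 0.9]]
context = [1.0, 1.0]
print(attention_select(sents, context, 2))  # indices of the 2 most-attended sentences
```

In the paper's setting, the embeddings would come from a hierarchical (word-then-sentence) encoder and the attention weights would be learned end-to-end from document-summary pairs; this sketch only illustrates how attention weights translate into a sentence selection.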
Pages: 4599-4607 (9 pages)
Related papers (50 records)
  • [1] Summarization of Spanish Talk Shows with Siamese Hierarchical Attention Networks
    Gonzalez, J-A
    Hurtado, L-F.
    Segarra, E.
    Garcia-Granada, F.
    Sanchis, E.
    [J]. APPLIED SCIENCES-BASEL, 2019, 9 (18):
  • [2] Extractive summarization using siamese hierarchical transformer encoders
    Gonzalez, Jose Angel
    Segarra, Encarna
    Garcia-Granada, Fernando
    Sanchis, Emilio
    Hurtado, Lluis-F.
    [J]. JOURNAL OF INTELLIGENT & FUZZY SYSTEMS, 2020, 39 (02) : 2409 - 2419
  • [3] Applying Siamese Hierarchical Attention Neural Networks for multi-document summarization
    Angel Gonzalez, Jose
    Delonca, Julien
    Sanchis, Emilio
    Garcia-Granada, Fernando
    Segarra, Encarna
    [J]. PROCESAMIENTO DEL LENGUAJE NATURAL, 2019, (63): 111 - 118
  • [4] CRHASum: extractive text summarization with contextualized-representation hierarchical-attention summarization network
    Diao, Yufeng
    Lin, Hongfei
    Yang, Liang
    Fan, Xiaochao
    Chu, Yonghe
    Wu, Di
    Zhang, Dongyu
    Xu, Kan
    [J]. NEURAL COMPUTING & APPLICATIONS, 2020, 32 (15): 11491 - 11503
  • [6] Deep hierarchical LSTM networks with attention for video summarization
    Lin, Jingxu
    Zhong, Sheng-hua
    Fares, Ahmed
    [J]. COMPUTERS & ELECTRICAL ENGINEERING, 2022, 97
  • [7] Weakly Supervised Extractive Summarization with Attention
    Zhuang, Yingying
    Lu, Yichao
    Wang, Simi
    [J]. SIGDIAL 2021: 22ND ANNUAL MEETING OF THE SPECIAL INTEREST GROUP ON DISCOURSE AND DIALOGUE (SIGDIAL 2021), 2021, : 520 - 529
  • [8] Multi-granularity heterogeneous graph attention networks for extractive document summarization
    Zhao, Yu
    Wang, Leilei
    Wang, Cui
    Du, Huaming
    Wei, Shaopeng
    Feng, Huali
    Yu, Zongjian
    Li, Qing
    [J]. NEURAL NETWORKS, 2022, 155 : 340 - 347
  • [9] Extractive-Abstractive Summarization of Judgment Documents Using Multiple Attention Networks
    Gao, Yan
    Liu, Zhengtao
    Li, Juan
    Guo, Fan
    Xiao, Fei
    [J]. LOGIC AND ARGUMENTATION, CLAR 2021, 2021, 13040 : 486 - 494
  • [10] Extractive Document Summarization Based on Hierarchical GRU
    Zhang, Yong
    Liao, Jinzhi
    Tang, Jiuyang
    Xiao, Weidong
    Wang, Yuheng
    [J]. 2018 INTERNATIONAL CONFERENCE ON ROBOTS & INTELLIGENT SYSTEM (ICRIS 2018), 2018, : 341 - 346