Text summarization based on multi-head self-attention mechanism and pointer network

Times cited: 10
Authors
Qiu, Dong [1 ,2 ]
Yang, Bing [1 ]
Affiliations
[1] Chongqing Univ Posts & Telecommun, Sch Comp Sci & Technol, Chongqing 400065, Peoples R China
[2] Chongqing Univ Posts & Telecommun, Coll Sci, Chongqing 400065, Peoples R China
Keywords
Automatic text summarization; Pointer generation network; Multi-head self-attention mechanism
DOI
10.1007/s40747-021-00527-2
CLC number
TP18 [Theory of artificial intelligence]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Existing text summarization methods rely mainly on the mapping between manually labeled reference summaries and the original text for feature extraction, and often ignore the internal structure and semantic features of the source document. As a result, summaries produced by existing models suffer from grammatical errors and semantic deviation from the original text. This paper aims to strengthen the model's attention to the inherent features of the source text so that it can more accurately capture the document's grammatical structure and semantics. To this end, we propose a model that combines a multi-head self-attention mechanism with a soft attention mechanism. An improved multi-head self-attention mechanism introduced at the encoding stage allows the model to assign higher weights to correct syntactic and semantic information, making the generated summaries more coherent and accurate. In addition, a pointer network is adopted and its coverage mechanism is improved to address out-of-vocabulary words and repetition during summary generation. We validate the proposed model on the CNN/DailyMail dataset and evaluate it with ROUGE metrics. Experimental results show that the model improves the quality of the generated summaries compared with other models.
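The abstract names two core components: multi-head self-attention at the encoding stage, and a pointer network with an improved coverage mechanism. The sketch below illustrates both in PyTorch, following the standard formulations (Vaswani et al. for scaled dot-product attention; See et al. for the pointer-generator and coverage) that the paper builds on. Function names, tensor shapes, and the exact coverage penalty here are illustrative assumptions, not the authors' implementation, whose improved variants may differ.

```python
import torch
import torch.nn.functional as F

def multi_head_self_attention(x, wq, wk, wv, wo, num_heads):
    # Standard multi-head self-attention (sketch, not the paper's variant).
    # x: (batch, seq_len, d_model); wq/wk/wv/wo: (d_model, d_model).
    b, t, d = x.shape
    dk = d // num_heads
    # Project inputs and split into heads: (batch, heads, seq_len, d_k).
    q = (x @ wq).view(b, t, num_heads, dk).transpose(1, 2)
    k = (x @ wk).view(b, t, num_heads, dk).transpose(1, 2)
    v = (x @ wv).view(b, t, num_heads, dk).transpose(1, 2)
    # Scaled dot-product attention within each head.
    scores = q @ k.transpose(-2, -1) / dk ** 0.5
    attn = F.softmax(scores, dim=-1)
    # Concatenate heads and apply the output projection.
    out = (attn @ v).transpose(1, 2).reshape(b, t, d)
    return out @ wo

def pointer_generator_step(p_vocab, attn_dist, p_gen, src_ids, coverage):
    # One decoding step of a pointer-generator with coverage (See et al., 2017).
    # p_vocab:   (batch, extended_vocab) generator distribution
    # attn_dist: (batch, src_len)        attention over source tokens
    # p_gen:     (batch, 1)              generation probability in [0, 1]
    # src_ids:   (batch, src_len)        source token ids in the extended vocab
    # coverage:  (batch, src_len)        sum of past attention distributions
    # Mix generating from the vocabulary with copying from the source.
    final_dist = (p_gen * p_vocab).scatter_add(1, src_ids,
                                               (1 - p_gen) * attn_dist)
    # Coverage penalty: overlap between current attention and tokens already
    # covered discourages attending to (and so repeating) the same spans.
    cov_loss = torch.minimum(attn_dist, coverage).sum(dim=1)
    new_coverage = coverage + attn_dist
    return final_dist, cov_loss, new_coverage

# Toy shape check with random weights.
x = torch.randn(2, 6, 16)
ws = [torch.randn(16, 16) for _ in range(4)]
h = multi_head_self_attention(x, *ws, num_heads=4)  # (2, 6, 16)
```

At each decoding step, p_gen is typically computed from the decoder state and attention context, and the vocabulary is extended with the source document's out-of-vocabulary tokens so that copied words remain addressable; the coverage loss is added to the negative log-likelihood objective to penalize repeated attention.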
Pages: 555-567
Page count: 13
Related papers
50 records in total (first 10 listed below)
  • [1] Text summarization based on multi-head self-attention mechanism and pointer network
    Qiu, Dong
    Yang, Bing
    [J]. Complex & Intelligent Systems, 2022, 8: 555-567
  • [2] MS-Pointer Network: Abstractive Text Summary Based on Multi-Head Self-Attention
    Guo, Qian
    Huang, Jifeng
    Xiong, Naixue
    Wang, Pan
    [J]. IEEE ACCESS, 2019, 7: 138603-138613
  • [3] Multi-label Text Classification Based on BiGRU and Multi-Head Self-Attention Mechanism
    Luo, Tongtong
    Shi, Nan
    Jin, Meilin
    Qin, Aolong
    Tang, Jiacheng
    Wang, Xihan
    Gao, Quanli
    Shao, Lianhe
    [J]. 2024 3RD INTERNATIONAL CONFERENCE ON IMAGE PROCESSING AND MEDIA COMPUTING, ICIPMC 2024, 2024: 204-210
  • [4] Epilepsy detection based on multi-head self-attention mechanism
    Ru, Yandong
    An, Gaoyang
    Wei, Zheng
    Chen, Hongming
    [J]. PLOS ONE, 2024, 19(6)
  • [5] Abstractive Text Summarization with Multi-Head Attention
    Li, Jinpeng
    Zhang, Chuang
    Chen, Xiaojun
    Cao, Yanan
    Liao, Pengcheng
    Zhang, Peng
    [J]. 2019 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2019
  • [6] Arrhythmia classification algorithm based on multi-head self-attention mechanism
    Wang, Yue
    Yang, Guanci
    Li, Shaobo
    Li, Yang
    He, Ling
    Liu, Dan
    [J]. BIOMEDICAL SIGNAL PROCESSING AND CONTROL, 2023, 79
  • [7] Speech enhancement method based on the multi-head self-attention mechanism
    Chang, Xinxu
    Zhang, Yang
    Yang, Lin
    Kou, Jinqiao
    Wang, Xin
    Xu, Dongdong
    [J]. Xi'an Dianzi Keji Daxue Xuebao/Journal of Xidian University, 2020, 47(1): 104-110
  • [8] Multi-head enhanced self-attention network for novelty detection
    Zhang, Yingying
    Gong, Yuxin
    Zhu, Haogang
    Bai, Xiao
    Tang, Wenzhong
    [J]. PATTERN RECOGNITION, 2020, 107
  • [9] Deep Bug Triage Model Based on Multi-head Self-attention Mechanism
    Yu, Xu
    Wan, Fayang
    Tang, Bin
    Zhan, Dingjia
    Peng, Qinglong
    Yu, Miao
    Wang, Zhaozhe
    Cui, Shuang
    [J]. COMPUTER SUPPORTED COOPERATIVE WORK AND SOCIAL COMPUTING, CHINESECSCW 2021, PT II, 2022, 1492: 107-119
  • [10] Adaptive Pruning for Multi-Head Self-Attention
    Messaoud, Walid
    Trabelsi, Rim
    Cabani, Adnane
    Abdelkefi, Fatma
    [J]. ARTIFICIAL INTELLIGENCE AND SOFT COMPUTING, ICAISC 2023, PT II, 2023, 14126: 48-57