Text summarization based on multi-head self-attention mechanism and pointer network

Cited by: 10
Authors
Qiu, Dong [1 ,2 ]
Yang, Bing [1 ]
Affiliations
[1] Chongqing Univ Posts & Telecommun, Sch Comp Sci & Technol, Chongqing 400065, Peoples R China
[2] Chongqing Univ Posts & Telecommun, Coll Sci, Chongqing 400065, Peoples R China
Keywords
Automatic text summarization; Pointer generation network; Multi-head self-attention mechanism
DOI
10.1007/s40747-021-00527-2
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Existing text summarization methods rely mainly on the mapping between manually labeled reference summaries and the original text for feature extraction, often ignoring the internal structure and semantic features of the source document. As a result, the summaries they produce suffer from grammatical errors and semantic deviation from the original text. This paper aims to strengthen the model's attention to the inherent features of the source text so that it can more accurately capture the document's grammatical structure and semantics. To this end, it proposes a model that combines a multi-head self-attention mechanism with a soft attention mechanism. An improved multi-head self-attention mechanism introduced in the encoding stage allows correct syntactic and semantic information to receive higher weight during training, making the generated summaries more coherent and accurate. In addition, a pointer network is adopted and the coverage mechanism is improved to address out-of-vocabulary words and repetition when generating summaries. The proposed model is validated on the CNN/DailyMail dataset and evaluated with the ROUGE metric. Experimental results show that it improves the quality of the generated summaries compared with other models.
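For readers who want a concrete picture of how the described components fit together, the sketch below combines a multi-head self-attention encoding pass with a pointer-generator mixing step and a simple coverage update. It is a minimal PyTorch illustration under assumed shapes and module names (e.g. PointerSummarizerSketch, p_gen_proj), not the authors' implementation, and it omits the decoder, training loop, and the paper's specific improvements to the attention and coverage mechanisms.

```python
# Minimal sketch (hypothetical names/shapes, not the paper's code): multi-head
# self-attention encoding + pointer-generator mixing with a coverage vector.
import torch
import torch.nn as nn

class PointerSummarizerSketch(nn.Module):
    def __init__(self, vocab_size, d_model=256, n_heads=8):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        # Multi-head self-attention over the source tokens (encoding stage).
        self.self_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.vocab_proj = nn.Linear(d_model, vocab_size)   # generation distribution
        self.p_gen_proj = nn.Linear(2 * d_model, 1)        # generate-vs-copy gate

    def forward(self, src_ids, dec_state, coverage):
        # src_ids: (B, T) source token ids; dec_state: (B, d) current decoder state
        # (supplied externally in this sketch); coverage: (B, T) accumulated attention.
        enc = self.embed(src_ids)                                      # (B, T, d)
        enc, _ = self.self_attn(enc, enc, enc)                         # self-attention encoding
        # Soft (dot-product) attention of the decoder state over encoder outputs;
        # subtracting coverage is a crude repetition penalty, simplified from the
        # coverage mechanism used in pointer-generator models.
        scores = torch.bmm(enc, dec_state.unsqueeze(-1)).squeeze(-1)   # (B, T)
        attn = torch.softmax(scores - coverage, dim=-1)                # (B, T)
        context = torch.bmm(attn.unsqueeze(1), enc).squeeze(1)         # (B, d)
        p_gen = torch.sigmoid(self.p_gen_proj(torch.cat([context, dec_state], -1)))
        p_vocab = torch.softmax(self.vocab_proj(context), dim=-1)      # (B, V)
        # Copy distribution: scatter attention weights onto source token ids,
        # which lets the model emit out-of-vocabulary source words by copying.
        p_copy = torch.zeros_like(p_vocab).scatter_add(1, src_ids, attn)
        p_final = p_gen * p_vocab + (1 - p_gen) * p_copy               # mixed output distribution
        return p_final, coverage + attn                                # updated coverage
```

In the standard pointer-generator formulation the coverage vector also feeds into the attention computation and an additional coverage loss penalizes repeatedly attending to the same positions; both are omitted here for brevity.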
Pages: 555-567
Number of pages: 13
Related papers
50 records in total
  • [31] Text Sentiment Classification Based on BERT Embedding and Sliced Multi-Head Self-Attention Bi-GRU
    Zhang, Xiangsen
    Wu, Zhongqiang
    Liu, Ke
    Zhao, Zengshun
    Wang, Jinhao
    Wu, Chengqin
    [J]. SENSORS, 2023, 23 (03)
  • [32] Lip Recognition Based on Bi-GRU with Multi-Head Self-Attention
    Ni, Ran
    Jiang, Haiyang
    Zhou, Lu
    Lu, Yuanyao
    [J]. ARTIFICIAL INTELLIGENCE APPLICATIONS AND INNOVATIONS, PT III, AIAI 2024, 2024, 713 : 99 - 110
  • [33] Multi-modal multi-head self-attention for medical VQA
    Joshi, Vasudha
    Mitra, Pabitra
    Bose, Supratik
    [J]. MULTIMEDIA TOOLS AND APPLICATIONS, 2024, 83 : 42585 - 42608
  • [34] Neural Linguistic Steganalysis via Multi-Head Self-Attention
    Jiao, Sai-Mei
    Wang, Hai-feng
    Zhang, Kun
    Hu, Ya-qi
    [J]. JOURNAL OF ELECTRICAL AND COMPUTER ENGINEERING, 2021, 2021
  • [35] Personalized News Recommendation with CNN and Multi-Head Self-Attention
    Li, Aibin
    He, Tingnian
    Guo, Yi
    Li, Zhuoran
    Rong, Yixuan
    Liu, Guoqi
    [J]. 2022 IEEE 13TH ANNUAL UBIQUITOUS COMPUTING, ELECTRONICS & MOBILE COMMUNICATION CONFERENCE (UEMCON), 2022, : 102 - 108
  • [36] Multi-head Self-attention Recommendation Model based on Feature Interaction Enhancement
    Yin, Yunfei
    Huang, Caihao
    Sun, Jingqin
    Huang, Faliang
    [J]. IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS (ICC 2022), 2022, : 1740 - 1745
  • [37] An interactive multi-head self-attention capsule network model for aspect sentiment classification
    She, Lina
    Gong, Hongfang
    Zhang, Siyu
    [J]. JOURNAL OF SUPERCOMPUTING, 2024, 80 (07): 9327 - 9352
  • [39] GlobalMind: Global multi-head interactive self-attention network for hyperspectral change detection
    Hu, Meiqi
    Wu, Chen
    Zhang, Liangpei
    [J]. ISPRS JOURNAL OF PHOTOGRAMMETRY AND REMOTE SENSING, 2024, 211 : 465 - 483
  • [40] MASPP and MWASP: multi-head self-attention based modules for UNet network in melon spot segmentation
    Tran, Khoa-Dang
    Ho, Trang-Thi
    Huang, Yennun
    Le, Nguyen Quoc Khanh
    Tuan, Le Quoc
    Ho, Van Lam
    [J]. JOURNAL OF FOOD MEASUREMENT AND CHARACTERIZATION, 2024, 18 (5) : 3935 - 3949