A novel abstractive summarization model based on topic-aware and contrastive learning

Cited: 0
Authors
Tang, Huanling [1 ,3 ]
Li, Ruiquan [2 ]
Duan, Wenhao [2 ]
Dou, Quansheng [1 ,3 ]
Lu, Mingyu [4 ]
Affiliations
[1] Shandong Technol & Business Univ, Sch Comp Sci & Technol, Yantai 264005, Shandong, Peoples R China
[2] Shandong Technol & Business Univ, Sch Informat & Elect Engn, Yantai 264005, Shandong, Peoples R China
[3] Shandong Coll & Univ Future Intelligent Comp, Coinnovat Ctr, Yantai 264005, Shandong, Peoples R China
[4] Dalian Maritime Univ, Informat Sci & Technol Coll, Dalian 116026, Liaoning, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Abstractive summarization; Neural topic model; Contrastive learning; Seq2Seq model;
DOI
10.1007/s13042-024-02263-8
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
Most abstractive summarization models are built on the Sequence-to-Sequence (Seq2Seq) architecture. These models capture syntactic and contextual information between words, but Seq2Seq-based summarization models tend to overlook global semantic information. Moreover, there is an inconsistency between their objective function and the evaluation metrics. To address these limitations, this paper proposes a novel model named ASTCL. It integrates a neural topic model into the Seq2Seq framework to capture the text's global semantic information and guide summary generation. Additionally, it incorporates contrastive learning to mitigate the discrepancy between the objective loss and the evaluation metrics by scoring multiple candidate summaries. Experimental results on the CNN/DM, XSum, and NYT datasets demonstrate that ASTCL outperforms other generic models on the summarization task.
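The candidate-scoring idea in the abstract can be illustrated with a minimal sketch: given several candidate summaries pre-ranked by an evaluation metric such as ROUGE (best first), a contrastive ranking loss pushes the model's score for a better candidate above that of a worse one by a rank-scaled margin. This is an assumed, generic formulation (the function name, margin scheme, and loss form are illustrative), not the authors' exact ASTCL implementation.

```python
def contrastive_ranking_loss(scores, margin=0.01):
    """Hedged sketch of a pairwise contrastive ranking loss.

    `scores` are model-assigned scores for candidate summaries that are
    assumed pre-sorted by a reference metric (e.g. ROUGE), best first.
    For every pair (i, j) with i ranked above j, the better candidate's
    score must exceed the worse one's by at least margin * (j - i);
    any shortfall contributes hinge loss.
    """
    loss = 0.0
    n = len(scores)
    for i in range(n):
        for j in range(i + 1, n):
            # Required gap grows with the rank distance (j - i).
            loss += max(0.0, scores[j] - scores[i] + margin * (j - i))
    return loss
```

With scores already ordered and well separated, the loss is zero; a misordered pair incurs a positive penalty, nudging the model's scoring toward the metric's ranking, which is how contrastive training can reduce the gap between the training objective and the evaluation metric.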
Pages: 5563-5577
Page count: 15
Related papers
50 records total
  • [31] What is This Article About? Extreme Summarization with Topic-Aware Convolutional Neural Networks
    Narayan, Shashi
    Cohen, Shay B.
    Lapata, Mirella
    JOURNAL OF ARTIFICIAL INTELLIGENCE RESEARCH, 2019, 66 : 243 - 278
  • [32] A topic-aware classifier based on a hybrid quantum-classical model
    Metawei, Maha A.
    Taher, Mohamed
    ElDeeb, Hesham
    Nassar, Salwa M.
    NEURAL COMPUTING AND APPLICATIONS, 2023, 35 : 18803 - 18812
  • [33] CoLRP: A Contrastive Learning Abstractive Text Summarization Method with ROUGE Penalty
    Tan, Caidong
    Sun, Xiao
    2023 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, IJCNN, 2023,
  • [34] Topic-aware latent models for representation learning on networks
    Celikkanat, Abdulkadir
    Malliaros, Fragkiskos D.
    PATTERN RECOGNITION LETTERS, 2021, 144 : 89 - 96
  • [35] TAAM: Topic-aware abstractive arabic text summarisation using deep recurrent neural networks
    Alahmadi, Dimah
    Wali, Arwa
    Alzahrani, Sarah
    JOURNAL OF KING SAUD UNIVERSITY-COMPUTER AND INFORMATION SCIENCES, 2022, 34 (06) : 2651 - 2665
  • [36] A Novel Deep Learning Attention Based Sequence to Sequence Model for Automatic Abstractive Text Summarization
    Abd Algani, Y. M.
    INTERNATIONAL JOURNAL OF INFORMATION TECHNOLOGY, 2024, 16 (06) : 3597 - 3603
  • [37] TopicVAE: Topic-aware Disentanglement Representation Learning for Enhanced Recommendation
    Guo, Zhiqiang
    Li, Guohui
    Li, Jianjun
    Chen, Huaicong
    PROCEEDINGS OF THE 30TH ACM INTERNATIONAL CONFERENCE ON MULTIMEDIA, MM 2022, 2022,
  • [38] Deep Reinforcement Learning-Based Approach to Tackle Topic-Aware Influence Maximization
    Tian, Shan
    Mo, Songsong
    Wang, Liwei
    Peng, Zhiyong
    DATA SCIENCE AND ENGINEERING, 2020, 5 (01) : 1 - 11
  • [40] T-REX: A Topic-Aware Relation Extraction Model
    Jung, Woohwan
    Shim, Kyuseok
    CIKM '20: PROCEEDINGS OF THE 29TH ACM INTERNATIONAL CONFERENCE ON INFORMATION & KNOWLEDGE MANAGEMENT, 2020, : 2073 - 2076