Long Text Summarization and Key Information Extraction in a Multi-Task Learning Framework

Cited by: 0
Authors
Lu, Ming [1]
Chen, Rongfa [1]
Institutions
[1] College of Management and Economy, Tianjin University, Tianjin 300072, China
Keywords
Attention mechanisms - Empirical evaluations - Key information extraction - Learning frameworks - Long text summarization - Loss functions - Multi-task learning - Text summarization - Text-based information - Training phase
DOI
10.2478/amns-2024-1659
Abstract
In the context of the rapid advancement of big data and artificial intelligence, there has been an unprecedented surge in text-based information. This proliferation necessitates the development of efficient and accurate techniques for text summarization. This paper addresses this need by articulating the challenges associated with text summarization and key information extraction. We introduce a novel model that integrates multi-task learning with an attention mechanism to enhance the summarization and extraction of long texts. Furthermore, we establish a loss function for the model, calibrated against the discrepancy observed during the training phase. Empirical evaluations were conducted through simulated experiments after pre-processing the data via the proposed extraction model. These evaluations indicate that the model achieves optimal performance when the number of training iterations falls between 55 and 65. When benchmarked against comparative models, our model demonstrates superior performance in extracting long text summaries and key information, as evidenced by the metrics on the Daily Mail dataset (mean scores: 40.19, 16.42, 35.48) and the Gigaword dataset (mean scores: 34.38, 16.21, 31.38). Overall, the model developed in this study proves to be highly effective and practical in extracting long text summaries and key information, thereby significantly enhancing the efficiency of processing textual data. © 2024 Ming Lu et al., published by Sciendo.
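The abstract names two ingredients without giving their formulas: an attention mechanism and a multi-task loss combining the summarization and extraction objectives. The following is a minimal sketch of how such a setup is commonly wired, assuming a standard softmax attention distribution and a simple weighted sum of the two task losses; all function names, weights, and formulas here are illustrative assumptions, not the paper's actual model.

```python
import math

def attention_weights(scores: list[float]) -> list[float]:
    """Softmax over raw relevance scores -> attention distribution over tokens."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]  # subtract max for numerical stability
    total = sum(exps)
    return [e / total for e in exps]

def multi_task_loss(loss_summarization: float, loss_extraction: float,
                    lambda_sum: float = 0.5, lambda_ext: float = 0.5) -> float:
    """Weighted sum of the two task losses, as in standard multi-task training."""
    return lambda_sum * loss_summarization + lambda_ext * loss_extraction
```

In this kind of setup, the weights `lambda_sum` and `lambda_ext` control how much each task drives the shared parameters; the paper's own loss is stated only as being calibrated against the training-phase discrepancy.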
Related Papers
50 results in total
  • [31] Li, Linyu; Xu, Sihan; Liu, Yang; Gao, Ya; Cai, Xiangrui; Wu, Jiarun; Song, Wenli; Liu, Zheli. LiSum: Open Source Software License Summarization with Multi-Task Learning. 2023 38th IEEE/ACM International Conference on Automated Software Engineering (ASE), 2023: 787-799
  • [32] Chen, Fang; Yang, Zhongliang; Huang, Yongfeng. A multi-task learning framework for end-to-end aspect sentiment triplet extraction. Neurocomputing, 2022, 479: 12-21
  • [33] Hu, Jingyue; Tang, Buzhou; Lyu, Nan; He, Yuxin; Xiong, Ying. CMBEE: A constraint-based multi-task learning framework for biomedical event extraction. Journal of Biomedical Informatics, 2024, 150
  • [34] Srijith, P. K.; Shevade, Shirish. Multi-Task Learning Using Shared and Task Specific Information. Neural Information Processing, ICONIP 2012, Pt. III, 2012, 7665: 125-132
  • [35] Bai, Lu; Ong, Yew-Soon; He, Tiantian; Gupta, Abhishek. Multi-task gradient descent for multi-task learning. Memetic Computing, 2020, 12: 355-369
  • [36] Bai, Lu; Ong, Yew-Soon; He, Tiantian; Gupta, Abhishek. Multi-task gradient descent for multi-task learning. Memetic Computing, 2020, 12(04): 355-369
  • [37] Leen, Gayle; Peltonen, Jaakko; Kaski, Samuel. Focused multi-task learning in a Gaussian process framework. Machine Learning, 2012, 89: 157-182
  • [38] Yang, Peipei; Huang, Kaizhu; Liu, Cheng-Lin. A multi-task framework for metric learning with common subspace. Neural Computing & Applications, 2013, 22(7-8): 1337-1347
  • [39] Yang, Peipei; Huang, Kaizhu; Liu, Cheng-Lin. A multi-task framework for metric learning with common subspace. Neural Computing and Applications, 2013, 22: 1337-1347
  • [40] Xu, Jianpeng; Tan, Pang-Ning; Zhou, Jiayu; Luo, Lifeng. Online Multi-Task Learning Framework for Ensemble Forecasting. IEEE Transactions on Knowledge and Data Engineering, 2017, 29(06): 1268-1280