A Multi-Task Learning Framework for Abstractive Text Summarization

Cited by: 0
Authors
Lu, Yao [1 ,3 ]
Liu, Linqing [3 ]
Jiang, Zhile
Yang, Min [4 ]
Goebel, Randy [1 ,2 ]
Affiliations
[1] Univ Alberta, Alberta Machine Intelligence Inst, Edmonton, AB, Canada
[2] Univ Alberta, Dept Comp Sci, Edmonton, AB, Canada
[3] Univ Waterloo, David R Cheriton Sch Comp Sci, Waterloo, ON, Canada
[4] Chinese Acad Sci, Shenzhen Inst Adv Technol, Shenzhen, Peoples R China
Keywords
DOI
Not available
Chinese Library Classification
TP18 [Theory of Artificial Intelligence];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
We propose a Multi-task learning approach for Abstractive Text Summarization (MATS), motivated by the observation that humans perform such tasks with little difficulty because they draw on capabilities from multiple domains. Specifically, MATS consists of three components: (i) a text categorization model that learns rich category-specific text representations using a bi-LSTM encoder; (ii) a syntax labeling model that improves the syntax-aware LSTM decoder; and (iii) an abstractive text summarization model that shares its encoder and decoder with the text categorization and syntax labeling tasks, respectively. In particular, the summarization model benefits significantly from the additional text categorization and syntax knowledge. Our experimental results show that MATS outperforms competitive baselines.
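The parameter-sharing scheme the abstract describes can be sketched in a few lines. Everything below is illustrative: the class names, the toy encode/decode logic, and the loss weights are assumptions for exposition, standing in for the paper's bi-LSTM encoder and syntax-aware LSTM decoder rather than reproducing them.

```python
# Hypothetical sketch of MATS-style module sharing (names and logic are
# illustrative stand-ins, not the authors' implementation).

class Encoder:
    """Shared encoder: produces a 'representation' of the input tokens."""
    def encode(self, tokens):
        return list(reversed(tokens))  # toy representation

class Decoder:
    """Shared decoder: emits a short 'summary' from encoder states."""
    def decode(self, states):
        return states[:2]  # toy: keep the first two states

class TextCategorizer:
    """Auxiliary task (i): shares the encoder with the summarizer."""
    def __init__(self, encoder):
        self.encoder = encoder

class SyntaxLabeler:
    """Auxiliary task (ii): shares the decoder with the summarizer."""
    def __init__(self, decoder):
        self.decoder = decoder

class Summarizer:
    """Main task (iii): summarization built on the shared modules."""
    def __init__(self, encoder, decoder):
        self.encoder, self.decoder = encoder, decoder
    def summarize(self, tokens):
        return self.decoder.decode(self.encoder.encode(tokens))

def multitask_loss(l_sum, l_cat, l_syn, w_cat=0.1, w_syn=0.1):
    """Joint objective: weighted sum of the three task losses.
    The weights here are illustrative, not taken from the paper."""
    return l_sum + w_cat * l_cat + w_syn * l_syn

enc, dec = Encoder(), Decoder()
categorizer = TextCategorizer(enc)
labeler = SyntaxLabeler(dec)
summarizer = Summarizer(enc, dec)

# The auxiliary tasks and the summarizer hold the *same* module objects,
# so in real training, gradient updates from one task would affect the others.
assert summarizer.encoder is categorizer.encoder
assert summarizer.decoder is labeler.decoder
print(summarizer.summarize(["a", "b", "c"]))  # -> ['c', 'b']
```

The point of the sketch is that sharing is object identity: the categorizer and summarizer reference one encoder, the labeler and summarizer one decoder, and the joint loss couples their training signals.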
Pages: 9987 - 9988
Page count: 2
Related Papers
50 items in total
  • [1] Multi-Task Learning for Abstractive and Extractive Summarization
    Chen, Yangbin
    Ma, Yun
    Mao, Xudong
    Li, Qing
    [J]. Data Science and Engineering, 2019, 4 (1) : 14 - 23
  • [2] Multi-task learning for abstractive text summarization with key information guide network
    Xu, Weiran
    Li, Chenliang
    Lee, Minghao
    Zhang, Chi
    [J]. EURASIP Journal on Advances in Signal Processing, 2020, 2020 (1)
  • [3] Multi-Task Learning for Cross-Lingual Abstractive Summarization
    Takase, Sho
    Okazaki, Naoaki
    [J]. LREC 2022: Thirteenth International Conference on Language Resources and Evaluation, 2022 : 3008 - 3016
  • [4] Long Text Summarization and Key Information Extraction in a Multi-Task Learning Framework
    Lu, Ming
    Chen, Rongfa
    [J]. Applied Mathematics and Nonlinear Sciences, 2024, 9 (1)
  • [5] Plausibility-promoting generative adversarial network for abstractive text summarization with multi-task constraint
    Yang, Min
    Wang, Xintong
    Lu, Yao
    Lv, Jianming
    Shen, Ying
    Li, Chengming
    [J]. Information Sciences, 2020, 521 : 46 - 61
  • [6] SummaReranker: A Multi-Task Mixture-of-Experts Re-ranking Framework for Abstractive Summarization
    Ravaut, Mathieu
    Joty, Shafiq
    Chen, Nancy F.
    [J]. Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (ACL 2022), Vol. 1 (Long Papers), 2022 : 4504 - 4524
  • [7] Optimization of the Abstract Text Summarization Model Based on Multi-Task Learning
    Yao, Ben
    Ding, Gejian
    [J]. Proceedings of 2023 7th International Conference on Electronic Information Technology and Computer Engineering, EITCE 2023, 2023 : 424 - 428