A Survey of Controllable Text Generation Using Transformer-based Pre-trained Language Models

Cited by: 23
Authors
Zhang, Hanqing [1]
Song, Haolin [1]
Li, Shaoyu [1]
Zhou, Ming [2]
Song, Dawei [1]
Affiliations
[1] Beijing Inst Technol, 5 South St, Beijing 100081, Peoples R China
[2] Langboat Technol, 52 Beisihuan West Rd, Beijing 100081, Peoples R China
Keywords
Controllable text generation; pre-trained language models; Transformer; controllability; systematic review
DOI: 10.1145/3617680
CLC number: TP301 [Theory and Methods]
Subject classification code: 081202
Abstract
Controllable Text Generation (CTG) is an emerging area in the field of natural language generation (NLG). It is regarded as crucial for developing advanced text generation technologies that better meet the specific constraints of practical applications. In recent years, methods using large-scale pre-trained language models (PLMs), in particular the widely used Transformer-based PLMs, have become a new paradigm of NLG, allowing generation of more diverse and fluent text. However, due to the limited interpretability of deep neural networks, the controllability of these methods needs to be ensured. To this end, controllable text generation using Transformer-based PLMs has become a rapidly growing yet challenging research hotspot. A diverse range of approaches has emerged in the past three to four years, targeting different CTG tasks that require different types of controlled constraints. In this article, we present a systematic critical review of the common tasks, main approaches, and evaluation methods in this area. Finally, we discuss the challenges the field is facing and put forward promising future directions. To the best of our knowledge, this is the first survey article to summarize state-of-the-art CTG techniques from the perspective of Transformer-based PLMs. We hope it helps researchers and practitioners in related fields quickly track the academic and technological frontier, providing them with a landscape of the area and a roadmap for future research.
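To make the notion of a controlled constraint concrete, the sketch below shows the simplest flavour of controllability with a Transformer-based PLM: prepending a control code to the prompt (in the spirit of CTRL-style conditioning) and decoding with the Hugging Face transformers library. This is only an illustrative sketch, not a method from the surveyed article; the attribute token "[positive]" is a hypothetical label, and an off-the-shelf GPT-2 checkpoint has never seen it, so genuine attribute control would require fine-tuning on attribute-labelled data.

# Illustrative sketch only: control-code-style conditioning with an off-the-shelf
# Transformer PLM. The "[positive]" token is a hypothetical attribute label; a
# vanilla GPT-2 checkpoint was not trained with it, so real control requires
# fine-tuning on attribute-labelled text.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

control_code = "[positive]"               # desired attribute, prepended as plain text
prompt = f"{control_code} The movie was"

inputs = tokenizer(prompt, return_tensors="pt")
output_ids = model.generate(
    **inputs,
    max_new_tokens=30,
    do_sample=True,                       # nucleus sampling for more diverse text
    top_p=0.9,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no pad token by default
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))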
Pages: 37