Latent Variables Improve Hard-Constrained Controllable Text Generation on Weak Correlation

Cited by: 0
Authors
Zhu, Weigang [1 ,2 ]
Liu, Xiaoming [1 ,2 ,4 ]
Yang, Guan [1 ,2 ]
Liu, Jie [3 ,4 ]
Qi, Haotian [1 ]
Affiliations
[1] Zhongyuan Univ Technol, Sch Comp, Zhengzhou 451191, Henan, Peoples R China
[2] Zhengzhou Key Lab Text Proc & Image Understanding, Zhengzhou 450007, Henan, Peoples R China
[3] North China Univ Technol, Sch Informat Sci, Beijing 100144, Peoples R China
[4] Res Ctr Language Intelligence China, Beijing 100089, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Latent variables; controllable text generation; weak correlation; hard constraint;
DOI
10.14569/IJACSA.2024.0150639
Chinese Library Classification
TP301 [Theory and Methods];
Discipline Code
081202;
Abstract
Hard-constrained controllable text generation aims to force generated text to contain a specified constraint vocabulary, meeting the demands of more specialized application scenarios than soft-constrained controllable text generation. However, when the constraint set contains multiple weakly correlated constraints, soft-constrained controllable models aggravate the constraint-loss phenomenon, while hard-constrained controllable models suffer significant quality degradation. To address this problem, a method for hard-constrained controllable text generation based on latent variables, designed to improve performance under weak correlation, is proposed. The method uses latent variables to capture both global and local constraint-correlation information, guiding the language model to generate hard-constrained controllable text at the macro and micro levels, respectively. The latent variables not only reveal the latent correlations between constraints, but also help the model satisfy these constraints precisely while maintaining semantic coherence and logical correctness. Experimental results show that, under weakly correlated hard constraints, the proposed method generates higher-quality text than currently established strong baseline models.
Pages: 365-374 (10 pages)