On Learning Text Style Transfer with Direct Rewards

Cited by: 0
Authors
Liu, Yixin [1]
Neubig, Graham [1]
Wieting, John [1]
Institutions
[1] Carnegie Mellon Univ, Pittsburgh, PA USA
Keywords
DOI
Not available
CLC classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104; 0812; 0835; 1405;
摘要
In most cases, the lack of parallel corpora makes it impossible to directly train supervised models for the text style transfer task. In this paper, we explore training algorithms that instead optimize reward functions that explicitly consider different aspects of the style-transferred outputs. In particular, we leverage semantic similarity metrics originally used for fine-tuning neural machine translation models to explicitly assess the preservation of content between system outputs and input texts. We also investigate the potential weaknesses of the existing automatic metrics and propose efficient strategies for using these metrics for training. The experimental results show that our model provides significant gains in both automatic and human evaluation over strong baselines, indicating the effectiveness of our proposed methods and training strategies.
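The central idea in the abstract — scoring a style-transferred output on two axes, style strength and content preservation, and blending them into a single training reward — can be sketched as follows. This is a toy illustration, not the paper's method: the paper uses learned semantic similarity metrics, whereas here `lexical_similarity` is a simple token-overlap stand-in, and `combined_reward`, `alpha`, and the example scores are all hypothetical names and values.

```python
def lexical_similarity(a: str, b: str) -> float:
    """Toy stand-in for a learned semantic similarity metric:
    token-level Jaccard overlap between input and output."""
    ta, tb = set(a.lower().split()), set(b.lower().split())
    return len(ta & tb) / len(ta | tb) if ta | tb else 0.0

def combined_reward(style_score: float, content_score: float,
                    alpha: float = 0.5) -> float:
    """Weighted blend of style-transfer strength and content
    preservation; this scalar would drive policy-gradient updates."""
    return alpha * style_score + (1.0 - alpha) * content_score

# Ranking two candidate rewrites of a positive-sentiment input,
# assuming a (hypothetical) style classifier gave both a score of 1.0:
src = "the food was great"
good = "the food was terrible"   # flips style, keeps content
bad = "i hate mondays"           # flips style, loses content
r_good = combined_reward(1.0, lexical_similarity(src, good))
r_bad = combined_reward(1.0, lexical_similarity(src, bad))
print(r_good > r_bad)  # the content-preserving rewrite earns more reward
```

The design point the blend illustrates: with only a style reward, a model can earn full credit by emitting any strongly-styled sentence, so a content term is needed to anchor the output to the input.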
Pages: 4262 - 4273
Page count: 12
Related papers
50 in total
  • [1] Unsupervised Text Style Transfer Through Differentiable Back Translation and Rewards
    Bandyopadhyay, Dibyanayan
    Ekbal, Asif
    [J]. ADVANCES IN KNOWLEDGE DISCOVERY AND DATA MINING, PAKDD 2023, PT IV, 2023, 13938 : 210 - 221
  • [2] Deep Learning for Text Style Transfer: A Survey
    Jin, Di
    Jin, Zhijing
    Hu, Zhiting
    Vechtomova, Olga
    Mihalcea, Rada
    [J]. COMPUTATIONAL LINGUISTICS, 2022, 48 (01) : 155 - 205
  • [3] Transductive Learning for Unsupervised Text Style Transfer
    Xiao, Fei
    Pang, Liang
    Lan, Yanyan
    Wang, Yan
    Shen, Huawei
    Cheng, Xueqi
    [J]. 2021 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2021), 2021, : 2510 - 2521
  • [4] A Review of Text Style Transfer Using Deep Learning
    Toshevska, Martina
    Gievska, Sonja
    [J]. IEEE Transactions on Artificial Intelligence, 2022, 3 (05): : 669 - 684
  • [5] Memory-enhanced text style transfer with dynamic style learning and calibration
    Lin, Fuqiang
    Song, Yiping
    Tian, Zhiliang
    Chen, Wangqun
    Dong, Diwen
    Liu, Bo
    [J]. SCIENCE CHINA-INFORMATION SCIENCES, 2024, 67 (04) : 181 - 196
  • [6] Text Style Transfer via Learning Style Instance Supported Latent Space
    Yi, Xiaoyuan
    Liu, Zhenghao
    Li, Wenhao
    Sun, Maosong
    [J]. PROCEEDINGS OF THE TWENTY-NINTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2020, : 3801 - 3807
  • [7] A Dual Reinforcement Learning Framework for Unsupervised Text Style Transfer
    Luo, Fuli
    Li, Peng
    Zhou, Jie
    Yang, Pengcheng
    Chang, Baobao
    Sun, Xu
    Sui, Zhifang
    [J]. PROCEEDINGS OF THE TWENTY-EIGHTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2019, : 5116 - 5122
  • [8] Collaborative Learning of Bidirectional Decoders for Unsupervised Text Style Transfer
    Ma, Yun
    Chen, Yangbin
    Mao, Xudong
    Li, Qing
    [J]. 2021 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2021), 2021, : 9250 - 9266