Text Style Transfer via Learning Style Instance Supported Latent Space

Cited by: 0
Authors
Yi, Xiaoyuan [1 ,2 ,3 ]
Liu, Zhenghao [1 ,2 ,3 ]
Li, Wenhao [1 ]
Sun, Maosong [1 ,2 ,4 ]
Affiliations
[1] Tsinghua Univ, Dept Comp Sci & Technol, Beijing, Peoples R China
[2] Tsinghua Univ, Inst Artificial Intelligence, Beijing, Peoples R China
[3] Tsinghua Univ, State Key Lab Intelligent Technol & Syst, Beijing, Peoples R China
[4] Jiangsu Normal Univ, Jiangsu Collaborat Innovat Ctr Language Abil, Xuzhou, Jiangsu, Peoples R China
Funding
National Research Foundation, Singapore; National Social Science Fund of China;
Keywords
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Text style transfer aims to alter the style of a sentence while keeping its main content unchanged. Due to the lack of parallel corpora, most recent work focuses on unsupervised methods and has achieved noticeable progress. Nonetheless, the intractability of completely disentangling content from style in text leads to a trade-off between content preservation and style transfer accuracy. To address this problem, we propose a style-instance-supported method, StyIns. Instead of representing styles with embeddings or latent variables learned from single sentences, our model leverages the generative flow technique to extract underlying stylistic properties from multiple instances of each style, which form a more discriminative and expressive latent style space. By combining such a space with an attention-based structure, our model can better maintain the content while achieving high transfer accuracy. Furthermore, the proposed method can be flexibly extended to semi-supervised learning so as to utilize the limited paired data that is available. Experiments on three transfer tasks, sentiment modification, formality rephrasing, and poeticness generation, show that StyIns obtains a better balance between content and style, outperforming several recent baselines.
Pages: 3801-3807
Page count: 7