Harnessing Pre-Trained Neural Networks with Rules for Formality Style Transfer

Cited by: 0
Authors:
Wang, Yunli [1 ]
Wu, Yu [2 ]
Mou, Lili [3 ]
Li, Zhoujun [1 ]
Chao, Wenhan [1 ]
Affiliations:
[1] Beihang Univ, State Key Lab Software Dev Environm, Beijing, Peoples R China
[2] Microsoft Res, Beijing, Peoples R China
[3] Univ Alberta, Dept Comp Sci, Edmonton, AB, Canada
Funding: National Natural Science Foundation of China
Keywords: (none listed)
DOI: (none available)
CLC Classification: TP18 [Artificial Intelligence Theory]
Discipline Codes: 081104; 0812; 0835; 1405
Abstract:
Formality text style transfer plays an important role in various NLP applications, such as non-native speaker assistants and child education. Early studies normalized informal sentences with rules, before statistical and neural models became the prevailing methods in the field. While a rule-based system is still a common preprocessing step for formality style transfer in the neural era, it can introduce noise when the rules are applied naively, e.g., as plain data preprocessing. To mitigate this problem, we study how to harness rules within a state-of-the-art neural network that is typically pretrained on massive corpora. We propose three fine-tuning methods in this paper and achieve a new state of the art on benchmark datasets.
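As a rough sketch of the idea described in the abstract (not the authors' actual implementation), the snippet below shows one way rule outputs can be handed to a pretrained model as an additional signal rather than as a hard preprocessing step: the original sentence and its rule-normalized variant are concatenated into a single input sequence, leaving the network free to ignore noisy or over-eager rule rewrites. The rule patterns, the [SEP] separator, and the helper names are all illustrative assumptions.

```python
import re

# Hypothetical surface-normalization rules for illustration only;
# the paper's actual rule set is not reproduced here.
RULES = [
    (re.compile(r"\bu\b", re.IGNORECASE), "you"),
    (re.compile(r"\br\b", re.IGNORECASE), "are"),
    (re.compile(r"\bgonna\b", re.IGNORECASE), "going to"),
    (re.compile(r"[!?]{2,}"), "!"),  # collapse runs of punctuation
]

def apply_rules(sentence: str) -> str:
    """Normalize an informal sentence with simple regex rules."""
    for pattern, replacement in RULES:
        sentence = pattern.sub(replacement, sentence)
    return sentence

def build_model_input(informal: str, sep: str = " [SEP] ") -> str:
    # Give the pretrained network both the original sentence and the
    # rule-normalized variant, so fine-tuning can learn to correct
    # noisy rule rewrites instead of inheriting them blindly.
    return informal + sep + apply_rules(informal)

if __name__ == "__main__":
    print(build_model_input("u r gonna love this!!!"))
    # u r gonna love this!!! [SEP] you are going to love this!
```

Under this reading, a naive pipeline would train only on apply_rules(informal), baking any rule errors into the data; concatenation keeps the original sentence available to the model as a fallback.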
Pages: 3573-3578
Page count: 6
Related Papers (50 total)
  • [1] Thank you BART! Rewarding Pre-Trained Models Improves Formality Style Transfer
    Lai, Huiyuan
    Toral, Antonio
    Nissim, Malvina
    ACL-IJCNLP 2021: THE 59TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS AND THE 11TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING, VOL 2, 2021, : 484 - 494
  • [2] Side-Scan Sonar Image Classification Based on Style Transfer and Pre-Trained Convolutional Neural Networks
    Ge, Qiang
    Ruan, Fengxue
    Qiao, Baojun
    Zhang, Qian
    Zuo, Xianyu
    Dang, Lanxue
    ELECTRONICS, 2021, 10 (15)
  • [3] Transfer Learning based Performance Comparison of the Pre-Trained Deep Neural Networks
    Kumar, Jayapalan Senthil
    Anuar, Syahid
    Hassan, Noor Hafizah
    INTERNATIONAL JOURNAL OF ADVANCED COMPUTER SCIENCE AND APPLICATIONS, 2022, 13 (01) : 797 - 805
  • [4] Teaming Up Pre-Trained Deep Neural Networks
    Deabes, Wael
    Abdel-Hakim, Alaa E.
    2018 INTERNATIONAL CONFERENCE ON SIGNAL PROCESSING AND INFORMATION SECURITY (ICSPIS), 2018, : 73 - 76
  • [5] Epistemic Uncertainty Quantification For Pre-trained Neural Networks
    Wang, Hanjing
    Ji, Qiang
    2024 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2024, : 11052 - 11061
  • [6] Transfer learning with pre-trained deep convolutional neural networks for serous cell classification
    Baykal, Elif
    Dogan, Hulya
    Ercin, Mustafa Emre
    Ersoz, Safak
    Ekinci, Murat
    MULTIMEDIA TOOLS AND APPLICATIONS, 2020, 79 (21-22) : 15593 - 15611
  • [7] Pre-trained Convolutional Neural Networks for the Lung Sounds Classification
    Vaityshyn, Valentyn
    Porieva, Hanna
    Makarenkova, Anastasiia
    2019 IEEE 39TH INTERNATIONAL CONFERENCE ON ELECTRONICS AND NANOTECHNOLOGY (ELNANO), 2019, : 522 - 525
  • [8] ArtBank: Artistic Style Transfer with Pre-trained Diffusion Model and Implicit Style Prompt Bank
    Zhang, Zhanjie
    Zhang, Quanwei
    Xing, Wei
    Li, Guangyuan
    Zhao, Lei
    Sun, Jiakai
    Lan, Zehua
    Luan, Junsheng
    Huang, Yiling
    Lin, Huaizhong
    THIRTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 38 NO 7, 2024, : 7396 - 7404
  • [9] Exponential Discretization of Weights of Neural Network Connections in Pre-Trained Neural Networks
    Malsagov, M. Yu
    Khayrov, E. M.
    Pushkareva, M. M.
    Karandashev, I. M.
    OPTICAL MEMORY AND NEURAL NETWORKS, 2019, 28 (04) : 262 - 270