CREATER: CTR-driven Advertising Text Generation with Controlled Pre-Training and Contrastive Fine-Tuning

Cited by: 0
Authors
Wei, Penghui [1 ]
Yang, Xuanhua [1 ]
Liu, Shaoguo [1 ]
Wang, Liang [1 ]
Zheng, Bo [1 ]
Affiliations
[1] Alibaba Group, Hangzhou, People's Republic of China
Keywords: (none listed)
DOI: not available
CLC number
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
This paper focuses on automatically generating the text of an ad, with the goal that the generated text captures user interest and achieves a higher click-through rate (CTR). We propose CREATER, a CTR-driven advertising text generation approach that generates ad texts based on high-quality user reviews. To incorporate the CTR objective, our model learns from online A/B test data with contrastive learning, which encourages the model to generate ad texts that obtain higher CTR. To alleviate the low-resource issue, we design a customized self-supervised objective that reduces the gap between pre-training and fine-tuning. Experiments on industrial datasets show that CREATER significantly outperforms current approaches. It has been deployed online on a leading advertising platform and brings an uplift in core online metrics.
Pages: 9-17
Page count: 9
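
The contrastive fine-tuning described in the abstract (learning from online A/B test pairs so that the model prefers the higher-CTR ad text) can be illustrated with a minimal sketch. The code below is an assumption-laden illustration, not the paper's exact objective: it uses a margin ranking loss over length-normalized sequence log-likelihoods, and all names (pairwise_ctr_loss, sequence_log_likelihood) and the toy random tensors are hypothetical stand-ins for a real seq2seq decoder's outputs.

# Hedged sketch of contrastive fine-tuning from A/B-test pairs (assumed formulation).
import torch
import torch.nn.functional as F

def sequence_log_likelihood(logits: torch.Tensor,
                            targets: torch.Tensor,
                            pad_id: int = 0) -> torch.Tensor:
    """Length-normalized log-likelihood of each target sequence.

    logits:  (batch, seq_len, vocab) decoder outputs
    targets: (batch, seq_len) token ids, padded with pad_id
    """
    log_probs = F.log_softmax(logits, dim=-1)
    token_ll = log_probs.gather(-1, targets.unsqueeze(-1)).squeeze(-1)
    mask = (targets != pad_id).float()
    return (token_ll * mask).sum(-1) / mask.sum(-1).clamp(min=1.0)

def pairwise_ctr_loss(winner_logits, winner_ids,
                      loser_logits, loser_ids,
                      margin: float = 1.0, pad_id: int = 0) -> torch.Tensor:
    """Margin ranking loss: the ad text that won the A/B test (higher CTR)
    should receive higher model likelihood than the losing variant."""
    ll_win = sequence_log_likelihood(winner_logits, winner_ids, pad_id)
    ll_lose = sequence_log_likelihood(loser_logits, loser_ids, pad_id)
    return F.relu(margin - (ll_win - ll_lose)).mean()

if __name__ == "__main__":
    torch.manual_seed(0)
    batch, seq_len, vocab = 4, 12, 100
    # Stand-in decoder outputs for the higher- and lower-CTR ad text of each pair.
    winner_logits = torch.randn(batch, seq_len, vocab)
    loser_logits = torch.randn(batch, seq_len, vocab)
    winner_ids = torch.randint(1, vocab, (batch, seq_len))
    loser_ids = torch.randint(1, vocab, (batch, seq_len))
    loss = pairwise_ctr_loss(winner_logits, winner_ids, loser_logits, loser_ids)
    print(f"contrastive CTR loss: {loss.item():.4f}")

In practice such a pairwise term would be combined with the standard maximum-likelihood generation loss on the winning text; the margin value and the length normalization here are illustrative design choices, not details taken from the paper.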