Unified pre-training for program understanding and generation

Cited by: 0
Authors
Ahmad, Wasi Uddin [1 ]
Chakraborty, Saikat [2 ]
Ray, Baishakhi [2 ]
Chang, Kai-Wei [1 ]
Affiliations
[1] University of California, Los Angeles, United States
[2] Columbia University, United States
Source
arXiv | 2021
Keywords
Broad spectrum; Code translation; Language generation; Legacy code; Natural languages; Pre-training; Program generation; Program understanding; Sequence models; Summarization and generations
DOI
None
Related papers (50 total)
  • [1] Unified Pre-training for Program Understanding and Generation
    Ahmad, Wasi Uddin
    Chakraborty, Saikat
    Ray, Baishakhi
    Chang, Kai-Wei
    2021 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES (NAACL-HLT 2021), 2021: 2655-2668
  • [2] Unified Language Model Pre-training for Natural Language Understanding and Generation
    Dong, Li
    Yang, Nan
    Wang, Wenhui
    Wei, Furu
    Liu, Xiaodong
    Wang, Yu
    Gao, Jianfeng
    Zhou, Ming
    Hon, Hsiao-Wuen
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019), 2019, 32
  • [3] Unified Dialog Model Pre-training for Task-Oriented Dialog Understanding and Generation
    He, Wanwei
    Dai, Yinpei
    Yang, Min
    Sun, Jian
    Huang, Fei
    Si, Luo
    Li, Yongbin
    PROCEEDINGS OF THE 45TH INTERNATIONAL ACM SIGIR CONFERENCE ON RESEARCH AND DEVELOPMENT IN INFORMATION RETRIEVAL (SIGIR '22), 2022: 187-200
  • [4] BLIP: Bootstrapping Language-Image Pre-training for Unified Vision-Language Understanding and Generation
    Li, Junnan
    Li, Dongxu
    Xiong, Caiming
    Hoi, Steven
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 162, 2022
  • [5] MLUG: Bootstrapping Language-Motion Pre-Training for Unified Motion-Language Understanding and Generation
    Luo, Hongliang
    Xi, Wei
    Tang, Daniel
    SENSORS, 2024, 24 (22)
  • [6] Multimodal Pre-training Method for Vision-language Understanding and Generation
    Liu T.-Y.
    Wu Z.-X.
    Chen J.-J.
    Jiang Y.-G.
    Ruan Jian Xue Bao/Journal of Software, 2023, 34 (05): 2024-2034
  • [7] Understanding tables with intermediate pre-training
    Eisenschlos, Julian Martin
    Krichene, Syrine
    Mueller, Thomas
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EMNLP 2020, 2020
  • [8] PreQR: Pre-training Representation for SQL Understanding
    Tang, Xiu
    Wu, Sai
    Song, Mingli
    Ying, Shanshan
    Li, Feifei
    Chen, Gang
    PROCEEDINGS OF THE 2022 INTERNATIONAL CONFERENCE ON MANAGEMENT OF DATA (SIGMOD '22), 2022: 204-216
  • [9] PRE-TRAINING PROGRAM FOR GRADUATE TEACHING ASSISTANTS
    HOUK, CC
    ABSTRACTS OF PAPERS OF THE AMERICAN CHEMICAL SOCIETY, 1969, (APR): CH06+
  • [10] Graph Pre-training for AMR Parsing and Generation
    Bai, Xuefeng
    Chen, Yulong
    Zhang, Yue
    PROCEEDINGS OF THE 60TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2022), VOL 1: (LONG PAPERS), 2022: 6001-6015