AUGER: Automatically Generating Review Comments with Pre-training Models

Cited by: 0
Authors
Li, Lingwei [1 ]
Yang, Li [2 ]
Jiang, Huaxi [1 ]
Yan, Jun [3 ]
Luo, Tiejian [4 ]
Hua, Zihan [5 ]
Liang, Geng [2 ]
Zuo, Chun [6 ]
Affiliations
[1] Institute of Software, CAS, Univ. of Chinese Academy of Sciences, Beijing, China
[2] Institute of Software, CAS, Beijing, China
[3] State Key Laboratory of Computer Science, Institute of Software, CAS, Univ. of Chinese Academy of Sciences, Beijing, China
[4] Univ. of Chinese Academy of Sciences, Beijing, China
[5] Wuhan University, Univ. of Chinese Academy of Sciences, Wuhan, China
[6] Sinosoft Company Limited, Beijing, China
Source
arXiv | 2022
Keywords
Machine learning
Related Papers
50 items in total
  • [21] On the importance of pre-training data volume for compact language models
    Micheli, Vincent
    D'Hoffschmidt, Martin
    Fleuret, Francois
    [J]. PROCEEDINGS OF THE 2020 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP), 2020, : 7853 - 7858
  • [22] Pre-training the deep generative models with adaptive hyperparameter optimization
    Yao, Chengwei
    Cai, Deng
    Bu, Jiajun
    Chen, Gencai
    [J]. NEUROCOMPUTING, 2017, 247 : 144 - 155
  • [23] Knowledge Transfer via Pre-training for Recommendation: A Review and Prospect
    Zeng, Zheni
    Xiao, Chaojun
    Yao, Yuan
    Xie, Ruobing
    Liu, Zhiyuan
    Lin, Fen
    Lin, Leyu
    Sun, Maosong
    [J]. FRONTIERS IN BIG DATA, 2021, 4
  • [24] Rethinking ImageNet Pre-training
    He, Kaiming
    Girshick, Ross
    Dollar, Piotr
    [J]. 2019 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2019), 2019, : 4917 - 4926
  • [25] Pre-Training to Learn in Context
    Gu, Yuxian
    Dong, Li
    Wei, Furu
    Huang, Minlie
    [J]. PROCEEDINGS OF THE 61ST ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL 2023, VOL 1, 2023, : 4849 - 4870
  • [26] Improving Fractal Pre-training
    Anderson, Connor
    Farrell, Ryan
    [J]. 2022 IEEE WINTER CONFERENCE ON APPLICATIONS OF COMPUTER VISION (WACV 2022), 2022, : 2412 - 2421
  • [27] Pre-training via Paraphrasing
    Lewis, Mike
    Ghazvininejad, Marjan
    Ghosh, Gargi
    Aghajanyan, Armen
    Wang, Sida
    Zettlemoyer, Luke
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 33, NEURIPS 2020, 2020, 33
  • [28] THE PRE-TRAINING SELECTION OF TEACHERS
    Barr, A. S.
    Douglas, Lois
    [J]. JOURNAL OF EDUCATIONAL RESEARCH, 1934, 28 (02): : 92 - 117
  • [29] A comparison of supervised and unsupervised pre-training of end-to-end models
    Misra, Ananya
    Hwang, Dongseong
    Huo, Zhouyuan
    Garg, Shefali
    Siddhartha, Nikhil
    Narayanan, Arun
    Sim, Khe Chai
    [J]. INTERSPEECH 2021, 2021, : 731 - 735
  • [30] Photo Pre-Training, But for Sketch
    Ke, L.
    Pang, Kaiyue
    Song, Yi-Zhe
    [J]. 2023 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION, CVPR, 2023, : 2754 - 2764