AUGER: Automatically Generating Review Comments with Pre-training Models

Cited by: 0
Authors
Li, Lingwei [1 ]
Yang, Li [2 ]
Jiang, Huaxi [1 ]
Yan, Jun [3 ]
Luo, Tiejian [4 ]
Hua, Zihan [5 ]
Liang, Geng [2 ]
Zuo, Chun [6 ]
Affiliations
[1] Institute of Software, CAS, Univ. of Chinese Academy of Sciences, Beijing, China
[2] Institute of Software, CAS, Beijing, China
[3] State Key Laboratory of Computer Science, Institute of Software, CAS, Univ. of Chinese Academy of Sciences, Beijing, China
[4] Univ. of Chinese Academy of Sciences, Beijing, China
[5] Wuhan University, Univ. of Chinese Academy of Sciences, Wuhan, China
[6] Sinosoft Company Limited, Beijing, China
Source
arXiv | 2022
DOI: Not available
Abstract
Machine learning
Related Papers (50 in total; first 10 listed)
  • [1] Li, Lingwei; Yang, Li; Jiang, Huaxi; Yan, Jun; Luo, Tiejian; Hua, Zihan; Liang, Geng; Zuo, Chun. AUGER: Automatically Generating Review Comments with Pre-training Models. In: Proceedings of the 30th ACM Joint European Software Engineering Conference and Symposium on the Foundations of Software Engineering (ESEC/FSE 2022), 2022: 1009-1021.
  • [2] Liu, Tongtong; Feng, Fangxiang; Wang, Xiaojie. Multi-stage Pre-training over Simplified Multimodal Pre-training Models. In: Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing, Vol. 1 (ACL-IJCNLP 2021), 2021: 2556-2565.
  • [3] Huangfu, Yourui; Wang, Jian; Xu, Chen; Li, Rong; Ge, Yiqun; Wang, Xianbin; Zhang, Huazi; Wang, Jun. Realistic Channel Models Pre-training. In: 2019 IEEE Globecom Workshops (GC Wkshps), 2019.
  • [4] Varkel, Yuval; Globerson, Amir. Pre-training Mention Representations in Coreference Models. In: Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), 2020: 8534-8540.
  • [5] Wang, Yu; Sun, Yining; Ma, Zuchang; Gao, Lisheng; Xu, Yang; Wu, Yichen. A Method of Relation Extraction Using Pre-training Models. In: 2020 13th International Symposium on Computational Intelligence and Design (ISCID 2020), 2020: 176-179.
  • [6] Berend, Gabor. Improving the Sample Efficiency of Pre-training Language Models. ERCIM News, 2024, (136): 38-40.
  • [7] Nguyen, Ti-Hon; Do, Thanh-Nghi. Pre-Training Clustering Models to Summarize Vietnamese Texts. Vietnam Journal of Computer Science, 2024.
  • [8] Kocijan, Vid; Jang, Myeongjun; Lukasiewicz, Thomas. Pre-training and diagnosing knowledge base completion models. Artificial Intelligence, 2024, 329.
  • [9] Rousseau, Jeremy; Alaka, Christian; Covili, Emma; Mayard, Hippolyte; Misrachi, Laura; Au, Willy. Pre-training with Diffusion Models for Dental Radiography Segmentation. In: Deep Generative Models (DGM4MICCAI 2023), 2024, 14533: 174-182.
  • [10] Liu, Xinran; Xu, Yabin; Li, Jixian. Method for Automatically Generating Online Comments. Data Analysis and Knowledge Discovery, 2023, 7(4): 101-113.