Towards JavaScript program repair with Generative Pre-trained Transformer (GPT-2)

Cited by: 0

Authors:
Lajko, Mark [1]
Csuvik, Viktor [1]
Vidacs, Laszlo [1]

Affiliation:
[1] University of Szeged, MTA-SZTE Research Group on Artificial Intelligence, Department of Software Engineering, Szeged, Hungary

Venue:
3rd IEEE/ACM International Workshop on Automated Program Repair, APR 2022

Keywords:
Deep learning

Pages: 61-68

Related papers (entries [41]-[50] of 50)
  • [41] Li, Diya; Zhang, Zhe. MetaQA: Enhancing human-centered data search using Generative Pre-trained Transformer (GPT) language model and artificial intelligence. PLOS ONE, 2023, 18 (11).
  • [42] Yang, Steve D.; Ali, Zulfikhar A.; Wong, Bryan M. FLUID-GPT (Fast Learning to Understand and Investigate Dynamics with a Generative Pre-Trained Transformer): Efficient Predictions of Particle Trajectories and Erosion. INDUSTRIAL & ENGINEERING CHEMISTRY RESEARCH, 2023, 62 (37): 15278-15289.
  • [43] Banerjee, Imon; Tariq, Amara; Chao, Chieh-Ju. Editorial Commentary: Generative Pre-trained Transformer 4 (GPT4) makes cardiovascular magnetic resonance reports easy to understand. JOURNAL OF CARDIOVASCULAR MAGNETIC RESONANCE, 2024, 26 (01).
  • [44] Bains, Jaskaran; Williams, Christopher Y. K.; Johnson, Drake; Schwartz, Hope; Sabbineni, Naina; Butte, Atul J.; Kornblith, Aaron E. Enhancing emergency department charting: Using Generative Pre-trained Transformer-4 (GPT-4) to identify laceration repairs. ACADEMIC EMERGENCY MEDICINE, 2024.
  • [45] Medina, Alberto Menendez; Alvaro, Jose Antonio Heredia. Using Generative Pre-Trained Transformers (GPT) for Electricity Price Trend Forecasting in the Spanish Market. ENERGIES, 2024, 17 (10).
  • [46] Savelka, Jaromir; Agarwal, Arav; Bogart, Christopher; Song, Yifan; Sakr, Majd. Can Generative Pre-trained Transformers (GPT) Pass Assessments in Higher Education Programming Courses? PROCEEDINGS OF THE 2023 CONFERENCE ON INNOVATION AND TECHNOLOGY IN COMPUTER SCIENCE EDUCATION, ITICSE 2023, VOL 1, 2023: 117-123.
  • [47] Chen, Hanting; Wang, Yunhe; Guo, Tianyu; Xu, Chang; Deng, Yiping; Liu, Zhenhua; Ma, Siwei; Xu, Chunjing; Xu, Chao; Gao, Wen. Pre-Trained Image Processing Transformer. 2021 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION, CVPR 2021, 2021: 12294-12305.
  • [48] Maniaci, Antonino; Saibene, Alberto Maria; Calvo-Henriquez, Christian; Vaira, Luigi; Radulesco, Thomas; Michel, Justin; Chiesa-Estomba, Carlos; Sowerby, Leigh; Lobo Duro, David; Mayo-Yanez, Miguel; Maza-Solano, Juan; Lechien, Jerome Rene; La Mantia, Ignazio; Cocuzza, Salvatore. Is generative pre-trained transformer artificial intelligence (Chat-GPT) a reliable tool for guidelines synthesis? A preliminary evaluation for biologic CRSwNP therapy. EUROPEAN ARCHIVES OF OTO-RHINO-LARYNGOLOGY, 2024, 281 (04): 2167-2173.
  • [49] Zhang, Junping; Pu, Jian; Xue, Jianru; Yang, Ming; Xu, Xin; Wang, Xiao; Wang, Fei-Yue. HiVeGPT: Human-Machine-Augmented Intelligent Vehicles With Generative Pre-Trained Transformer. IEEE TRANSACTIONS ON INTELLIGENT VEHICLES, 2023, 8 (03): 2027-2033.
  • [50] Watters, Casey; Lemanski, Michal K. Universal skepticism of ChatGPT: a review of early literature on chat generative pre-trained transformer. FRONTIERS IN BIG DATA, 2023, 6.