17 results in total
- [1] Towards JavaScript program repair with Generative Pre-trained Transformer (GPT-2). Proceedings of the International Workshop on Automated Program Repair (APR 2022), 2022: 61-68.
- [3] JiuZhang: A Chinese Pre-trained Language Model for Mathematical Problem Understanding. Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining (KDD 2022), 2022: 4571-4581.
- [4] Performance of the pre-trained large language model GPT-4 on automated short answer grading. Discover Artificial Intelligence, 2024, 4(1).
- [5] GPT-2C: A Parser for Honeypot Logs Using Large Pre-trained Language Models. Proceedings of the 2021 IEEE/ACM International Conference on Advances in Social Networks Analysis and Mining (ASONAM 2021), 2021: 649-653.
- [6] How large language models including generative pre-trained transformer (GPT) 3 and 4 will impact medicine and surgery. Techniques in Coloproctology, 2023, 27: 609-614.
- [8] How Robust Is a Large Pre-trained Language Model for Code Generation? A Case on Attacking GPT2. 2023 IEEE International Conference on Software Analysis, Evolution and Reengineering (SANER), 2023: 708-712.
- [9] JiuZhang 2.0: A Unified Chinese Pre-trained Language Model for Multi-task Mathematical Problem Solving. Proceedings of the 29th ACM SIGKDD Conference on Knowledge Discovery and Data Mining (KDD 2023), 2023: 5660-5672.