How does ChatGPT perform on the European Board of Pediatric Surgery examination? A randomized comparative study

Cited by: 2
Authors
Azizoglu, Mustafa [1 ]
Aydogdu, Bahattin [2 ]
Affiliations
[1] Dicle Univ, Med Sch, Dept Pediat Surg, Diyarbakir, Turkiye
[2] Balikesir Univ, Dept Pediat Surg, Balikesir, Turkiye
Source
MEDICINA BALEAR | 2024, Vol. 39, No. 01
Keywords
ChatGPT; Pediatric Surgery; exam; questions; artificial intelligence;
DOI
10.3306/AJHS.2024.39.01.23
Chinese Library Classification
R5 [Internal Medicine];
Discipline Classification Code
1002 ; 100201 ;
Abstract
Purpose: The purpose of this study was to conduct a detailed comparison of the accuracy and responsiveness of GPT-3.5 and GPT-4 in the realm of pediatric surgery. Specifically, we sought to assess their ability to correctly answer a series of sample questions from the European Board of Pediatric Surgery (EBPS) examination. Methods: This study was conducted between 20 May 2023 and 30 May 2023 and comparatively analyzed two AI language models, GPT-3.5 and GPT-4, in the field of pediatric surgery, specifically on EBPS exam sample questions. Two sets of 105 sample questions (210 in total), derived from the EBPS sample questions, were collated. Results: In General Pediatric Surgery, GPT-3.5 provided correct answers for 7 questions (46.7%), whereas GPT-4 achieved higher accuracy with 13 correct responses (86.7%) (p=0.020). In Newborn Surgery and Pediatric Urology, GPT-3.5 correctly answered 6 questions (40.0%), whereas GPT-4 correctly answered 12 questions (80.0%) (p=0.025). In total, GPT-3.5 correctly answered 46 of 105 questions (43.8%), while GPT-4 performed significantly better, correctly answering 80 questions (76.2%) (p<0.001). Across all responses, the odds ratio for GPT-4 versus GPT-3.5 was 4.1, indicating that GPT-4 was 4.1 times more likely than GPT-3.5 to provide a correct answer to a pediatric surgery question. Conclusion: This comparative study concludes that GPT-4 significantly outperforms GPT-3.5 in responding to EBPS exam questions.
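The reported odds ratio of 4.1 can be reproduced from the 2x2 table of correct/incorrect counts given in the abstract (GPT-4: 80 of 105 correct; GPT-3.5: 46 of 105). A minimal stdlib-only sketch; the chi-square statistic below uses the standard Pearson formula, which is an assumption, since the abstract does not name the significance test used:

```python
# Reproduce the odds ratio reported in the abstract from the 2x2
# contingency table of correct/incorrect answers (105 questions per model).
gpt4_correct, gpt4_wrong = 80, 105 - 80     # GPT-4: 76.2% correct
gpt35_correct, gpt35_wrong = 46, 105 - 46   # GPT-3.5: 43.8% correct

# Odds ratio: (odds of a correct answer with GPT-4) / (odds with GPT-3.5)
odds_ratio = (gpt4_correct / gpt4_wrong) / (gpt35_correct / gpt35_wrong)
print(round(odds_ratio, 1))  # 4.1, matching the reported value

# Pearson chi-square statistic for the same table (expected counts are
# equal for the two models because each answered the same 105 questions).
n = 210
col_correct = gpt4_correct + gpt35_correct
col_wrong = gpt4_wrong + gpt35_wrong
exp_correct = 105 * col_correct / n
exp_wrong = 105 * col_wrong / n
chi2 = sum((obs - exp) ** 2 / exp
           for obs, exp in [(gpt4_correct, exp_correct),
                            (gpt4_wrong, exp_wrong),
                            (gpt35_correct, exp_correct),
                            (gpt35_wrong, exp_wrong)])
# chi2 is about 22.9, well above 10.83 (the critical value at p = 0.001
# with 1 degree of freedom), consistent with the reported p < 0.001.
```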
Pages: 23-26 (4 pages)