Validity Evidence for a Knowledge Assessment Tool for a Mastery Learning Scrub Training Curriculum

Cited by: 2
Authors
Hasty, Brittany N. [1 ,2 ]
Lau, James N. [3 ]
Tekian, Ara [4 ]
Miller, Sarah E. [5 ,6 ]
Shipper, Edward S. [7 ,8 ]
Merrell, Sylvia Bereknyei [9 ,10 ]
Lee, Edmund W. [1 ,11 ]
Park, Yoon Soo [12 ]
Affiliations
[1] Stanford Univ, Sch Med, Dept Surg, 300 Pasteur Dr,H3591B, Stanford, CA 94305 USA
[2] Loyola Univ, Med Ctr, Gen Surg, 2160 S 1st Ave, Maywood, IL 60153 USA
[3] Stanford Univ, Sch Med, Dept Surg, Surg, Stanford, CA 94305 USA
[4] Univ Illinois, Dept Med Educ, Int Affairs, Chicago, IL USA
[5] Stanford Univ, Sch Med, Stanford, CA 94305 USA
[6] Stanford Univ, Sch Med, Obstet & Gynecol, Stanford, CA 94305 USA
[7] Univ Texas Hlth Sci Ctr San Antonio, San Antonio, TX 78229 USA
[8] Natl Trauma Inst, San Antonio, TX USA
[9] Stanford Univ, Sch Med, Goodman Surg Educ Ctr, Res, Stanford, CA 94305 USA
[10] Stanford Univ, Sch Med, Stanford Surg Policy Improvement Res & Educ Ctr S, Stanford, CA 94305 USA
[11] Inova Fairfax Med Campus, Gen Surg, Falls Church, VA USA
[12] Univ Illinois, Dept Med Educ, Chicago, IL USA
Keywords
MEDICAL-STUDENTS
DOI
10.1097/ACM.0000000000003007
Chinese Library Classification
G40 [Education]
Discipline classification codes
040101; 120403
Abstract
Purpose: To examine the validity evidence for a scrub training knowledge assessment tool in order to demonstrate the utility and robustness of a multimodal, entrustable professional activity (EPA)-aligned, mastery learning scrub training curriculum.

Method: Validity evidence was collected for the knowledge assessment used in the scrub training curriculum at Stanford University School of Medicine from April 2017 to June 2018. The knowledge assessment comprised 25 selected-response items mapped to curricular objectives, EPAs, and operating room policies. A mastery passing standard was established using the Mastery Angoff and Patient-Safety approaches. Learners were assessed before the curriculum, after the curriculum, and 6 months after the curriculum.

Results: From April 2017 to June 2018, 220 medical and physician assistant students participated in the scrub training curriculum. The mean pre- and postcurriculum knowledge scores were 74.4% (standard deviation [SD] = 15.6) and 90.1% (SD = 8.3), respectively (Cohen's d = 1.10, P < .001). The internal reliability of the assessment was 0.71. Students with previous scrub training performed significantly better on the precurriculum knowledge assessment than those without previous training (81.9% [SD = 12.6] vs 67.0% [SD = 14.9]; P < .001). The mean item difficulty was 0.74, and the mean item discrimination index was 0.35. The Mastery Angoff overall cut score was 92.0%.

Conclusions: This study describes the administration of, and provides validity evidence for, a knowledge assessment tool for a multimodal, EPA-aligned, mastery-based scrub training curriculum. The authors support the use of scores derived from this test for assessing scrub training knowledge among medical and physician assistant students.
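The abstract reports classical test-theory indices (item difficulty, item discrimination, internal reliability) and a pre/post effect size. The following minimal Python sketch shows how such indices are commonly computed on a dichotomous response matrix; the data are simulated, the KR-20 formula is shown as one common choice for internal reliability, and the pooled-SD convention for Cohen's d is an assumption rather than necessarily the variant used in the study.

import numpy as np

# Minimal sketch of classical item analysis for a 25-item selected-response
# test, on a HYPOTHETICAL 0/1 response matrix (rows = examinees, columns =
# items). All numbers here are simulated, not the study's data.
rng = np.random.default_rng(0)
responses = (rng.random((220, 25)) < 0.74).astype(int)

# Item difficulty: proportion of examinees answering each item correctly.
difficulty = responses.mean(axis=0)

# Item discrimination: correlation of each item with the total score over
# the remaining items (corrected item-total correlation).
totals = responses.sum(axis=1)
discrimination = np.array([
    np.corrcoef(responses[:, j], totals - responses[:, j])[0, 1]
    for j in range(responses.shape[1])
])

# KR-20 internal-consistency reliability for dichotomous items.
k = responses.shape[1]
item_variance_sum = (difficulty * (1 - difficulty)).sum()
kr20 = (k / (k - 1)) * (1 - item_variance_sum / totals.var(ddof=1))

# Cohen's d for a pre/post comparison using a pooled standard deviation
# (one common convention; the study may have used a different variant).
pre = rng.normal(74.4, 15.6, 220)
post = rng.normal(90.1, 8.3, 220)
pooled_sd = np.sqrt((pre.var(ddof=1) + post.var(ddof=1)) / 2)
cohens_d = (post.mean() - pre.mean()) / pooled_sd

print(f"mean difficulty {difficulty.mean():.2f}, "
      f"mean discrimination {discrimination.mean():.2f}, "
      f"KR-20 {kr20:.2f}, Cohen's d {cohens_d:.2f}")

Because the response matrix above is random, the printed discrimination and reliability values will be near zero; the sketch illustrates only the mechanics of the calculations, not the study's results.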
Pages: 129-135
Number of pages: 7