Assessment of Anatomical Knowledge by Practical Examinations: The Effect of Question Design on Student Performance

Cited by: 14
Authors
Sagoo, Mandeep Gill [1 ,2 ]
Smith, Claire France [3 ]
Gosden, Edward [4 ]
Affiliations
[1] Kings Coll London, Dept Anat & Human Sci, Guys Campus, London SE1 9RT, England
[2] UCL, Inst Educ, London, England
[3] Univ Sussex, Dept Anat, Brighton & Sussex Med Sch, Brighton, E Sussex, England
[4] Queensland Univ Technol, Res Methods Grp, Inst Hlth & Biomed Innovat, Brisbane, Qld, Australia
Keywords
gross anatomy education; undergraduate education; medical education; anatomical sciences; anatomy assessments; anatomical visual resources; medical students' scores; medical students' views; MEDICAL-SCHOOL; GROSS-ANATOMY; COMPETENCE; EDUCATION; PAPER;
DOI
10.1002/ase.1597
Chinese Library Classification
G40 [Education];
Discipline Classification Codes
040101; 120403;
Abstract
The objective structured practical examination (OSPE) is a timed examination that assesses topographical and/or applied knowledge of anatomy with the use of cadaveric resources and medical images. This study investigated whether elements of question design (provision of clinical context, type of visual resources used, gender context, and difficulty) of an anatomy question affected students' performance, and also whether there was any effect of basic demography or participation in various voluntary activities. Study participants were second-year medical students (n=150), 83 of whom consented to fill in a questionnaire collecting demographics, revision preferences, and assessment preferences. The examination scores were matched with students' responses collected on the questionnaire, and all data were analyzed by multiple linear regression. Difficulty of the question was the only design element found to be significantly associated with the number of students who answered correctly (P=0.001); clinical context, visual resources used, and gender context of the question were not significant. When individual students' marks were analyzed along with the questionnaire data, only the students' interest in participating in the department's demonstrator program was a significant predictor of a high individual score; the gender of the students showed a strong trend toward significance, with female students scoring higher on average than male students. The two-part OSPE questions were dissociated and analyzed using binary logistic regression to determine whether a correct answer to Part 1 (identification of a tagged or pinned anatomical structure on a specimen or medical image) was predictive of a correct answer to Part 2 (assessment of the relevant functional, applied, or clinical knowledge), but no association was found. Anat Sci Educ 9: 446-452. (C) 2016 American Association of Anatomists.
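The abstract describes two analysis steps: a question-level multiple linear regression of design elements on the proportion of correct answers, and a binary logistic regression testing whether a correct Part 1 answer predicts a correct Part 2 answer. The sketch below is a minimal illustration of that kind of pipeline using statsmodels; it is not the authors' code, and the column names and the synthetic data are purely illustrative assumptions.

```python
# Illustrative sketch only (assumed variable names, synthetic data),
# showing the two regression steps described in the abstract.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_questions = 40  # hypothetical number of OSPE questions

# Question-level data: design elements as predictors of percent correct.
questions = pd.DataFrame({
    "pct_correct": rng.uniform(30, 95, n_questions),
    "difficulty": rng.choice(["easy", "moderate", "hard"], n_questions),
    "clinical_context": rng.integers(0, 2, n_questions),
    "visual_resource": rng.choice(["cadaveric", "image"], n_questions),
    "gender_context": rng.choice(["male", "female", "neutral"], n_questions),
})
ols_fit = smf.ols(
    "pct_correct ~ C(difficulty) + clinical_context"
    " + C(visual_resource) + C(gender_context)",
    data=questions,
).fit()
print(ols_fit.summary())

# Item-level data: does a correct Part 1 (identification) predict a
# correct Part 2 (applied knowledge)? Binary logistic regression.
items = pd.DataFrame({
    "part1_correct": rng.integers(0, 2, 500),
    "part2_correct": rng.integers(0, 2, 500),
})
logit_fit = smf.logit("part2_correct ~ part1_correct", data=items).fit()
print(logit_fit.summary())
```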
Pages: 446 - 452
Page count: 7