Video Recording in Veterinary Medicine OSCEs: Feasibility and Inter-rater Agreement between Live Performance Examiners and Video Recording Reviewing Examiners
Cited: 14
Authors:
Tan, Jean-Yin [1]
Ma, Irene W. Y. [2]
Hunt, Julie A. [3,4]
Kwong, Grace P. S. [5]
Farrell, Robin [6]
Bell, Catriona [7,8]
Read, Emma K. [9]
Affiliations:
[1] Univ Calgary, Dept Vet Clin & Diagnost Sci, Fac Vet Med, Clin Skills Bldg,112A,11877-85th St NW, Calgary, AB T3R 1J3, Canada
[2] Univ Calgary, Div Gen Internal Med, Cumming Sch Med, 3330 Hosp Dr, Calgary, AB T2N 4N1, Canada
[3] Lincoln Mem Univ, Small Anim Clin Skills, Coll Vet Med, 6965 Cumberland Gap Pkwy, Harrogate, TN 37752 USA
[4] Lincoln Mem Univ, Clin Skills, Coll Vet Med, 6965 Cumberland Gap Pkwy, Harrogate, TN 37752 USA
[5] Univ Calgary, Fac Vet Med, TRW IE07,3280 Hosp Dr NW, Calgary, AB T2N 4Z6, Canada
[6] Univ Coll Dublin, Univ Coll, Sch Vet Med, Vet Nursing Sect,Vet Sci Ctr 040, Dublin 4, Ireland
[7] Adv HE, Teaching Excellence Awards, Holyrood Pk House,106 Holyrood Rd, Edinburgh EH8 8AS, Midlothian, Scotland
[8] Adv HE, Holyrood Pk House,106 Holyrood Rd, Edinburgh EH8 8AS, Midlothian, Scotland
[9] Ohio State Univ, Profess Programs, Coll Vet Med, Off Profess Programs, 127E Vet Med Acad Bldg,1900 Coffey Rd, Columbus, OH 43210 USA
Keywords: OSCE; video review; rater; assessment; clinical skills; recording; digital; technology; medical education; inter-rater agreement; CLINICAL EXAMINATION; FAMILIARITY; SKILLS; RELIABILITY
DOI: 10.3138/jvme-2019-0142
Chinese Library Classification: G40 [Education]
Discipline codes: 040101; 120403
Abstract:
The Objective Structured Clinical Examination (OSCE) is a valid, reliable assessment of veterinary students' clinical skills, but it requires significant examiner training and scoring time. This article investigates the utility of video recording by scoring OSCEs in real time with live examiners and afterwards with video examiners from within and outside the learners' home institution. Using checklists, learners (n = 33) were assessed by one live examiner and five video examiners on three OSCE stations: suturing, arthrocentesis, and thoracocentesis. When stations were considered collectively, there was no difference in pass/fail outcome between live and video examiners (χ² = 0.37, p = .55). However, when stations were considered individually, station (χ² = 16.64, p < .001) and the interaction between station and examiner type (χ² = 7.13, p = .03) had significant effects on pass/fail outcome. Specifically, learners assessed on suturing by a video examiner had increased odds of passing that station compared with the arthrocentesis or thoracocentesis stations. Internal consistency was fair to moderate (0.34-0.45). Inter-rater reliability measures varied but were mostly moderate to strong (0.56-0.82). Video examiners spent longer assessing learners than live examiners (a mean of 21 min/learner vs. 13 min/learner). Station-specific differences among video examiners may be due to intermittent visibility issues during video capture. Overall, video recording of learner performances appears reliable and feasible, although time, cost, and technical issues may limit its routine use.
Pages: 485-491
Number of pages: 7