A quantitative tool for measuring the quality of medical training in emergency medicine

Cited by: 2
Authors
Smith, Simon M. [1 ]
Davis, Peyton [2 ]
Davies, Llion [3 ]
Affiliations
[1] Oxford Deanery, Hlth Educ Thames Valley, Oxford, England
[2] Oxford Univ Hosp NHS Trust, Emergency Dept, Oxford, England
[3] Royal Glamorgan Hosp, Dept Gen Surg, Llantrisant, Wales
Keywords
educational measurement; graduate medical education; programme evaluation; quality in medical education; EDUCATION; FEEDBACK;
DOI
10.1097/MEJ.0000000000000257
Chinese Library Classification (CLC)
R4 [Clinical medicine];
Discipline codes
1002; 100602;
Abstract
The most common method of assessing the quality of medical education is through a selection of qualitative assessments, usually as part of a programme evaluation. These commonly include measures of student and teacher participation, outcome measures such as assessment results, and qualitative methods such as interviews and questionnaires of students and teachers. Programme evaluation can therefore be a process that is both laborious and subject to accusations of a lack of objectivity. As a result, the development of a quantitative tool that could be used alongside a programme evaluation may be both useful and desirable. A pragmatic scoring system using routinely collected quantitative data, termed the Quality Assessment Tool, was developed during the 2013 academic year within the setting of an Emergency Medicine training programme in the UK. This tool was tested against the standard assessment currently used for the programme to establish whether the quantitative tool correlated with the programme evaluation. Second, the individual items within the tool were investigated to identify any correlations with the current assessment of quality established by the programme evaluation. The Quality Assessment Tool appears to correlate with the quality of training delivered at individual training sites in a single specialty. It clearly identifies the centres delivering the highest quality of training and also those whose training is consistently of a lower standard. The tool is less accurate at ranking training centres whose training is merely satisfactory; whether this reflects imprecision in the tool itself or the subjective nature of the current assessment (i.e. whether the current evaluation system lacks validity) cannot be stated. In summary, it appears to be possible to use a single quantitative tool to reliably, and with validity, measure the quality of training delivered at a postgraduate medical training centre. Although it is not envisaged that this would, or should, replace ongoing quality assurance systems such as programme evaluations, a validated tool can be derived for a given setting to usefully inform and augment current quality management systems in postgraduate medical education. Copyright (C) 2015 Wolters Kluwer Health, Inc. All rights reserved.
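As an illustration of the correlation step described in the abstract, the minimal sketch below compares Quality Assessment Tool scores with programme-evaluation rankings across training sites using Spearman's rank correlation. The site names, score values, and the choice of statistic are assumptions made for illustration only; they are not taken from the paper.

```python
# Illustrative sketch only: site labels, scores, rankings, and the use of
# Spearman's rank correlation are assumptions, not details from the paper.
from scipy.stats import spearmanr

# Hypothetical Quality Assessment Tool scores for six training sites
qat_scores = {"Site A": 82, "Site B": 74, "Site C": 69,
              "Site D": 61, "Site E": 55, "Site F": 48}

# Hypothetical programme-evaluation rankings for the same sites (1 = best)
eval_rank = {"Site A": 1, "Site B": 3, "Site C": 2,
             "Site D": 4, "Site E": 6, "Site F": 5}

sites = sorted(qat_scores)
scores = [qat_scores[s] for s in sites]
ranks = [eval_rank[s] for s in sites]

# A strongly negative rho means high tool scores coincide with better
# (numerically lower) programme-evaluation ranks, i.e. the two agree.
rho, p_value = spearmanr(scores, ranks)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```

The same comparison could be repeated item by item within the tool to see which individual components track the programme evaluation most closely, which is the second analysis the abstract describes.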
Pages: 436-439
Number of pages: 4