Development and validity evidence for the intraprofessional conflict exercise: An assessment tool to support collaboration

Cited by: 0
Authors
Bajwa, Nadia M. [1 ,2 ]
Sader, Julia [2 ]
Kim, Sara [3 ]
Park, Yoon Soo [4 ]
Nendaz, Mathieu R. [2 ]
Bochatay, Naike [5 ]
Affiliations
[1] Geneva Univ Hosp, Childrens Hosp, Dept Gen Pediat, Geneva, Switzerland
[2] Univ Geneva, Unit Dev & Res Med Educ UDREM, Fac Med, Geneva, Switzerland
[3] Univ Washington, Sch Med, Dept Surg, Seattle, WA USA
[4] Massachusetts Gen Hosp, MGH Inst Hlth Profess, Boston, MA USA
[5] Univ Calif San Francisco, Dept Pediat, San Francisco, CA USA
Source
PLOS ONE | 2023, Vol. 18, Issue 2
Keywords
PATIENT SAFETY; PHYSICIANS; POWER; WORK
DOI
10.1371/journal.pone.0280564
Chinese Library Classification (CLC)
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences]
Discipline Codes
07; 0710; 09
Abstract
Background: Effective collaboration is the foundation for delivering safe, high-quality patient care. Health sciences curricula often include interprofessional collaboration training but may neglect conflicts that occur within a profession (intraprofessional). We describe the development of and validity evidence for an assessment of intraprofessional conflict management.

Methods and findings: We designed a 22-item assessment, the Intraprofessional Conflict Exercise, to evaluate skills in managing intraprofessional conflicts, based on a literature review of conflict management. Using Messick's validity framework, we collected evidence for content, response process, and internal structure during a simulated intraprofessional conflict from 2018 to 2019. To gather validity evidence, we computed descriptive statistics, inter-rater reliability, Cronbach's alpha, generalizability-theory analyses, and factor analysis. Two trained faculty examiners rated 82 trainees, yielding 164 observations. Inter-rater reliability was fair (weighted kappa = 0.33, SE = 0.03). Cronbach's alpha was 0.87. The generalizability study showed differentiation among trainees (19.7% person variance) and high reliability (G-coefficient = 0.88, Phi-coefficient = 0.88). The decision study predicted that a single rater would still yield high reliability (G-coefficient = 0.80). Exploratory factor analysis identified three factors: communication skills, recognition of limits, and demonstration of respect for others. Based on qualitative observations, we found all items to be applicable, highly relevant, and helpful in identifying how trainees managed intraprofessional conflict.

Conclusions: The Intraprofessional Conflict Exercise provides a useful and reliable way to evaluate intraprofessional conflict management skills. It provides meaningful, actionable feedback to trainees and may help health educators prepare trainees to manage intraprofessional conflict.
Pages: 13
Related Articles
50 records
  • [1] Selim, Omar; Dueck, Andrew; Walsh, Catharine M.; Brydges, Ryan; Okrainec, Allan. Development of the Diabetic Wound Assessment Learning Tool (DiWALT) and validity evidence. Journal of Vascular Surgery, 2021, 73(2): 689-697.
  • [2] Brondfield, Sam; Boscardin, Christy; Strewler, Gordon; Hyland, Katherine; Oakes, Scott A.; Nishimura, Holly; Crawford, Jenny; Hauer, Karen E. A Medical Student Inquiry Behavior Assessment Tool: Development and Validity Evidence. Academic Medicine, 2019, 94(4): 585-593.
  • [3] Siau, Keith; Dunckley, Paul; Feeney, Mark; Johnson, Gavin. ERCP assessment tool: evidence of validity and competency development during training. Endoscopy, 2019, 51(11): 1017-1026.
  • [4] McCaffrey, Ruth; Bishop, Mary; Adonis-Rizzo, Marie; Williamson, Ellen; McPherson, Melanie; Cruikshank, Alice; Carrier, Vicki Jo; Sands, Simone; Pigano, Diane; Girard, Patricia; Lauzon, Cathy. Development and testing of a DVT risk assessment tool: Providing evidence of validity and reliability. Worldviews on Evidence-Based Nursing, 2007, 4(1): 14-20.
  • [5] Ceron, Marizete Ilha; Gubiani, Marileda Barichello; de Oliveira, Camila Rosa; Keske-Soares, Marcia. Evidence of validity and reliability of a phonological assessment tool. CoDAS, 2018, 30(3).
  • [6] Calefato, Fabio; Giove, Andrea; Lanubile, Filippo; Losavio, Marco. A Case Study on Tool Support for Collaboration in Agile Development. 2020 ACM/IEEE 15th International Conference on Global Software Engineering (ICGSE), 2020: 11-21.
  • [7] Siau, Keith; Crossley, James; Dunckley, Paul; Johnson, Gavin; Haycock, Adam; Anderson, John; Iacucci, Marietta; Feeney, Mark. The flexible sigmoidoscopy DOPS assessment tool: evidence of validity and competency development during training. Gut, 2019, 68: A252-A253.
  • [8] Takao, Madoka; Bilgic, Elif; Waschke, Kevin A.; Kaneva, Pepa; Endo, Satoshi; Ohara, Yoshiko; Kawara, Fumiaki; Tanaka, Shinwa; Ishida, Tsukasa; Morita, Yoshinori; Toyonaga, Takashi; Umegaki, Eiji; Fried, Gerald M. Development and validity evidence for a video assessment tool of endoscopic submucosal dissection for gastric neoplasms. Gastrointestinal Endoscopy, 2018, 87(6): AB504.
  • [9] Lackie, Kelly; Banfield, Valerie. Development of a Workplace Assessment Tool for Interprofessional Collaboration (WATIC). Journal of Interprofessional Care, 2013, 27: 155-156.
  • [10] Schaufeli, Wilmar B.; Desart, Steffie; De Witte, Hans. Burnout Assessment Tool (BAT)-Development, Validity, and Reliability. International Journal of Environmental Research and Public Health, 2020, 17(24): 1-21.