Development and validity evidence for the intraprofessional conflict exercise: An assessment tool to support collaboration

Cited: 0
Authors
Bajwa, Nadia M. [1 ,2 ]
Sader, Julia [2 ]
Kim, Sara [3 ]
Park, Yoon Soo [4 ]
Nendaz, Mathieu R. [2 ]
Bochatay, Naike [5 ]
Affiliations
[1] Geneva Univ Hosp Geneva, Childrens Hosp, Dept Gen Pediat, Geneva, Switzerland
[2] Univ Geneva, Unit Dev & Res Med Educ UDREM, Fac Med, Geneva, Switzerland
[3] Univ Washington, Sch Med, Dept Surg, Seattle, WA USA
[4] Massachusetts Gen Hosp, MGH Inst Hlth Profess, Boston, MA USA
[5] Univ Calif San Francisco, Dept Pediat, San Francisco, CA USA
Source
PLOS ONE | 2023, Vol. 18, Issue 02
Keywords
PATIENT SAFETY; PHYSICIANS; POWER; WORK;
DOI
10.1371/journal.pone.0280564
Chinese Library Classification
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences];
Subject Classification Codes
07; 0710; 09;
Abstract
Background: Effective collaboration is the foundation for delivering safe, high-quality patient care. Health sciences curricula often include interprofessional collaboration training but may neglect conflicts that occur within a profession (intraprofessional conflicts). We describe the development of, and validity evidence for, an assessment of intraprofessional conflict management.
Methods and findings: Based on a literature review of conflict management, we designed a 22-item assessment, the Intraprofessional Conflict Exercise, to evaluate skills in managing intraprofessional conflicts. Using Messick's validity framework, we collected evidence for content, response process, and internal structure during a simulated intraprofessional conflict from 2018 to 2019. To gather validity evidence, we computed descriptive statistics, inter-rater reliability, Cronbach's alpha, generalizability-theory analyses, and factor analysis. Two trained faculty examiners rated 82 trainees, yielding 164 observations. Inter-rater reliability was fair (weighted kappa = 0.33, SE = 0.03); Cronbach's alpha was 0.87. The generalizability study showed differentiation among trainees (19.7% person variance) and high reliability (G-coefficient = 0.88, Phi-coefficient = 0.88). The decision study predicted that using a single rater would still yield high reliability (G-coefficient = 0.80). Exploratory factor analysis identified three factors: communication skills, recognition of limits, and demonstration of respect for others. Based on qualitative observations, we found all items to be applicable, highly relevant, and helpful in identifying how trainees managed intraprofessional conflict.
Conclusions: The Intraprofessional Conflict Exercise provides a useful and reliable way to evaluate intraprofessional conflict management skills. It provides meaningful, actionable feedback to trainees and may help health educators prepare trainees to manage intraprofessional conflict.
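The abstract reports two classical reliability statistics: Cronbach's alpha across the 22 items and a weighted kappa between the two raters. As a minimal sketch of how these are computed (the scores below are invented for illustration and are not study data; the weighting scheme in the original analysis is not specified, so linear weights are assumed here):

```python
# Illustrative sketch only: the scores below are invented, not study data.
# Shows how the two reliability statistics named in the abstract --
# Cronbach's alpha and a (linearly) weighted kappa -- are computed.

def cronbach_alpha(items):
    """items: one list of scores per item, aligned across the same respondents."""
    k, n = len(items), len(items[0])

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

def weighted_kappa(r1, r2, categories):
    """Linearly weighted kappa for two raters scoring the same subjects
    on an ordinal scale."""
    cats = sorted(categories)
    idx = {c: i for i, c in enumerate(cats)}
    k, n = len(cats), len(r1)
    obs = [[0.0] * k for _ in range(k)]  # observed proportion matrix
    for a, b in zip(r1, r2):
        obs[idx[a]][idx[b]] += 1 / n
    p1 = [sum(row) for row in obs]                              # rater-1 marginals
    p2 = [sum(obs[i][j] for i in range(k)) for j in range(k)]   # rater-2 marginals
    w = [[abs(i - j) / (k - 1) for j in range(k)] for i in range(k)]  # linear weights
    disagreement = sum(w[i][j] * obs[i][j] for i in range(k) for j in range(k))
    chance = sum(w[i][j] * p1[i] * p2[j] for i in range(k) for j in range(k))
    return 1 - disagreement / chance

# Hypothetical data: 3 items scored 1-3 for 5 trainees, plus two raters'
# ordinal scores for the same 5 trainees.
items = [[1, 2, 3, 2, 3], [1, 2, 3, 3, 3], [2, 2, 3, 2, 3]]
rater1, rater2 = [1, 2, 3, 2, 3], [1, 2, 2, 2, 3]
print(round(cronbach_alpha(items), 2))                      # 0.9
print(round(weighted_kappa(rater1, rater2, [1, 2, 3]), 2))  # 0.74
```

Weighted kappa penalizes disagreements in proportion to their distance on the ordinal scale, which is why it is preferred over unweighted kappa for graded rating instruments like this one.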
Pages: 13
Related Articles
50 records in total
  • [21] Validity of back-translation as a quality assessment tool: empirical evidence
    Tyupa, Sergiy
    Wild, Diane
    QUALITY OF LIFE RESEARCH, 2016, 25 : 170 - 170
  • [22] Burnout Assessment Tool (BAT): Validity Evidence from Brazil and Portugal
    Sinval, Jorge
    Vazquez, Ana Claudia S.
    Hutz, Claudio Simon
    Schaufeli, Wilmar B.
    Silva, Silvia
    INTERNATIONAL JOURNAL OF ENVIRONMENTAL RESEARCH AND PUBLIC HEALTH, 2022, 19 (03)
  • [23] Validity Evidence for the Neuro-Endoscopic Ventriculostomy Assessment Tool (NEVAT)
    Gaab, Michael R.
    OPERATIVE NEUROSURGERY, 2017, 13 (01) : 67 - 68
  • [24] Using the Model of Regulation to Understand Software Development Collaboration Practices and Tool Support
    Arciniegas-Mendez, Maryi
    Zagalsky, Alexey
    Storey, Margaret-Anne
    Hadwin, Allyson F.
    CSCW'17: PROCEEDINGS OF THE 2017 ACM CONFERENCE ON COMPUTER SUPPORTED COOPERATIVE WORK AND SOCIAL COMPUTING, 2017, : 1049 - 1065
  • [25] The High-Value Care Rounding Tool: Development and Validity Evidence
    McDaniel, Corrie E.
    White, Andrew A.
    Bradford, Miranda C.
    Sy, Carolyn D.
    Chen, Tiffany
    Brock, Doug
    Foti, Jeffrey
    Beck, Jimmy B.
    ACADEMIC MEDICINE, 2018, 93 (02) : 199 - 206
  • [26] The McMaster Narrative Comment Rating Tool: Development and Initial Validity Evidence
    Mcguire, Natalie
    Acai, Anita
    Sonnadara, Ranil R.
    TEACHING AND LEARNING IN MEDICINE, 2025, 37 (01) : 86 - 98
  • [27] Development and Validity of a Dimension-Based Assessment Center: Does Exercise Similarity Breed Validity?
    Yazar, Yagizhan
    Sumer, H. Canan
    INTERNATIONAL JOURNAL OF PSYCHOLOGY, 2016, 51 : 808 - 808
  • [28] Development of the Hand Assessment for Infants: evidence of internal scale validity
    Krumlinde-Sundholm, Lena
    Ek, Linda
    Sicola, Elisa
    Sjostrand, Lena
    Guzzetta, Andrea
    Sgandurra, Giuseppina
    Cioni, Giovanni
    Eliasson, Ann-Christin
    DEVELOPMENTAL MEDICINE AND CHILD NEUROLOGY, 2017, 59 (12): 1276 - 1283
  • [29] Assessment of competence in local anaesthetic thoracoscopy: development and validity investigation of a new assessment tool
    Nayahangan, Leizl Joy
    Svendsen, Morten Bo Sondergaard
    Bodtger, Uffe
    Rahman, Najib
    Maskell, Nick
    Sidhu, Jatinder Singh
    Lawaetz, Jonathan
    Clementsen, Paul Frost
    Konge, Lars
    JOURNAL OF THORACIC DISEASE, 2021, 13 (07) : 3998 - 4007
  • [30] Development and Validity Testing of an Arthritis Self-Management Assessment Tool
    Oh, HyunSoo
    Han, SunYoung
    Kim, SooHyun
    Seo, WhaSook
    ORTHOPAEDIC NURSING, 2018, 37 (01) : 24 - 35