An assessment of the inter-rater and intra-rater reliability of the modified Gordon pin infection classification system

Times Cited: 0
Authors
Bafor, Anirejuoritse [1 ]
Skals, Regitze Gyldenholm [2 ]
Shen, Ming [3 ]
Iobst, Christopher A. [1 ]
Rahbek, Ole [4 ]
Kold, Soren [4 ]
Fridberg, Marie [4 ]
Affiliations
[1] Nationwide Childrens Hosp, Ctr Limb Lengthening & Reconstruct, Dept Orthopaed, Columbus, OH 43205 USA
[2] Aalborg Univ Hosp Res Data & Biostat, Aalborg, Denmark
[3] Aalborg Univ, Dept Elect Syst, Aalborg, Denmark
[4] Aalborg Univ Hosp, Dept Orthopaed, Interdisciplinary Orthopaed, Aalborg, Denmark
Source
DIGITAL HEALTH, 2024, Vol. 10
Keywords
Modified Gordon score; inter-rater reliability; pin-site infection; pin-site labeling; external fixation; grading system; site infection; complications
DOI
10.1177/20552076241277672
Chinese Library Classification (CLC)
R19 [Health care organization and services (health services administration)]
Subject Classification Code
Abstract
Objectives: A grading system deployed for continuous at-home monitoring of pin sites could increase the chances of early detection of pin-site infections and of earlier commencement of treatment. The first five grades of the Modified Gordon Pin Site Classification Scheme (MGS) meet the criteria for a visual-only, digital assessment-based grading system. The aim of this study was to assess the inter- and intra-rater reliability of the first five grades of the MGS from digital images.
Methods: We graded 1082 pin sites from 572 digital photographs of patients who underwent external fixator treatment for various conditions, using the first five grades of the MGS classification scheme. Percent agreement and kappa values were calculated to determine inter- and intra-rater agreement. For a sensitivity analysis, results were also grouped into two categories: "good" (MGS grades 0-2) and "bad" (grades 3 and 4). We also analyzed reliability based on color alone, using MGS grades 0 and 2.
Results: A total of 843 of the 1082 pin sites were scored by all raters. Inter-rater reliability was moderate, with a Fleiss kappa of 0.48 [CI 0.45, 0.51]. Reliability remained moderate when grades were grouped into "good" versus "bad" and when assessed on color alone, with Fleiss kappa values of 0.48 [CI 0.45, 0.52] and 0.45 [CI 0.42, 0.49], respectively. Intra-rater reliability showed substantial agreement, with a kappa of 0.63.
Conclusion: Scoring pin sites from digital images with the MGS demonstrated only moderate inter-rater reliability. Modifications to how digital photos are used are needed before at-home monitoring of pin sites can be implemented.
Pages: 7
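The abstract reports inter-rater agreement as Fleiss' kappa, both on the full five-grade MGS scale and after collapsing grades into "good" (0-2) versus "bad" (3-4). As a minimal sketch, the Python snippet below implements the standard Fleiss' kappa formula on an invented rating matrix (the pin-site counts, number of raters, and grade assignments are hypothetical, not the study's data) and shows the same dichotomization step.

```python
# A minimal sketch (not from the paper) of how Fleiss' kappa can be computed
# for multi-rater pin-site grading data. The rating matrix below is invented
# for illustration; the study's actual data are not reproduced here.
import numpy as np

def fleiss_kappa(counts: np.ndarray) -> float:
    """Fleiss' kappa for a (subjects x categories) matrix of rating counts.

    counts[i, j] = number of raters who assigned subject i to category j.
    Every subject must be rated by the same number of raters.
    """
    counts = np.asarray(counts, dtype=float)
    n_subjects, _ = counts.shape
    n_raters = counts.sum(axis=1)[0]                       # raters per subject
    assert np.all(counts.sum(axis=1) == n_raters), "unequal rater counts"

    p_j = counts.sum(axis=0) / (n_subjects * n_raters)     # category proportions
    P_i = (np.square(counts).sum(axis=1) - n_raters) / (n_raters * (n_raters - 1))
    P_bar = P_i.mean()                                     # observed agreement
    P_e = np.square(p_j).sum()                             # chance agreement
    return (P_bar - P_e) / (1.0 - P_e)

# Hypothetical example: 6 pin sites, 4 raters, MGS grades 0-4.
# Each row gives how many raters chose grade 0, 1, 2, 3, 4 for that pin site.
ratings = np.array([
    [4, 0, 0, 0, 0],
    [2, 2, 0, 0, 0],
    [0, 3, 1, 0, 0],
    [0, 1, 2, 1, 0],
    [0, 0, 1, 2, 1],
    [0, 0, 0, 1, 3],
])
print("kappa, full 5-grade scale:", round(fleiss_kappa(ratings), 3))

# Sensitivity analysis as described in the abstract: collapse grades 0-2 into
# "good" and grades 3-4 into "bad", then recompute kappa on the grouped counts.
good_bad = np.column_stack([ratings[:, :3].sum(axis=1), ratings[:, 3:].sum(axis=1)])
print("kappa, good vs bad grouping:", round(fleiss_kappa(good_bad), 3))
```

With this statistic, values of 0.41-0.60 are conventionally read as moderate agreement and 0.61-0.80 as substantial agreement, which matches the interpretation used in the abstract.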