Intra- and Interobserver Reliability of the Eaton Classification for Trapeziometacarpal Arthritis: A Systematic Review

Cited by: 43
Authors
Berger, Aaron J. [1 ]
Momeni, Arash [1 ]
Ladd, Amy L. [2 ]
Affiliations
[1] Stanford Univ, Med Ctr, Div Plast & Reconstruct Surg, Palo Alto, CA 94304 USA
[2] Stanford Univ, Med Ctr, Dept Orthopaed Surg, Robert A Chase Hand & Upper Limb Ctr, Palo Alto, CA 94304 USA
Keywords
THUMB CARPOMETACARPAL JOINT; OSTEOARTHRITIS; PREVALENCE; WOMEN;
DOI
10.1007/s11999-013-3208-z
Chinese Library Classification (CLC)
R826.8 [Plastic Surgery]; R782.2 [Oral and Maxillofacial Plastic Surgery]; R726.2 [Pediatric Plastic Surgery]; R62 [Plastic Surgery (Reparative Surgery)];
Subject Classification
Abstract
Trapeziometacarpal, or thumb carpometacarpal (CMC), arthritis is a common problem with a variety of treatment options. Although widely used, the Eaton radiographic staging system for CMC arthritis is of questionable clinical utility, as disease severity does not predictably correlate with symptoms or treatment recommendations. A possible reason for this is that the classification itself may not be reliable, but the literature on this has not, to our knowledge, been systematically reviewed. We therefore performed a systematic review to determine the intra- and interobserver reliability of the Eaton staging system.

We systematically reviewed English-language studies published between 1973 and 2013 to assess the degree of intra- and interobserver reliability of the Eaton classification for determining the stage of trapeziometacarpal joint arthritis and pantrapezial arthritis based on plain radiographic imaging. Search engines included PubMed, Scopus®, and CINAHL. Four studies, which included a total of 163 patients, met our inclusion criteria and were evaluated. The level of evidence of the included studies was determined using the Oxford Centre for Evidence-Based Medicine Levels of Evidence Classification by two independent observers.

A limited number of studies have been performed to assess intra- and interobserver reliability of the Eaton classification system. The four included studies were rated Level 3b. Collectively, they indicate that the Eaton classification demonstrates poor to fair interobserver reliability (kappa values: 0.11-0.56) and fair to moderate intraobserver reliability (kappa values: 0.54-0.657). Review of the literature demonstrates that radiographs assist in the assessment of CMC joint disease, but there is no reliable system for classifying disease severity.
Currently, diagnosis and treatment of thumb CMC arthritis are based on the surgeon's qualitative assessment combining history, physical examination, and radiographic evaluation. Inconsistent agreement using the current common radiographic classification system suggests a need for better radiographic tools to quantify disease severity.
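The reliability figures quoted above are kappa statistics, which measure agreement between observers corrected for the agreement expected by chance. As a minimal illustrative sketch (not taken from the reviewed studies), Cohen's kappa for two observers assigning Eaton stages can be computed as follows; the stage assignments below are hypothetical:

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Cohen's kappa: chance-corrected agreement between two raters
    over the same set of cases."""
    assert len(rater1) == len(rater2) and rater1
    n = len(rater1)
    # Observed agreement: fraction of cases rated identically.
    p_o = sum(a == b for a, b in zip(rater1, rater2)) / n
    # Expected agreement under independence, from each rater's marginals.
    c1, c2 = Counter(rater1), Counter(rater2)
    p_e = sum(c1[k] * c2.get(k, 0) for k in c1) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical Eaton stages (1-4) assigned by two observers to five radiographs.
stages_obs1 = [1, 1, 2, 2, 3]
stages_obs2 = [1, 2, 2, 2, 3]
print(cohens_kappa(stages_obs1, stages_obs2))  # 0.6875
```

A kappa of 0 means agreement is no better than chance and 1 means perfect agreement; the interobserver values reported in the review (0.11-0.56) sit well below the level usually considered acceptable for clinical decision making.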
Pages: 1155-1159
Page count: 5