Inter-rater Agreement Between Exposure Assessment Using Automatic Algorithms and Using Experts

Cited by: 7
Authors
Florath, Ines [1]
Glass, Deborah C. [2]
Rhazi, Mounia Senhaji [3]
Parent, Marie-Elise [3]
Fritschi, Lin [1]
Affiliations
[1] Curtin Univ, Sch Publ Hlth, Kent St, Perth, WA, Australia
[2] Monash Univ, Dept Epidemiol & Prevent Med, 553 St Kilda Rd, Melbourne, Vic, Australia
[3] Univ Quebec, INRS Inst Armand Frappier, 531 Blvd Prairies, Laval, PQ, Canada
Funding
National Health and Medical Research Council, Australia;
Keywords
asthma; epidemiology methodology; expert judgement; exposure assessment; inter-rater reliability; DIESEL EXHAUST EXPOSURE; ADULT-ONSET ASTHMA; OCCUPATIONAL-EXPOSURE; LUNG-CANCER; RETROSPECTIVE ASSESSMENT; CARCINOGENS; ASBESTOS; RELIABILITY; PREVALENCE; KAPPA;
DOI
10.1093/annweh/wxy084
Chinese Library Classification (CLC)
R1 [Preventive Medicine, Hygiene];
Subject classification codes
1004; 120402;
Abstract
Objectives: To estimate the inter-rater agreement between assessments of exposure to asthmagens in current jobs made by algorithms based on task-based questionnaires (OccIDEAS) and by experts.
Methods: Participants in a cross-sectional national survey of exposure to asthmagens (AWES-Asthma) were randomly split into two subcohorts of equal size. Subcohort 1 was used to determine the most common asthmagen groups and occupational groups. From subcohort 2, a random sample of 200 participants was drawn, and current occupational exposure (yes/no) was assessed for these participants by OccIDEAS and by two experts, first independently and then by consensus. Inter-rater agreement was estimated using Cohen's Kappa coefficient. The null hypothesis was set at a Kappa of 0.4 because both the experts and the automatic algorithm assessed exposure from the same task-based questionnaires, so agreement better than chance was expected.
Results: The Kappa coefficients for agreement between the experts' and the algorithm-based assessments ranged from 0.37 to 1, while agreement between the two experts ranged from 0.29 to 0.94, depending on the agent assessed. After discussion between the two experts, the Kappa coefficients for the consensus decision versus OccIDEAS were significantly larger than 0.4 for 7 of the 10 asthmagen groups, and overall the inter-rater agreement was greater than expected by chance (P < 0.0001).
Conclusions: The web-based application OccIDEAS is an appropriate tool for automated assessment of current exposure to asthmagens (yes/no) and requires less time from highly qualified research personnel than the traditional expert-based method. Further, it can learn and reuse expert determinations in future studies.
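As an illustration of the agreement statistic described above, the short Python sketch below computes Cohen's Kappa for two raters making binary (exposed yes/no) calls on the same workers and applies an approximate one-sided z-test against the null value of 0.4 used in the study. The function name, the 2x2 counts, and the simple large-sample standard error formula are illustrative assumptions, not the study's data or code.

# Minimal sketch (not the study's code): Cohen's Kappa for two raters
# making binary exposure calls (yes/no), with an approximate one-sided
# z-test against the null value kappa0 = 0.4 used in the paper.
# The 2x2 counts below are hypothetical, not data from AWES-Asthma.
import math

def cohen_kappa_2x2(a, b, c, d):
    # a = both raters say 'exposed', d = both say 'not exposed',
    # b and c = the two kinds of disagreement.
    n = a + b + c + d
    po = (a + d) / n                           # observed agreement
    p1, p2 = (a + b) / n, (a + c) / n          # marginal 'yes' rates
    pe = p1 * p2 + (1 - p1) * (1 - p2)         # agreement expected by chance
    kappa = (po - pe) / (1 - pe)
    # Simple large-sample standard error; the exact Fleiss-Cohen-Everitt
    # variance is more involved, so treat this as an approximation.
    se = math.sqrt(po * (1 - po) / (n * (1 - pe) ** 2))
    return kappa, se

kappa, se = cohen_kappa_2x2(a=60, b=15, c=10, d=115)  # hypothetical n = 200
z = (kappa - 0.4) / se                                # test H0: kappa <= 0.4
p = 0.5 * math.erfc(z / math.sqrt(2))                 # one-sided p-value
print(f"kappa = {kappa:.2f}, z = {z:.2f}, one-sided P = {p:.4g}")

A positive z with a small one-sided P corresponds to a Kappa significantly larger than the 0.4 benchmark, which is the criterion applied to each asthmagen group in the Results.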
Pages: 45-53
Number of pages: 9