Inter-rater Agreement Between Exposure Assessment Using Automatic Algorithms and Using Experts

Citations: 7
Authors
Florath, Ines [1 ]
Glass, Deborah C. [2 ]
Rhazi, Mounia Senhaji [3 ]
Parent, Marie-Elise [3 ]
Fritschi, Lin [1 ]
Affiliations
[1] Curtin Univ, Sch Publ Hlth, Kent St, Perth, WA, Australia
[2] Monash Univ, Dept Epidemiol & Prevent Med, 553 St Kilda Rd, Melbourne, Vic, Australia
[3] Univ Quebec, INRS Inst Armand Frappier, 531 Blvd Prairies, Laval, PQ, Canada
Funding
Australian National Health and Medical Research Council;
Keywords
asthma; epidemiology methodology; expert judgement; exposure assessment; inter-rater reliability; DIESEL EXHAUST EXPOSURE; ADULT-ONSET ASTHMA; OCCUPATIONAL-EXPOSURE; LUNG-CANCER; RETROSPECTIVE ASSESSMENT; CARCINOGENS; ASBESTOS; RELIABILITY; PREVALENCE; KAPPA;
DOI
10.1093/annweh/wxy084
Chinese Library Classification (CLC)
R1 [Preventive Medicine, Hygiene];
Discipline classification codes
1004; 120402;
Abstract
Objectives: To estimate the inter-rater agreement between assessments of exposure to asthmagens in current jobs made by algorithms based on task-based questionnaires (OccIDEAS) and by experts.
Methods: Participants in a cross-sectional national survey of exposure to asthmagens (AWES-Asthma) were randomly split into two subcohorts of equal size. Subcohort 1 was used to determine the most common asthmagen groups and occupational groups. From subcohort 2, a random sample of 200 participants was drawn, and current occupational exposure (yes/no) was assessed in these participants by OccIDEAS and by two experts, first independently and then as a consensus. Inter-rater agreement was estimated using Cohen's kappa coefficient. The null hypothesis was set at kappa = 0.4, because both the experts and the automatic algorithm assessed exposure from the same task-based questionnaires, so agreement better than chance was expected.
Results: The kappa coefficients for agreement between the expert and the algorithm-based assessments ranged from 0.37 to 1, while agreement between the two experts ranged from 0.29 to 0.94, depending on the agent assessed. After discussion between the two experts, the kappa coefficients for the consensus decision versus OccIDEAS were significantly larger than 0.4 for 7 of the 10 asthmagen groups, and overall the inter-rater agreement was greater than chance (P < 0.0001).
Conclusions: The web-based application OccIDEAS is an appropriate tool for automated assessment of current exposure to asthmagens (yes/no), and it requires less time-consuming work by highly qualified research personnel than the traditional expert-based method. Furthermore, it can learn and reuse expert determinations in future studies.
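As a rough illustration of the agreement statistic described in the abstract, the sketch below computes Cohen's kappa for two raters giving binary (exposed / not exposed) calls and runs a one-sided z-test against the null value of 0.4. The simulated data, the cohens_kappa helper, and the simplified large-sample standard-error formula are assumptions made for illustration only; they are not the authors' analysis code.

```python
import numpy as np

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters giving binary (0/1) exposure calls,
    with an approximate large-sample standard error (assumed simplified formula)."""
    a = np.asarray(rater_a)
    b = np.asarray(rater_b)
    n = len(a)
    p_o = np.mean(a == b)                               # observed agreement
    p_e = (np.mean(a) * np.mean(b)                      # chance agreement on "exposed"
           + (1 - np.mean(a)) * (1 - np.mean(b)))       # chance agreement on "not exposed"
    kappa = (p_o - p_e) / (1 - p_e)
    se = np.sqrt(p_o * (1 - p_o)) / ((1 - p_e) * np.sqrt(n))
    return kappa, se

# Hypothetical data: algorithm vs. expert consensus calls for 200 participants
rng = np.random.default_rng(0)
expert = rng.integers(0, 2, size=200)
algorithm = np.where(rng.random(200) < 0.85, expert, 1 - expert)  # ~85% raw agreement

kappa, se = cohens_kappa(algorithm, expert)
z = (kappa - 0.4) / se   # one-sided test of H0: kappa <= 0.4
print(f"kappa = {kappa:.2f}, z = {z:.2f}")
```

In the study itself, this kind of comparison would be repeated per asthmagen group against each expert and against the expert consensus; the code above only shows the form of the calculation for a single pair of raters.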
Pages: 45-53
Number of pages: 9