Inter-rater Agreement Between Exposure Assessment Using Automatic Algorithms and Using Experts

Cited: 7
Authors
Florath, Ines [1 ]
Glass, Deborah C. [2 ]
Rhazi, Mounia Senhaji [3 ]
Parent, Marie-Elise [3 ]
Fritschi, Lin [1 ]
Affiliations
[1] Curtin Univ, Sch Publ Hlth, Kent St, Perth, WA, Australia
[2] Monash Univ, Dept Epidemiol & Prevent Med, 553 St Kilda Rd, Melbourne, Vic, Australia
[3] Univ Quebec, INRS Inst Armand Frappier, 531 Blvd Prairies, Laval, PQ, Canada
Funding
National Health and Medical Research Council of Australia;
Keywords
asthma; epidemiology methodology; expert judgement; exposure assessment; inter-rater reliability; DIESEL EXHAUST EXPOSURE; ADULT-ONSET ASTHMA; OCCUPATIONAL-EXPOSURE; LUNG-CANCER; RETROSPECTIVE ASSESSMENT; CARCINOGENS; ASBESTOS; RELIABILITY; PREVALENCE; KAPPA;
DOI
10.1093/annweh/wxy084
Chinese Library Classification
R1 [Preventive medicine, hygiene];
Discipline Code
1004; 120402;
Abstract
Objectives: To estimate the inter-rater agreement between exposure assessment to asthmagens in current jobs by algorithms based on task-based questionnaires (OccIDEAS) and by experts.
Methods: Participants in a cross-sectional national survey of exposure to asthmagens (AWES-Asthma) were randomly split into two subcohorts of equal size. Subcohort 1 was used to determine the most common asthmagen groups and occupational groups. From subcohort 2, a random sample of 200 participants was drawn, and current occupational exposure (yes/no) was assessed in these by OccIDEAS and by two experts, first independently and then as a consensus. Inter-rater agreement was estimated using Cohen's Kappa coefficient. The null hypothesis value of Kappa was set at 0.4, because both the experts and the automatic algorithm assessed exposure using the same task-based questionnaires, and therefore better-than-chance agreement would be expected.
Results: The Kappa coefficients for agreement between the expert and algorithm-based assessments ranged from 0.37 to 1, while agreement between the two experts ranged from 0.29 to 0.94, depending on the agent being assessed. After discussion between the two experts, the Kappa coefficients for the consensus decision versus OccIDEAS were significantly larger than 0.4 for 7 of the 10 asthmagen groups, and overall the inter-rater agreement was greater than expected by chance (P < 0.0001).
Conclusions: The web-based application OccIDEAS is an appropriate tool for automated assessment of current exposure to asthmagens (yes/no), and it requires less time-consuming work by highly qualified research personnel than the traditional expert-based method. Furthermore, it can learn and reuse expert determinations in future studies.
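The agreement statistic described in the abstract is Cohen's Kappa, tested against a null value of 0.4 rather than 0. As a minimal illustration (not taken from the paper), the sketch below computes Kappa for two binary raters and an approximate large-sample z-test of H0: kappa = 0.4; the rater data, function names, and the simple standard-error approximation are assumptions for illustration only.

```python
# Minimal sketch (hypothetical data): Cohen's kappa for two binary raters and an
# approximate one-sided z-test against a null value kappa0 = 0.4.
import math
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Return (kappa, observed agreement p_o, chance agreement p_e)."""
    n = len(rater_a)
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    categories = set(freq_a) | set(freq_b)
    p_e = sum((freq_a[c] / n) * (freq_b[c] / n) for c in categories)
    kappa = (p_o - p_e) / (1 - p_e)
    return kappa, p_o, p_e

def z_test_vs_null(rater_a, rater_b, kappa0=0.4):
    """Approximate z-statistic and one-sided p-value for H0: kappa <= kappa0."""
    kappa, p_o, p_e = cohens_kappa(rater_a, rater_b)
    n = len(rater_a)
    # Simple large-sample standard-error approximation for kappa.
    se = math.sqrt(p_o * (1 - p_o) / (n * (1 - p_e) ** 2))
    z = (kappa - kappa0) / se
    p_one_sided = 0.5 * math.erfc(z / math.sqrt(2))
    return kappa, z, p_one_sided

# Hypothetical exposure calls (1 = exposed, 0 = unexposed) for 10 workers.
algorithm = [1, 0, 1, 1, 0, 0, 1, 0, 1, 0]
expert    = [1, 0, 1, 0, 0, 0, 1, 0, 1, 1]
print(z_test_vs_null(algorithm, expert))
```

In practice the exposure calls would come from the OccIDEAS output and the expert ratings for each asthmagen group; the test against 0.4 reflects the authors' point that better-than-chance agreement is expected when both raters use the same task-based questionnaires.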
Pages: 45-53
Number of pages: 9