Supporting Mitosis Detection AI Training with Inter-Observer Eye-Gaze Consistencies

Cited by: 1
Authors
Gu, Hongyan [1 ]
Yan, Zihan [2 ]
Alvi, Ayesha [1 ]
Day, Brandon [1 ]
Yang, Chunxu [1 ]
Wu, Zida [1 ]
Magaki, Shino [3 ]
Haeri, Mohammad [4 ]
Chen, Xiang 'Anthony' [1 ]
Affiliations
[1] Univ Calif Los Angeles, Elect & Comp Engn, Los Angeles, CA 90095 USA
[2] Univ Illinois, Informat Programs, Urbana, IL 61801 USA
[3] Univ Calif Los Angeles, David Geffen Sch Med, Pathol & Lab Med, Los Angeles, CA 90095 USA
[4] Univ Kansas, Med Ctr, Pathol & Lab Med, Kansas City, KS 66103 USA
Keywords
Eye-Gaze; Consistency; Convolutional Neural Network; Mitosis Detection; Pathology;
DOI
10.1109/ICHI61247.2024.00013
Chinese Library Classification (CLC)
TP18 [Theory of artificial intelligence];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The expansion of artificial intelligence (AI) in pathology tasks has intensified the demand for doctors' annotations in AI development. However, collecting high-quality annotations from doctors is costly and time-consuming, creating a bottleneck in AI progress. This study investigates eye-tracking as a cost-effective technology for collecting doctors' behavioral data for AI training, with a focus on the pathology task of mitosis detection. One major challenge in using eye-gaze data is its low signal-to-noise ratio, which hinders the extraction of meaningful information. We tackled this by leveraging inter-observer eye-gaze consistency, creating eye-gaze labels from consistent eye fixations shared by a group of observers. Our study involved 14 non-medical participants, from whom we collected eye-gaze data and generated eye-gaze labels based on varying group sizes. We assessed the efficacy of these eye-gaze labels by training Convolutional Neural Networks (CNNs) and comparing their performance to CNNs trained with ground-truth annotations and to a heuristic-based baseline. Results indicated that CNNs trained with our eye-gaze labels closely followed the performance of ground-truth-based CNNs and significantly outperformed the baseline. Although this study focuses primarily on mitosis detection, we envision that its insights can be generalized to other medical imaging tasks.
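The abstract describes deriving labels from eye fixations that are consistent across a group of observers. The sketch below is a minimal, hedged illustration of that idea, not the authors' released method: a candidate location is labeled positive if fixations from at least a minimum number of distinct observers fall within a spatial tolerance of it. The function name, the radius, and the group-size threshold are illustrative assumptions.

```python
# Minimal sketch (assumed, not the paper's implementation): consensus
# eye-gaze labels from inter-observer fixation consistency.
import numpy as np

def consensus_gaze_labels(fixations_per_observer, candidates,
                          radius_px=50.0, min_observers=3):
    """Label a candidate location positive if fixations from at least
    `min_observers` distinct observers fall within `radius_px` of it.

    fixations_per_observer: list of (N_i, 2) arrays of (x, y) fixations,
                            one array per observer.
    candidates:             (M, 2) array of candidate mitosis locations.
    Returns a boolean array of length M (the eye-gaze labels).
    """
    candidates = np.asarray(candidates, dtype=float)
    labels = np.zeros(len(candidates), dtype=bool)
    for i, cand in enumerate(candidates):
        observers_hit = 0
        for fix in fixations_per_observer:
            fix = np.asarray(fix, dtype=float)
            if fix.size == 0:
                continue
            # Does this observer have any fixation near the candidate?
            dists = np.linalg.norm(fix - cand, axis=1)
            if np.any(dists <= radius_px):
                observers_hit += 1
        labels[i] = observers_hit >= min_observers
    return labels

# Example: three observers, two candidate locations.
obs = [np.array([[100, 100], [400, 380]]),
       np.array([[105, 95]]),
       np.array([[98, 110], [800, 800]])]
cands = np.array([[100, 100], [400, 400]])
print(consensus_gaze_labels(obs, cands))  # -> [ True False]
```

Under this sketch, varying `min_observers` corresponds to the "varying group sizes" mentioned in the abstract; the resulting labels would then stand in for ground-truth annotations when training a CNN.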
Pages: 40-45
Number of pages: 6