Detecting Task Difficulty of Learners in Colonoscopy: Evidence from Eye-Tracking

Times Cited: 3
Authors
Liu, Xin [1 ,2 ]
Zheng, Bin [2 ]
Duan, Xiaoqin [2 ,3 ]
He, Wenjing [4 ]
Li, Yuandong [5 ]
Zhao, Jinyu [2 ]
Zhao, Chen [1 ,6 ]
Wang, Lin [2 ]
Affiliations
[1] Univ Sci & Technol Beijing, Sch Comp & Commun Engn, Beijing, Peoples R China
[2] Univ Alberta, Dept Surg, Surg Simulat Res Lab, Edmonton, AB, Canada
[3] Jilin Univ, Hosp 2, Dept Rehabil Med, Changchun, Jilin, Peoples R China
[4] Univ Manitoba, Dept Surg, Winnipeg, MB, Canada
[5] Shanxi Bethune Hosp, Dept Surg, Taiyuan, Shanxi, Peoples R China
[6] Beijing Key Lab Knowledge Engn Mat Sci, Beijing, Peoples R China
Source
JOURNAL OF EYE MOVEMENT RESEARCH | 2021, Vol. 14, No. 2
Funding
National Natural Science Foundation of China; Natural Sciences and Engineering Research Council of Canada
Keywords
colonoscopy; simulation; eye-tracking; navigation; Deep Convolutional Generative Adversarial Networks (DCGANs); Long Short-Term Memory (LSTM); VALIDATION; NETWORKS;
DOI
10.16910/jemr.14.2.5
Chinese Library Classification
R77 [Ophthalmology]
Discipline Code
100212
Abstract
Eye-tracking can help decode the intricate control mechanisms in human performance. In healthcare, physicians-in-training require extensive practice to improve their healthcare skills. When trainees encounter difficulty during practice, they need feedback from experts to improve their performance; such personal feedback is time-consuming and subject to bias. In this study, we tracked the eye movements of trainees during simulated colonoscopic performance. We examined changes in eye movement behavior during moments of navigation loss (MNL), a signature sign of task difficulty during colonoscopy, and tested whether deep learning algorithms can detect MNLs from eye-tracking data. Human eye-gaze and pupil characteristics were learned and verified by deep convolutional generative adversarial networks (DCGANs); the generated data were fed to Long Short-Term Memory (LSTM) networks with three different data-feeding strategies to classify MNLs within the entire colonoscopic procedure. Outputs from deep learning were compared to an expert's judgment of the MNLs based on the colonoscopic videos. The best classification outcome was achieved when human eye data were fed together with 1,000 synthesized eye-data samples, optimizing accuracy (91.80%), sensitivity (90.91%), and specificity (94.12%). This study builds an important foundation for our work on developing an education system for training healthcare skills using simulation.
Pages: 13
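To make the pipeline described in the abstract concrete, below is a minimal sketch (not the authors' published code) of the classification stage: an LSTM that labels fixed-length windows of eye features as MNL or normal navigation, trained on real windows mixed with DCGAN-synthesized ones, then scored by accuracy, sensitivity, and specificity. The feature set (gaze x, gaze y, pupil diameter), the window length, the network sizes, and the random tensors standing in for both recorded and DCGAN-generated data are all illustrative assumptions.

```python
# Hypothetical sketch of MNL classification from eye-tracking windows (PyTorch).
import torch
import torch.nn as nn

class MNLClassifier(nn.Module):
    """LSTM over an eye-feature sequence, followed by a binary head."""
    def __init__(self, n_features=3, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 2)  # logits: [normal navigation, MNL]

    def forward(self, x):             # x: (batch, time, n_features)
        _, (h_n, _) = self.lstm(x)    # h_n: (1, batch, hidden)
        return self.head(h_n[-1])     # (batch, 2)

def metrics(pred, target):
    """Accuracy, sensitivity (TPR), and specificity (TNR), MNL as positive."""
    tp = ((pred == 1) & (target == 1)).sum().item()
    tn = ((pred == 0) & (target == 0)).sum().item()
    fp = ((pred == 1) & (target == 0)).sum().item()
    fn = ((pred == 0) & (target == 1)).sum().item()
    acc = (tp + tn) / max(tp + tn + fp + fn, 1)
    sens = tp / max(tp + fn, 1)
    spec = tn / max(tn + fp, 1)
    return acc, sens, spec

if __name__ == "__main__":
    torch.manual_seed(0)
    # Random stand-ins: real data would come from the eye tracker, and
    # "synthetic" from a DCGAN trained on the human gaze/pupil windows.
    real = torch.randn(64, 120, 3)         # 64 windows, 120 samples, 3 features
    synthetic = torch.randn(1000, 120, 3)  # stand-in for 1,000 DCGAN samples
    x = torch.cat([real, synthetic])
    y = torch.randint(0, 2, (len(x),))     # stand-in MNL labels

    model = MNLClassifier()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()
    for epoch in range(5):
        opt.zero_grad()
        logits = model(x)
        loss = loss_fn(logits, y)
        loss.backward()
        opt.step()

    acc, sens, spec = metrics(logits.argmax(dim=1), y)
    print(f"accuracy={acc:.3f} sensitivity={sens:.3f} specificity={spec:.3f}")
```

The mixing ratio above mirrors the paper's best-performing condition (human data plus 1,000 synthesized samples), but the tensors only mimic the shapes of such a mixture; the reported 91.80% accuracy, 90.91% sensitivity, and 94.12% specificity come from the authors' real and generated data, not from this toy.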