Look me in the eye: evaluating the accuracy of smartphone-based eye tracking for potential application in autism spectrum disorder research

Cited by: 10
Authors
Strobl, Maximilian A. R. [1 ,2 ]
Lipsmeier, Florian [3 ]
Demenescu, Liliana R. [3 ]
Gossens, Christian [3 ]
Lindemann, Michael [3 ]
De Vos, Maarten [4 ]
Affiliations
[1] Univ Oxford, Radcliffe Observ Quarter, Math Inst, Wolfson Ctr Math Biol, Oxford OX2 6GG, England
[2] H Lee Moffitt Canc Ctr & Res Inst, Dept Integrated Math Oncol, Magnolia Dr, Tampa, FL 33612 USA
[3] F Hoffmann La Roche Ltd, Roche Innovat Ctr, pRED Informat, Roche Pharma Res & Early Dev, Basel, Switzerland
[4] Univ Oxford, Inst Biomed Engn, Dept Engn Sci, Old Rd Campus Res Bldg, Oxford OX3 7DQ, England
Funding
UK Engineering and Physical Sciences Research Council (EPSRC); UK Medical Research Council (MRC);
Keywords
Gaze tracking; Mental disorders; m-Health; Biomedical monitoring;
DOI
10.1186/s12938-019-0670-1
CLC Number
R318 [Biomedical Engineering];
Subject Classification Code
0831;
Abstract
Background: Avoidance of looking others in the eye is a characteristic symptom of Autism Spectrum Disorders (ASD), and it has been hypothesised that quantitative monitoring of gaze patterns could be useful to objectively evaluate treatments. However, tools to measure gaze behaviour on a regular basis at a manageable cost are missing. In this paper, we investigated whether a smartphone-based tool could address this problem. Specifically, we assessed the accuracy with which the phone-based, state-of-the-art eye-tracking algorithm iTracker can distinguish between gaze towards the eyes and the mouth of a face displayed on the smartphone screen. This might allow mobile, longitudinal monitoring of gaze aversion behaviour in ASD patients in the future.
Results: We simulated a smartphone application in which subjects were shown an image on the screen and their gaze was analysed using iTracker. We evaluated the accuracy of our set-up across three tasks in a cohort of 17 healthy volunteers. In the first two tasks, subjects were shown different-sized images of a face and asked to alternate their gaze focus between the eyes and the mouth. In the last task, participants were asked to trace out a circle on the screen with their eyes. We confirm that iTracker can recapitulate the true gaze patterns and capture the relative position of gaze correctly, even on a different phone system from the one it was trained on. Subject-specific bias can be corrected using an error model informed by the calibration data. We compare two calibration methods and observe that a linear model performs better than a previously proposed support vector regression-based method.
Conclusions: Under controlled conditions it is possible to reliably distinguish between gaze towards the eyes and the mouth with a smartphone-based set-up. However, future research will be required to improve the robustness of the system to the roll angle of the phone and the distance between the user and the screen, in order to allow deployment in a home setting. We conclude that a smartphone-based gaze-monitoring tool provides promising opportunities for more quantitative monitoring of ASD.
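The calibration step summarised above admits a compact illustration. The sketch below is not the authors' implementation: the calibration points, the screen-region boundaries for the eyes and mouth, and the use of scikit-learn's LinearRegression and SVR are assumptions made purely for illustration. It shows how raw per-subject gaze estimates (such as iTracker outputs) could be corrected with a linear error model fitted to calibration data, compared against an SVR-based correction, and then mapped to an eyes/mouth decision.
```python
# Illustrative sketch of per-subject calibration of raw gaze estimates
# (e.g. iTracker outputs) followed by an eyes-vs-mouth classification.
# Not the authors' code: calibration data, region boundaries and the
# choice of scikit-learn estimators are assumptions for illustration.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.svm import SVR

rng = np.random.default_rng(0)

# Hypothetical calibration data: screen coordinates (cm) of calibration
# targets and the raw gaze estimates recorded while fixating them.
true_xy = rng.uniform(low=[-3.0, -6.0], high=[3.0, 6.0], size=(30, 2))
raw_xy = 0.9 * true_xy + np.array([0.5, -0.8]) + rng.normal(0.0, 0.2, (30, 2))

def fit_linear(raw, true):
    """One linear correction per axis (the simpler model reported to
    outperform the SVR-based calibration)."""
    return [LinearRegression().fit(raw, true[:, k]) for k in range(2)]

def fit_svr(raw, true):
    """SVR-based correction, one regressor per axis (the previously
    proposed alternative)."""
    return [SVR(kernel="rbf", C=1.0).fit(raw, true[:, k]) for k in range(2)]

def apply_correction(models, raw):
    """Map raw gaze estimates to calibrated screen coordinates."""
    return np.column_stack([m.predict(raw) for m in models])

linear_models = fit_linear(raw_xy, true_xy)
svr_models = fit_svr(raw_xy, true_xy)

# Hypothetical test fixation and assumed face-region boundaries (cm):
# eyes above +1 cm, mouth below -1 cm on the vertical screen axis.
test_raw = np.array([[0.6, 2.4]])
corrected = apply_correction(linear_models, test_raw)[0]
EYES_Y_MIN, MOUTH_Y_MAX = 1.0, -1.0

if corrected[1] >= EYES_Y_MIN:
    region = "eyes"
elif corrected[1] <= MOUTH_Y_MAX:
    region = "mouth"
else:
    region = "other"
print(f"corrected gaze {corrected.round(2)} -> {region}")
```
The per-axis linear fit mirrors the comparison reported in the abstract, where a simple linear correction of subject-specific bias performed better than the more flexible support vector regression-based method.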
Pages: 12