Feasibility of Longitudinal Eye-Gaze Tracking in the Workplace

Cited by: 3
Authors
Hutt S. [1 ]
Stewart A.E.B. [2 ]
Gregg J. [3 ]
Mattingly S. [4 ]
D'Mello S.K. [3]
Affiliations
[1] University of Pennsylvania, Philadelphia, PA
[2] Carnegie Mellon University, Pittsburgh, PA
[3] University of Colorado at Boulder, Boulder, CO
[4] University of Notre Dame, Notre Dame, IN
Funding
U.S. National Science Foundation
Keywords
eye gaze; longitudinal data collection; workplace;
DOI
10.1145/3530889
Abstract
Eye movements provide a window into cognitive processes, but much of the research harnessing this data has been confined to the laboratory. We address whether eye gaze can be passively, reliably, and privately recorded in real-world environments across extended timeframes using commercial off-the-shelf (COTS) sensors. We recorded eye-gaze data from a COTS tracker embedded in participants' (N = 20) work environments at pseudorandom intervals across a two-week period. We found that valid samples were recorded approximately 30% of the time despite calibrating the eye tracker only once and without placing any other restrictions on participants. The number of valid samples decreased over days, with the degree of decrease dependent on contextual variables (e.g., frequency of video conferencing) and individual-difference attributes (e.g., sleep quality and multitasking ability). Participants reported that the sensors did not change or impact their work. Our findings suggest the potential for eye-gaze data collection in authentic environments. © 2022 ACM.
Related papers
Showing items 41-50 of 50
  • [41] Cross-talk elimination for lenslet array near eye display based on eye-gaze tracking
    Ye, Bi
    Fujimoto, Yuichiro
    Uchimine, Yuta
    Sawabe, Taishi
    Kanbara, Masayuki
    Kato, Hirokazu
    [J]. OPTICS EXPRESS, 2022, 30 (10) : 16196 - 16216
  • [42] Feasibility of an eye-gaze technology intervention for students with severe motor and communication difficulties in Taiwan
    Hsieh, Yu-Hsin
    Granlund, Mats
    Hwang, Ai-Wen
    Hemmingsson, Helena
    [J]. AUGMENTATIVE AND ALTERNATIVE COMMUNICATION, 2024, 40 (03) : 196 - 207
  • [43] Eye-Gaze Tracking Based on Head Orientation Estimation Using FMCW Radar Sensor
    Jung, Jaehoon
    Kim, Jihye
    Kim, Seong-Cheol
    Lim, Sohee
    [J]. IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, 2024, 73
  • [44] Improving the Accuracy and Reliability of Remote System-Calibration-Free Eye-Gaze Tracking
    Hennessey, Craig A.
    Lawrence, Peter D.
    [J]. IEEE TRANSACTIONS ON BIOMEDICAL ENGINEERING, 2009, 56 (07) : 1891 - 1900
  • [45] Real-time motorized electrical hospital bed control with eye-gaze tracking
    Aydin Atasoy, Nesrin
    Cavusoglu, Abdullah
    Atasoy, Ferhat
    [J]. TURKISH JOURNAL OF ELECTRICAL ENGINEERING AND COMPUTER SCIENCES, 2016, 24 (06) : 5162+
  • [46] Eye-gaze Tracking Method Driven by Raspberry PI Applicable in Automotive Traffic Safety
    Stan, Ovidiu
    Miclea, Liviu
    Centea, Ana
    [J]. 2014 2ND INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE, MODELLING AND SIMULATION, 2014, : 126 - 130
  • [47] Looking Beneath the Surface: The Science and Applications of Eye-Gaze Tracking for Assessing Visual Attention
    Cristina, Stefania
    [J]. PROCEEDINGS OF THE 2023 ACM SYMPOSIUM ON DOCUMENT ENGINEERING, DOCENG 2023
  • [48] Tracking the Progression of Reading Using Eye-Gaze Point Measurements and Hidden Markov Models
    Bottos, Stephen
    Balasingam, Balakumar
    [J]. IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, 2020, 69 (10) : 7857 - 7868
  • [49] Eye-gaze orienting to auditory and tactile targets
    Soto-Faraco, S
    Kingstone, A
    [J]. JOURNAL OF PSYCHOPHYSIOLOGY, 2005, 19 (01) : 61 - 61
  • [50] A System for Web Browsing by Eye-Gaze Input
    Abe, Kiyohiko
    Owada, Kosuke
    Ohi, Shoichi
    Ohyama, Minoru
    [J]. ELECTRONICS AND COMMUNICATIONS IN JAPAN, 2008, 91 (05) : 11 - 18