Tracking the Progression of Reading Through Eye-gaze Measurements

Cited by: 2
Authors
Bottos, Stephen [1 ]
Balasingam, Balakumar [1 ]
Affiliation
[1] Univ Windsor, Dept Elect & Comp Engn, 401 Sunset Ave, Windsor, ON N9B 3P4, Canada
Funding
Natural Sciences and Engineering Research Council of Canada (NSERC);
Keywords
autonomous systems; human-machine automation; human factors; eye-gaze points; hidden Markov models; least squares estimation; Kalman filter; movements;
DOI
10.23919/fusion43075.2019.9011436
CLC number
TP [Automation Technology, Computer Technology];
Discipline code
0812;
Abstract
In this paper we consider the problem of tracking the progression of reading through eye-gaze measurements. Such an algorithm is novel and will ultimately help to develop a method of analyzing eye-gaze data collected during reading in order to uncover crucial information about an individual's interest level and quality of experience while reading a passage of text or a book. Additionally, such an approach can serve as a "visual signature": a means of verifying whether an individual has indeed given adequate attention to critical text-based information. Further, an accurate "reading-progression tracker" has potential applications in educational institutions, e-readers and parenting solutions. Tracking the progression of reading remains a challenging problem because eye-gaze movements are highly noisy and the eye-gaze is easily distracted within a limited space, such as an e-book. In a prior work, we proposed an approach for analyzing eye-gaze fixation points collected while reading a page of text in order to assign each measurement to a line of text; that approach did not consider tracking the progression of reading along the line of text. In this paper, we extend the capabilities of the previous algorithm to accurately track the progression of reading along each line. The proposed approach employs least squares batch estimation to estimate three states of the horizontal saccade: position, velocity and acceleration. First, the proposed approach is objectively evaluated on a simulated eye-gaze dataset. Then, the proposed algorithm is demonstrated on real data collected by a Gazepoint eye-tracker while the subject reads several pages from an electronic book.
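The batch least-squares estimation described in the abstract can be sketched as follows. This is a minimal illustration, assuming a constant-acceleration (quadratic) model of the horizontal gaze coordinate over a batch of noisy samples; the function name, window setup, and simulated data are illustrative assumptions, not the authors' exact implementation:

```python
import numpy as np


def batch_estimate_states(t, x):
    """Least-squares batch estimate of position, velocity and acceleration
    from noisy 1-D horizontal gaze samples, assuming constant acceleration
    over the batch: x(t) = p0 + v0*t + 0.5*a*t^2.
    Returns (position, velocity, acceleration) at the final sample time."""
    t = np.asarray(t, dtype=float)
    x = np.asarray(x, dtype=float)
    # Design matrix whose columns correspond to p0, v0, and a
    H = np.column_stack([np.ones_like(t), t, 0.5 * t**2])
    theta, *_ = np.linalg.lstsq(H, x, rcond=None)
    p0, v0, a = theta
    tf = t[-1]
    return p0 + v0 * tf + 0.5 * a * tf**2, v0 + a * tf, a


# Simulated saccade-like horizontal sweep with measurement noise
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 50)
true_x = 10.0 + 200.0 * t + 0.5 * 30.0 * t**2   # p0=10, v0=200, a=30
noisy_x = true_x + rng.normal(0.0, 2.0, t.size)
pos, vel, acc = batch_estimate_states(t, noisy_x)
```

Here the three recovered states play the role of the horizontal-saccade position, velocity, and acceleration estimates mentioned in the abstract; in practice the fit would be run over sliding batches of gaze fixations rather than a single simulated sweep.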
Pages: 8