A Novel Approach To Video-Based Pupil Tracking

Cited by: 26
Authors
Kumar, Nishant [1]
Kohlbecher, Stefan [2]
Schneider, Erich [2]
Affiliations
[1] Indian Inst Technol, Dept Mech Engn, Mumbai, Maharashtra, India
[2] Univ Munich Hosp, Clin Neurosci, Munich, Germany
Keywords
mean-luminance filter; fast radial symmetry detector; edge detection; Delaunay triangulation; luminance contrast filter; sub-pixel ellipse fitting; VESTIBULOOCULAR REFLEX; EYE-MOVEMENTS; SCENE
DOI
10.1109/ICSMC.2009.5345909
Chinese Library Classification (CLC)
TP3 [computing technology, computer technology]
Discipline code
0812
Abstract
EyeSeeCam is a novel head-mounted camera that is continuously oriented toward the user's point of regard by the eye-movement signals of a mobile video-based eye-tracking device. We have devised a new eye-tracking algorithm for EyeSeeCam that has low computational complexity and is robust in detecting the pupil centre. Accurate localization of the pupil centre and high processing speed are the most crucial requirements of such a real-time video-based eye-tracking system. However, occlusion of the pupil by artifacts such as eyelids, eyelashes, glints and shadows in the eye image, as well as changes in illumination conditions, poses significant problems for pupil-centre detection. Apart from robustness and accuracy, real-time eye-tracking applications also demand low computational complexity. In our algorithm, the Fast Radial Symmetry Detector gives a rough estimate of the pupil location. An edge operator produces the edge image, and unwanted artifacts are removed in a series of logical steps. Delaunay triangulation is then used to extract the pupil boundary from the edge image, based on the fact that the pupil region is convex. A luminance contrast filter is used to obtain an ellipse fit at the sub-pixel level. The ellipse-fitting function is based on a non-iterative least-squares minimization approach. The pupil boundary was detected accurately in 96% of the cases, including those in which the pupil was occluded by more than half its size. The proposed algorithm is also robust against drastic changes in the environment, such as eye tracking in a closed room versus eye tracking in sunlight.
Pages: 1255+
Number of pages: 3
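
As an illustration of the processing chain summarized in the abstract, the following Python/OpenCV sketch mirrors its main stages under simplifying assumptions; it is not the authors' implementation. The darkest image region stands in for the Fast Radial Symmetry Detector, Canny stands in for the paper's edge operator, a plain convex hull replaces the Delaunay-triangulation step, the luminance-contrast sub-pixel refinement is omitted, and cv2.fitEllipse supplies a non-iterative least-squares ellipse fit. The 60-pixel gating radius and the Canny thresholds are arbitrary choices, not values from the paper.

# Illustrative sketch of a pupil-detection pipeline of this kind (not the authors' code).
import cv2
import numpy as np

def estimate_pupil_ellipse(eye_gray):
    """Return ((cx, cy), (major, minor), angle) for the fitted pupil ellipse,
    or None if too few boundary points are found."""
    # 1) Rough pupil location: darkest region after smoothing
    #    (a crude stand-in for the Fast Radial Symmetry Detector).
    blurred = cv2.GaussianBlur(eye_gray, (9, 9), 0)
    _, _, min_loc, _ = cv2.minMaxLoc(blurred)

    # 2) Edge image (Canny as a stand-in for the paper's edge operator).
    edges = cv2.Canny(blurred, 40, 80)

    # 3) Crude artifact rejection: keep only edge points near the rough estimate.
    ys, xs = np.nonzero(edges)
    pts = np.column_stack((xs, ys)).astype(np.int32)
    dist = np.linalg.norm(pts - np.array(min_loc), axis=1)
    pts = pts[dist < 60]                     # assumed gating radius in pixels
    if len(pts) < 5:                         # cv2.fitEllipse needs >= 5 points
        return None

    # 4) Convex boundary of the remaining edge points
    #    (in place of the paper's Delaunay-triangulation step).
    hull = cv2.convexHull(pts)
    if len(hull) < 5:
        return None

    # 5) Non-iterative least-squares ellipse fit to the boundary points.
    return cv2.fitEllipse(hull)

In practice the gating radius, blur kernel, and Canny thresholds would have to be tuned to the camera's resolution and illumination; the sketch only shows how the rough estimate, the edge image, the convex boundary, and the ellipse fit feed into one another.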