3D Gaze Estimation for Head-Mounted Eye Tracking System With Auto-Calibration Method

Cited: 20
Authors
Liu, Meng [1]
Li, Youfu [1]
Liu, Hai [1,2]
Affiliations
[1] City Univ Hong Kong, Dept Mech Engn, Hong Kong, Peoples R China
[2] Cent China Normal Univ, Natl Engn Res Ctr E Learning, Wuhan 430079, Peoples R China
Source
IEEE ACCESS | 2020 / Vol. 8
Funding
National Natural Science Foundation of China
Keywords
Head-mounted gaze tracking system; saliency maps; auto-calibration; 3D gaze estimation; CALIBRATION;
DOI
10.1109/ACCESS.2020.2999633
Chinese Library Classification (CLC) Number
TP [Automation technology, computer technology]
Subject Classification Code
0812
Abstract
The general challenges of 3D gaze estimation for head-mounted eye tracking systems are the inflexible marker-based calibration procedure and significant errors in depth estimation. In this paper, we propose a 3D gaze estimation approach with an auto-calibration method. To acquire an accurate 3D structure of the environment, an RGBD camera is used as the scene camera of our system. Saliency maps are computed from the scene images with a saliency detection method, and the 3D salient pixels in the scene are treated as potential 3D calibration targets. A 3D eye model is built from the eye images to determine gaze vectors. By combining the 3D salient pixels with the gaze vectors, auto-calibration is achieved with our calibration method. Finally, the 3D gaze point is obtained from the calibrated gaze vectors and the point cloud generated by the RGBD camera. Experimental results show that the proposed system achieves an average accuracy of 3.7 degrees in the range of 1 m to 4 m indoors and 4.0 degrees outdoors. The proposed system also shows a substantial improvement in depth measurement, which is sufficient for tracking users' visual attention in real scenes.
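The pipeline in the abstract can be illustrated with a short sketch: salient pixels are back-projected into 3D using the RGBD depth, paired with gaze vectors from the 3D eye model, and a rotation aligning the eye frame to the scene frame is fitted from the pairs. Everything below is an illustrative assumption for exposition only; the function names, the pinhole back-projection, and the Kabsch-style rotation fit are stand-ins, not the authors' implementation or their actual calibration procedure.

import numpy as np

def backproject(u, v, depth, K):
    # Back-project pixel (u, v) at the given depth (metres) into the
    # scene-camera frame using the 3x3 intrinsic matrix K (pinhole model).
    fx, fy, cx, cy = K[0, 0], K[1, 1], K[0, 2], K[1, 2]
    return np.array([(u - cx) * depth / fx, (v - cy) * depth / fy, depth])

def fit_eye_to_scene_rotation(gaze_dirs_eye, salient_points_scene):
    # Estimate a rotation R such that R @ g_eye points toward the paired 3D
    # salient target (a simple Kabsch fit on unit directions; a stand-in for
    # the paper's auto-calibration). Inputs are (N, 3) arrays of paired samples.
    g = gaze_dirs_eye / np.linalg.norm(gaze_dirs_eye, axis=1, keepdims=True)
    t = salient_points_scene / np.linalg.norm(salient_points_scene, axis=1, keepdims=True)
    U, _, Vt = np.linalg.svd(g.T @ t)            # cross-covariance of paired directions
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against a reflection
    return Vt.T @ np.diag([1.0, 1.0, d]) @ U.T   # proper rotation

# Toy usage: three salient pixels back-projected with hypothetical intrinsics
# and depths, and synthetic gaze directions generated from a known 10-degree
# rotation; the fit should recover that rotation.
K = np.array([[600.0, 0.0, 320.0],
              [0.0, 600.0, 240.0],
              [0.0, 0.0, 1.0]])
targets = np.stack([backproject(u, v, z, K)
                    for u, v, z in [(100, 80, 1.5), (400, 300, 2.0), (250, 200, 3.2)]])
theta = np.deg2rad(10.0)
true_R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
unit_targets = targets / np.linalg.norm(targets, axis=1, keepdims=True)
gaze_dirs = (true_R.T @ unit_targets.T).T        # eye-frame gaze directions
R = fit_eye_to_scene_rotation(gaze_dirs, targets)
calibrated = (R @ gaze_dirs.T).T                 # gaze directions in the scene frame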
Pages: 104207-104215
Page count: 9
Related Papers
50 records
  • [1] 3D Auto-Calibration Method for Head-Mounted Binocular Gaze Tracker as Human-Robot Interface
    Kwon, Su Hyun
    Kim, Min Young
    [J]. PROCEEDINGS OF THE 8TH ACM/IEEE INTERNATIONAL CONFERENCE ON HUMAN-ROBOT INTERACTION (HRI 2013), 2013, : 171 - 172
  • [2] 3D gaze estimation in the scene volume with a head-mounted eye tracker
    Elmadjian, Carlos
    Shukla, Pushkar
    Tula, Antonio Diaz
    Morimoto, Carlos H.
    [J]. COMMUNICATION BY GAZE INTERACTION (COGAIN 2018), 2018,
  • [3] High-Accuracy 3D Gaze Estimation with Efficient Recalibration for Head-Mounted Gaze Tracking Systems
    Xia, Yang
    Liang, Jiejunyi
    Li, Quanlin
    Xin, Peiyang
    Zhang, Ning
    [J]. SENSORS, 2022, 22 (12)
  • [4] A Monocular Reflection-Free Head-Mounted 3D Eye Tracking System
    Cao, Shihao
    Zhao, Xinbo
    Qin, Beibei
    Li, Junjie
    Xiang, Zheng
    [J]. IMAGE AND GRAPHICS (ICIG 2021), PT III, 2021, 12890 : 659 - 672
  • [5] TWO-PHASE APPROACH - CALIBRATION AND IRIS CONTOUR ESTIMATION - FOR GAZE TRACKING OF HEAD-MOUNTED EYE CAMERA
    Li, Jianfeng
    Li, Shigang
    [J]. 2016 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP), 2016, : 3136 - 3140
  • [6] 3D Gaze Estimation for Head-Mounted Devices based on Visual Saliency
    Liu, Meng
    Li, You Fu
    Liu, Hai
    [J]. 2020 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2020, : 10611 - 10616
  • [7] Noise estimation for head-mounted 3D binocular eye tracking using Pupil Core eye-tracking goggles
    Velisar, Anca
    Shanidze, Natela M.
    [J]. BEHAVIOR RESEARCH METHODS, 2024, 56 (01) : 53 - 79
  • [8] 3D Gaze Estimation from 2D Pupil Positions on Monocular Head-Mounted Eye Trackers
    Mansouryar, Mohsen
    Steil, Julian
    Sugano, Yusuke
    Bulling, Andreas
    [J]. 2016 ACM SYMPOSIUM ON EYE TRACKING RESEARCH & APPLICATIONS (ETRA 2016), 2016, : 197 - 200
  • [9] Toward Precise Gaze Estimation for Mobile Head-Mounted Gaze Tracking Systems
    Su, Dan
    Li, You-Fu
    Chen, Hao
    [J]. IEEE TRANSACTIONS ON INDUSTRIAL INFORMATICS, 2019, 15 (05) : 2660 - 2672