Eye Tracking Interaction on Unmodified Mobile VR Headsets Using the Selfie Camera

Cited by: 11
Authors
Drakopoulos, Panagiotis [1 ]
Koulieris, George-Alex [2 ]
Mania, Katerina [1 ]
Affiliations
[1] Tech Univ Crete, Sch Elect & Comp Engn, Univ Campus Kounoupidiana, Khania 73100, Greece
[2] Univ Durham, Math Sci & Comp Sci Bldg, Upper Mountjoy Campus,Stockton Rd, Durham DH1 3LE, England
Keywords
Mobile VR; eye tracking; robust pupil detection; gaze
DOI
10.1145/3456875
Chinese Library Classification
TP31 [Computer Software]
Subject Classification Codes
081202; 0835
Abstract
Input methods for interaction in smartphone-based virtual and mixed reality (VR/MR) are currently based on uncomfortable head tracking controlling a pointer on the screen. User fixations are a fast and natural input method for VR/MR interaction. Previously, eye tracking in mobile VR suffered from low accuracy, long processing time, and the need for hardware add-ons such as anti-reflective lens coating and infrared emitters. We present an innovative mobile VR eye tracking methodology utilizing only the eye images from the front-facing (selfie) camera through the headset's lens, without any modifications. Our system first enhances the low-contrast, poorly lit eye images by applying a pipeline of customised low-level image enhancements suppressing obtrusive lens reflections. We then propose an iris region-of-interest detection algorithm that is run only once. This increases the iris tracking speed by reducing the iris search space on mobile devices. We iteratively fit a customised geometric model to the iris to refine its coordinates. We display a thin bezel of light at the top edge of the screen for constant illumination. A confidence metric calculates the probability of successful iris detection. Calibration and linear gaze mapping between the estimated iris centroid and physical pixels on the screen result in low latency, real-time iris tracking. A formal study confirmed that our system's accuracy is similar to eye trackers in commercial VR headsets in the central part of the headset's field of view. In a VR game, gaze-driven user completion time was as fast as with head-tracked interaction, without the need for consecutive head motions. In a VR panorama viewer, users could successfully switch between panoramas using gaze.
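The abstract describes a calibration step that linearly maps the estimated iris centroid to physical screen pixels. A minimal sketch of such a linear (affine) gaze mapping fitted by ordinary least squares is shown below; this is a generic reconstruction, not the authors' implementation, and the calibration samples are hypothetical values chosen to lie exactly on an affine map for illustration.

```python
import numpy as np

# Hypothetical calibration data: iris centroids (camera pixels) recorded
# while the user fixates known on-screen targets (screen pixels).
iris = np.array([[100, 60], [180, 60], [100, 132], [180, 132]], dtype=float)
screen = np.array([[0, 0], [1280, 0], [0, 720], [1280, 720]], dtype=float)

# Augment the centroids with a bias column so the affine map
#   s = [x, y, 1] @ A
# can be solved per screen axis by ordinary least squares.
X = np.hstack([iris, np.ones((len(iris), 1))])
A, *_ = np.linalg.lstsq(X, screen, rcond=None)

def gaze_point(cx, cy):
    """Map an iris centroid (cx, cy) to an estimated screen coordinate."""
    return np.array([cx, cy, 1.0]) @ A
```

With more calibration targets the same least-squares fit averages out per-sample noise in the detected centroids, which is why calibration routines typically show more than the minimum number of points.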
Pages: 20