Perceptual self-position estimation based on gaze tracking in virtual reality

Cited by: 2
Authors
Liu, Hongmei [1 ]
Qin, Huabiao [1 ]
Affiliations
[1] South China Univ Technol, Sch Elect & Informat Engn, Guangzhou, Guangdong, Peoples R China
Keywords
Gaze tracking; Depth perception; Stereo vision; Human-computer interaction; Visual discomfort; Head-mounted displays; Distance; Environments; Performance
DOI
10.1007/s10055-021-00553-y
CLC number
TP39 [Computer applications]
Subject classification codes
081203; 0835
Abstract
The depth perception of the human visual system diverges between virtual and real space; this depth discrepancy distorts the user's spatial judgment in a virtual space, so the user cannot precisely locate their self-position there. Existing localization methods ignore this depth discrepancy and concentrate only on increasing location accuracy in real space; the discrepancy therefore persists in virtual space and induces visual discomfort. In this paper, a localization method based on depth perception is proposed to measure the self-position of the user in a virtual environment. Using binocular gaze tracking, this method estimates perceived depth and constructs an eye matrix by measuring gaze convergence on a target. By comparing the eye matrix with the camera matrix, the method automatically calculates the actual depth of the viewed target; the difference between the actual depth and the perceived depth can then be estimated explicitly, without markers. The position of the virtual camera is compensated by this depth difference to obtain the perceptual self-position. Furthermore, a virtual reality system is redesigned by adjusting the virtual camera position, so that the distance from the user to an object feels the same in virtual and real space. Experimental results demonstrate that the redesigned system improves the user's visual experience, which validates the proposed localization method.
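The core geometric step the abstract describes, estimating perceived depth from binocular gaze convergence and then compensating the virtual camera by the depth difference, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the ray-intersection formulation, the interpupillary distance value, and the assumption that the viewing axis is +z are all illustrative choices.

```python
import numpy as np

# Hedged sketch (not the paper's code): the perceived fixation point is
# approximated as the point of closest approach between the two gaze rays,
# one per eye; the virtual camera is then shifted by the difference between
# actual and perceived depth.

def convergence_point(p_left, p_right, d_left, d_right):
    """Midpoint of the closest points between the two gaze rays."""
    d1 = d_left / np.linalg.norm(d_left)
    d2 = d_right / np.linalg.norm(d_right)
    w0 = p_left - p_right
    b = d1 @ d2                        # cosine of the angle between rays
    d, e = d1 @ w0, d2 @ w0
    denom = 1.0 - b * b                # unit directions, so a = c = 1
    t1 = (b * e - d) / denom           # parameter along the left gaze ray
    t2 = (e - b * d) / denom           # parameter along the right gaze ray
    c1 = p_left + t1 * d1              # closest point on left ray
    c2 = p_right + t2 * d2             # closest point on right ray
    return 0.5 * (c1 + c2)             # estimated fixation point

def compensated_camera_z(cam_z, actual_depth, perceived_depth):
    """Shift the virtual camera so perceived distance matches actual."""
    return cam_z + (actual_depth - perceived_depth)

# Example: eyes 64 mm apart (assumed IPD) fixating a target 1 m ahead.
ipd = 0.064
p_l, p_r = np.array([-ipd / 2, 0.0, 0.0]), np.array([ipd / 2, 0.0, 0.0])
target = np.array([0.0, 0.0, 1.0])
fix = convergence_point(p_l, p_r, target - p_l, target - p_r)
depth = fix[2]                         # viewing axis assumed to be +z
```

With ideal (noise-free) gaze directions the two rays intersect exactly at the target, so the recovered depth equals the true 1 m; with real eye-tracker noise the rays are skew, which is why the midpoint of closest approach is used rather than a true intersection.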
Pages: 269-278 (10 pages)