Ultra-Low Power Gaze Tracking for Virtual Reality

Cited by: 11
Authors
Li, Tianxing [1 ]
Liu, Qiang [1 ]
Zhou, Xia [1 ]
Affiliations
[1] Dartmouth Coll, Dept Comp Sci, Hanover, NH 03755 USA
Funding
U.S. National Science Foundation;
Keywords
Gaze tracking; virtual reality; visible light sensing; EYE-MOVEMENT; TIME EYE; MODEL;
DOI
10.1145/3131672.3131682
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronics and Communication Technology];
Subject Classification Codes
0808 ; 0809 ;
Abstract
Tracking a user's eye fixation direction is crucial to virtual reality (VR): it eases the user's interaction with the virtual scene and enables intelligent rendering to improve the user's visual experience and save system energy. Existing techniques commonly rely on cameras and active infrared emitters, making them too expensive and power-hungry for VR headsets (especially mobile VR headsets). We present LiGaze, a low-cost, low-power approach to gaze tracking tailored to VR. It relies on a few low-cost photodiodes, eliminating the need for cameras and active infrared emitters. Reusing light emitted from the VR screen, LiGaze leverages photodiodes around a VR lens to measure reflected screen light in different directions. It then infers gaze direction by exploiting the pupil's light-absorption property. The core of LiGaze is to deal with screen light dynamics and extract changes in reflected light related to pupil movement. LiGaze infers a 3D gaze vector on the fly using a lightweight regression algorithm. We design and fabricate a LiGaze prototype using off-the-shelf photodiodes. Our comparison to a commercial VR eye tracker (FOVE) shows that LiGaze achieves 6.3° and 10.1° mean within-user and cross-user accuracy, respectively. Its sensing and computation consume 791 μW in total and thus can be completely powered by a credit-card-sized solar cell harvesting energy from indoor lighting. LiGaze's simplicity and ultra-low power make it applicable in a wide range of VR headsets to better unleash VR's potential.
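The abstract describes mapping reflected-light measurements from photodiodes around the VR lens to a 3D gaze vector with a lightweight regression. The sketch below illustrates that general idea with a closed-form ridge regression over normalized photodiode readings; it is not the authors' implementation, and the photodiode count, feature layout, and calibration data are hypothetical.

```python
# Minimal sketch (assumption): fit a lightweight regression from normalized
# photodiode readings to a 3D gaze vector, in the spirit of LiGaze.
# Photodiode count, features, and calibration data here are hypothetical.
import numpy as np

N_PHOTODIODES = 16  # hypothetical number of photodiodes around the VR lens

def fit_gaze_regressor(readings, gaze_vectors, lam=1e-3):
    """Closed-form ridge regression: readings (n, N_PHOTODIODES) -> gaze (n, 3)."""
    X = np.hstack([readings, np.ones((readings.shape[0], 1))])  # append bias term
    A = X.T @ X + lam * np.eye(X.shape[1])                      # regularized normal equations
    return np.linalg.solve(A, X.T @ gaze_vectors)               # weight matrix W

def predict_gaze(W, reading):
    """Predict a unit-length 3D gaze vector from one photodiode reading vector."""
    g = np.append(reading, 1.0) @ W
    return g / np.linalg.norm(g)

# Hypothetical usage with synthetic calibration data
rng = np.random.default_rng(0)
train_X = rng.random((200, N_PHOTODIODES))          # normalized light readings
train_Y = rng.normal(size=(200, 3))                 # calibration gaze vectors
train_Y /= np.linalg.norm(train_Y, axis=1, keepdims=True)
W = fit_gaze_regressor(train_X, train_Y)
print(predict_gaze(W, train_X[0]))
```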
Pages: 14
Related Papers
50 records in total
  • [1] Ultra-Low Power Gaze Tracking for Virtual Reality
    Li, Tianxing
    Akosah, Emmanuel S.
    Liu, Qiang
    Zhou, Xia
    [J]. PROCEEDINGS OF THE 23RD ANNUAL INTERNATIONAL CONFERENCE ON MOBILE COMPUTING AND NETWORKING (MOBICOM '17), 2017, : 490 - 492
  • [2] Ultra-Low Power Gaze Tracking for Virtual Reality
    Li, Tianxing
    Akosah, Emmanuel S.
    Liu, Qiang
    Zhou, Xia
    [J]. PROCEEDINGS OF THE 15TH ACM CONFERENCE ON EMBEDDED NETWORKED SENSOR SYSTEMS (SENSYS'17), 2017,
  • [3] ULTRA-LOW-POWER GAZE TRACKING FOR VIRTUAL REALITY
    Li, Tianxing
    Liu, Qiang
    Zhou, Xia
    [J]. GETMOBILE-MOBILE COMPUTING & COMMUNICATIONS REVIEW, 2018, 22 (03) : 27 - 31
  • [4] Virtual reality technology of ultra-low altitude UAV
    Yuan, Yanwei
    Zhang, Xiaochao
    Mao, Wenhua
    Zhao, Huaping
    [J]. Nongye Jixie Xuebao/Transactions of the Chinese Society of Agricultural Machinery, 2009, 40 (06): : 147 - 152
  • [5] An Ultra-low Power Automated Maximum Power Point Tracking Circuit with 99.9% Tracking Efficiency
    Abedi, Mostafa
    Shrivastava, Aatmesh
    [J]. 2023 IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS, ISCAS, 2023,
  • [6] Ultra-Low Power Transmitter
    Ghasempour, Mohsen
    Shang, Delong
    Xia, Fei
    Yakovlev, Alex
    [J]. 2012 IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS (ISCAS 2012), 2012, : 1807 - 1810
  • [7] Ultra-low power wearables
    Parameshachari, B.D.
    Rocha, Álvaro
    Fung, Chun Che Lance
    [J]. Personal and Ubiquitous Computing, 2023, 27 (03) : 1257 - 1259
  • [8] Autonomous Maximum Power Point Tracking Algorithm for Ultra-Low Power Energy Harvesting
    Steffan, Christoph
    Greiner, Philipp
    Kollegger, Carolin
    Siegl, Inge
    Holweg, Gerald
    Deutschmann, Bernd
    [J]. 2017 IEEE 60TH INTERNATIONAL MIDWEST SYMPOSIUM ON CIRCUITS AND SYSTEMS (MWSCAS), 2017, : 1372 - 1375
  • [9] Ultra-Low Power and Ultra-Low Voltage Devices and Circuits for IoT Applications
    Hiramoto, T.
    Takeuchi, K.
    Mizutani, T.
    Ueda, A.
    Saraya, T.
    Kobayashi, M.
    Yamamoto, Y.
    Makiyama, H.
    Yamashita, T.
    Oda, H.
    Kamohara, S.
    Sugii, N.
    Yamaguchi, Y.
    [J]. 2016 IEEE SILICON NANOELECTRONICS WORKSHOP (SNW), 2016, : 146 - 147
  • [10] Robust gaze tracking method for stereoscopic virtual reality systems
    Lee, Eui Chul
    Park, Kang Ryoung
    Whang, Min Cheol
    Park, Junseok
    [J]. HUMAN-COMPUTER INTERACTION, PT 3, PROCEEDINGS, 2007, 4552 : 700+