Eye gaze in virtual environments: evaluating the need and initial work on implementation

Cited by: 5
Authors
Murray, Norman [1 ]
Roberts, Dave [1 ]
Steed, Anthony [2 ]
Sharkey, Paul [3 ]
Dickerson, Paul [4 ]
Rae, John [4 ]
Wolff, Robin [1 ]
Affiliations
[1] Univ Salford, Ctr Virtual Environm, Salford M5 4WT, Lancs, England
[2] UCL, Dept Comp Sci, London WC1E 6BT, England
[3] Univ Reading, Dept Cybernet, Interact Syst Res Grp, Reading RG6 6UR, Berks, England
[4] Roehampton Univ, Sch Human & Life Sci, London SW15 5PU, England
Funding
Engineering and Physical Sciences Research Council (EPSRC), UK;
Keywords
Virtual Environment; eye tracking; evaluation;
DOI
10.1002/cpe.1396
Chinese Library Classification
TP31 [Computer Software];
Discipline Codes
081202; 0835;
Abstract
For efficient collaboration between participants, eye gaze is seen as critical for interaction. Video conferencing either does not attempt to support eye gaze (e.g. AccessGrid) or only approximates it in round-table conditions (e.g. life-size telepresence). Immersive collaborative virtual environments represent remote participants through avatars that follow their tracked movements. By additionally tracking people's eyes and representing their movement on their avatars, the line of gaze can be faithfully reproduced rather than merely approximated. This paper presents the results of initial work that tested whether the focus of gaze could be gauged more accurately when tracked eye movement was added to the head movement of an avatar observed in an immersive VE. An experiment was conducted to assess the difference in users' ability to judge which objects an avatar is looking at when only head movements are displayed, with the eyes remaining static, versus when both eye gaze and head movement are displayed. The results show that eye gaze is of vital importance to subjects correctly identifying what a person is looking at in an immersive virtual environment. This is followed by a description of the work now being undertaken following these positive results. We discuss the integration of an eye tracker more suitable for immersive mobile use, and the software and techniques developed to translate the user's real-world eye movements into calibrated eye gaze in an immersive virtual world. This will be used in the creation of an immersive collaborative virtual environment supporting eye gaze and in its ongoing experiments. Copyright (C) 2009 John Wiley & Sons, Ltd.
Pages: 1437-1449 (13 pages)
Related Papers (50 total)
  • [31] Gaze-Guided Narratives: Adapting Audio Guide Content to Gaze in Virtual and Real Environments
    Kwok, Tiffany C. K.
    Kiefer, Peter
    Schinazi, Victor R.
    Adams, Benjamin
    Raubal, Martin
    CHI 2019: PROCEEDINGS OF THE 2019 CHI CONFERENCE ON HUMAN FACTORS IN COMPUTING SYSTEMS, 2019,
  • [32] Evaluating Mobile Applications in Virtual Environments: A Survey
    Delikostidis, Ioannis
    Fechner, Thore
    Fritze, Holger
    AbdelMouty, Ahmed Mahmoud
    Kray, Christian
    INTERNATIONAL JOURNAL OF MOBILE HUMAN COMPUTER INTERACTION, 2013, 5 (04) : 1 - 19
  • [33] A consistency model for evaluating distributed virtual environments
    Zhou, SP
    Cai, WT
    Turner, SJ
    Zhao, HF
    2003 INTERNATIONAL CONFERENCE ON CYBERWORLDS, PROCEEDINGS, 2003, : 85 - 91
  • [34] Evaluating NeRF Fidelity using Virtual Environments
    Talley, Ander
    Jones, J. Adam
    2024 IEEE CONFERENCE ON VIRTUAL REALITY AND 3D USER INTERFACES ABSTRACTS AND WORKSHOPS, VRW 2024, 2024, : 1027 - 1028
  • [35] Evaluating Embodied Navigation in Virtual Reality Environments
    DeYoung, Johannes
    Berry, Justin
    Riggs, Stephanie
    Wesson, Jack
    Chantiles-Wertz, Lance
    2018 IEEE GAMES, ENTERTAINMENT, MEDIA CONFERENCE (GEM), 2018, : 338 - 343
  • [36] Inferring Human Knowledgeability from Eye Gaze in Mobile Learning Environments
    Celiktutan, Oya
    Demiris, Yiannis
    COMPUTER VISION - ECCV 2018 WORKSHOPS, PT VI, 2019, 11134 : 193 - 209
  • [37] Using Eye Gaze to Enhance Generalization of Imitation Networks to Unseen Environments
    Liu, Congcong
    Chen, Yuying
    Liu, Ming
    Shi, Bertram E.
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2021, 32 (05) : 2066 - 2074
  • [38] Motion Control of a Powered Wheelchair Using Eye Gaze in Unknown Environments
    Ishizuka, Airi
    Yorozu, Ayanori
    Takahashi, Masaki
    2017 11TH ASIAN CONTROL CONFERENCE (ASCC), 2017, : 90 - 95
  • [39] Implementation of virtual work principle in virtual air gap
    Choi, Hong Soon
    Lee, Se Hee
    Kim, Young Sun
    Kim, Kwang Tae
    Park, Il Han
    IEEE TRANSACTIONS ON MAGNETICS, 2008, 44 (06) : 1286 - 1289
  • [40] Bayesian Logistic Models for Predicting Initial Eye Gaze and Subsequent Choice
    Lages, Martin
    Scheel, Anne
    PERCEPTION, 2017, 46 (02) : 229 - 230