Assessing hands-free interactions for VR using eye gaze and electromyography

Cited: 55
Authors
Pai, Yun Suen [1 ]
Dingler, Tilman [2 ]
Kunze, Kai [1 ]
Affiliations
[1] Keio Univ, Grad Sch Media Design, Yokohama, Kanagawa, Japan
[2] Univ Melbourne, Melbourne, Vic, Australia
Keywords
Virtual reality; Physiological sensing; Eye gaze; Electromyography; TRACKING; INPUT
DOI
10.1007/s10055-018-0371-2
Chinese Library Classification
TP39 [Computer Applications]
Subject Classification Codes
081203; 0835
Abstract
With the increasing popularity of virtual reality (VR) technologies, growing effort has gone into developing new input methods. While physical controllers are widely used, more novel techniques, such as eye tracking, are now commercially available. In our work, we investigate the use of physiological signals as input to enhance VR experiences. We present a system that combines gaze tracking with electromyography on a user's forearm to make selection tasks in virtual spaces more efficient. In a study with 16 participants, we compared five input techniques using a Fitts' law task: using gaze tracking for cursor movement in combination with forearm contractions for making selections was superior to using an HTC Vive controller, an Xbox gamepad, dwelling time, and eye-gaze dwelling time. To explore application scenarios and collect qualitative feedback, we further developed and evaluated a game based on our input technique. Our findings inform the design of applications that use eye-gaze tracking and forearm muscle movements for effective user input in VR.
Pages: 119-131
Page count: 13
Related papers
50 records total
  • [1] Assessing hands-free interactions for VR using eye gaze and electromyography
    Yun Suen Pai
    Tilman Dingler
    Kai Kunze
    [J]. Virtual Reality, 2019, 23 : 119 - 131
  • [2] Integration of electromyogram and eye gaze tracking inputs for hands-free cursor control
    Chin, Craig A.
    Barreto, Armando
    [J]. Biomedical Sciences Instrumentation, Vol 43, 2007, 43 : 152 - 157
  • [3] Fast hands-free writing by gaze direction
    David J. Ward
    David J. C. MacKay
    [J]. Nature, 2002, 418 : 838 - 838
  • [4] EyeSayCorrect: Eye Gaze and Voice Based Hands-free Text Correction for Mobile Devices
    Zhao, Maozheng
    Huang, Henry
    Li, Zhi
    Liu, Rui
    Cui, Wenzhe
    Toshniwal, Kajal
    Goel, Ananya
    Wang, Andrew
    Zhao, Xia
    Rashidian, Sina
    Baig, Furqan
    Phi, Khiem
    Zhai, Shumin
    Ramakrishnan, I. V.
    Wang, Fusheng
    Bi, Xiaojun
    [J]. IUI'22: 27TH INTERNATIONAL CONFERENCE ON INTELLIGENT USER INTERFACES, 2022, : 470 - 482
  • [5] Investigating the Usability of Collaborative Robot control through Hands-Free Operation using Eye gaze and Augmented Reality
    Lee, Joosun
    Lim, Taeyhang
    Kim, Wansoo
    [J]. 2023 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS, IROS, 2023, : 4101 - 4106
  • [6] Initial Evaluation of a Hybrid Eye Tracking and Electromyography Training Game for Hands-Free Wheelchair Use
    Smith, Peter
    Dombrowski, Matt
    MacDonald, Calvin
    Williams, Courtney
    Pradeep, Maanya
    Barnum, Elizabeth
    Rivera, Viviana
    Sparkman, John
    Manero, Albert
    [J]. PROCEEDINGS OF THE 2024 ACM SYMPOSIUM ON EYE TRACKING RESEARCH & APPLICATIONS, ETRA 2024, 2024
  • [7] EVEN HANDS-FREE INTERACTIONS ARE DISTRACTING
    [Anonymous]
    [J]. MECHANICAL ENGINEERING, 2015, 137 (12) : 11 - 11
  • [8] Accessible Hands-Free Input Methods for VR Games
    Heilemann, Fiona
    Zimmermann, Gottfried
    Muenster, Patrick
    [J]. COMPUTERS HELPING PEOPLE WITH SPECIAL NEEDS, ICCHP-AAATE 2022, 2022, : 371 - 379
  • [9] Hands-free navigation in VR environments by tracking the head
    Kang, SB
    [J]. INTERNATIONAL JOURNAL OF HUMAN-COMPUTER STUDIES, 1998, 48 (02) : 247 - 266
  • [10] Hands-free writing is made easy on the eye
    [Anonymous]
    [J]. PHYSICS WORLD, 2002, 15 (10) : 3 - 3