Exploring Gaze-Assisted and Hand-based Region Selection in Augmented Reality

Cited by: 0
Authors
Shi R. [1 ]
Wei Y. [1 ]
Qin X. [2 ]
Hui P. [3 ]
Liang H.-N. [1 ]
Affiliations
[1] Xi'an Jiaotong-Liverpool University, Suzhou
[2] Shandong University, Jinan
[3] The Hong Kong University of Science and Technology (Guangzhou), Guangzhou
Keywords
augmented reality; eye-tracking; gaze interaction; head-mounted display; multimodal interaction; region selection
DOI
10.1145/3591129
Abstract
Region selection is a fundamental task in interactive systems. In 2D user interfaces, users typically define a region with a rectangle selection tool operated via a mouse or touchpad. Region selection in 3D spaces, especially in Augmented Reality (AR) Head-Mounted Displays (HMDs), is different and challenging because users must select an intended region through touchless interactions such as freehand mid-air gestures or eye-based actions. In this work, we aim to fill the gap in the design of region selection techniques for AR HMDs. We first analyzed and discretized the interaction procedure of region selection and explored design possibilities for each step. We then developed four region selection techniques for AR HMDs that leverage the user's hands and gaze for unimodal or multimodal interaction. The techniques were evaluated in a user study with a controlled region selection task. The findings led to three design recommendations and two proof-of-concept application examples. © 2023 ACM.
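
As a rough illustration of the task itself (and not of the four techniques described in the paper), the sketch below shows a hypothetical two-corner rectangle region selection in screen space, where each corner might be confirmed by a gaze dwell or a pinch gesture; all names and the confirmation mechanism are assumptions made purely for illustration.

from dataclasses import dataclass

# Hypothetical sketch only: a 2D screen-space rectangle region spanned by
# two confirmed anchor points (e.g., gaze fixations or pinch positions
# projected onto the view plane). Not the paper's actual techniques.

@dataclass
class Point:
    x: float
    y: float

@dataclass
class Region:
    corner_a: Point  # first confirmed anchor
    corner_b: Point  # second confirmed anchor

    def contains(self, p: Point) -> bool:
        # A point lies in the region if it falls within the axis-aligned
        # rectangle spanned by the two corners.
        lo_x, hi_x = sorted((self.corner_a.x, self.corner_b.x))
        lo_y, hi_y = sorted((self.corner_a.y, self.corner_b.y))
        return lo_x <= p.x <= hi_x and lo_y <= p.y <= hi_y

# Usage: select every target whose projected position falls inside the region.
targets = {"cube": Point(0.3, 0.4), "sphere": Point(0.9, 0.9)}
region = Region(Point(0.2, 0.2), Point(0.6, 0.6))
selected = [name for name, pos in targets.items() if region.contains(pos)]
print(selected)  # ['cube']
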
Related Papers
50 records in total
  • [1] Hand-based interface for augmented reality
    Toledo-Moreo, F. Javier
    Martinez-Alvarez, J. Javier
    Ferrandez-Vicente, J. Manuel
    FCCM 2007: 15TH ANNUAL IEEE SYMPOSIUM ON FIELD-PROGRAMMABLE CUSTOM COMPUTING MACHINES, PROCEEDINGS, 2007, : 291+
  • [2] Hand-based interaction in augmented reality
    McDonald, C
    Malik, S
    Roth, G
    HAVE 2002 - IEEE INTERNATIONAL WORKSHOP ON HAPTIC VIRTUAL ENVIRONMENTS AND THEIR APPLICATIONS, 2002, : 55 - 59
  • [3] Outline Pursuits: Gaze-assisted Selection of Occluded Objects in Virtual Reality
    Sidenmark, Ludwig
    Clarke, Christopher
    Zhang, Xuesong
    Phu, Jenny
    Gellersen, Hans
    PROCEEDINGS OF THE 2020 CHI CONFERENCE ON HUMAN FACTORS IN COMPUTING SYSTEMS (CHI'20), 2020,
  • [4] Exploring Audio Interfaces for Vertical Guidance in Augmented Reality via Hand-Based Feedback
    Guarese, Renan
    Pretty, Emma
    Renata, Aidan
    Polson, Deb
    Zambetta, Fabio
    IEEE TRANSACTIONS ON VISUALIZATION AND COMPUTER GRAPHICS, 2024, 30 (05) : 2818 - 2828
  • [5] Gaze Augmented Hand-Based Kinesthetic Interaction: What You See is What You Feel
    Li, Zhenxing
    Akkil, Deepak
    Raisamo, Roope
    IEEE TRANSACTIONS ON HAPTICS, 2019, 12 (02) : 114 - 127
  • [6] GazeHelp: Exploring Practical Gaze-assisted Interactions for Graphic Design Tools
    Lewien, Ryan
    ACM SYMPOSIUM ON EYE TRACKING RESEARCH AND APPLICATIONS (ETRA 2021), 2021,
  • [7] Survey on Hand-Based Haptic Interaction for Virtual Reality
    Tong, Qianqian
    Wei, Wenxuan
    Zhang, Yuru
    Xiao, Jing
    Wang, Dangxiao
    IEEE TRANSACTIONS ON HAPTICS, 2023, 16 (02) : 154 - 170
  • [8] A Hand-based Collaboration Framework in Egocentric Coexistence Reality
    Yu, Jeongmin
    Noh, Seungtak
    Jang, Youngkyoon
    Park, Gabyong
    Woo, Woontack
    2015 12TH INTERNATIONAL CONFERENCE ON UBIQUITOUS ROBOTS AND AMBIENT INTELLIGENCE (URAI), 2015, : 545 - 548
  • [9] GAVIN: Gaze-Assisted Voice-Based Implicit Note-taking
    Khan, Anam Ahmad
    Newn, Joshua
    Kelly, Ryan M.
    Srivastava, Namrata
    Bailey, James
    Velloso, Eduardo
    ACM TRANSACTIONS ON COMPUTER-HUMAN INTERACTION, 2021, 28 (04)
  • [10] Vergence Matching: Inferring Attention to Objects in 3D Environments for Gaze-Assisted Selection
    Sidenmark, Ludwig
    Clarke, Christopher
    Newn, Joshua
    Lystbaek, Mathias N.
    Pfeuffer, Ken
    Gellersen, Hans
    PROCEEDINGS OF THE 2023 CHI CONFERENCE ON HUMAN FACTORS IN COMPUTING SYSTEMS, CHI 2023, 2023,