SonoHaptics: An Audio-Haptic Cursor for Gaze-Based Object Selection in XR

Cited by: 0
Authors
Cho, Hyunsung [1 ,3 ]
Sendhilnathan, Naveen [1 ]
Nebeling, Michael [1 ,4 ]
Wang, Tianyi [1 ]
Padmanabhan, Purnima [2 ]
Browder, Jonathan [1 ]
Lindlbauer, David [3 ]
Jonker, Tanya [1 ]
Todi, Kashyap [1 ]
Affiliations
[1] Meta Inc., Reality Labs Research, Redmond, WA 98052 USA
[2] Meta Inc., Reality Labs Research, Burlingame, CA USA
[3] Carnegie Mellon University, Pittsburgh, PA 15213 USA
[4] University of Michigan, Ann Arbor, MI USA
Keywords
Extended Reality; Sonification; Haptics; Multimodal Feedback; Computational Interaction; Gaze-based Selection; Pitch
DOI
10.1145/3654777.3676384
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
We introduce SonoHaptics, an audio-haptic cursor for gaze-based 3D object selection. SonoHaptics addresses challenges around providing accurate visual feedback during gaze-based selection in Extended Reality (XR), e.g., the lack of world-locked displays in no- or limited-display smart glasses and visual inconsistencies. To enable users to distinguish objects without visual feedback, SonoHaptics employs the concept of cross-modal correspondence in human perception to map visual features of objects (color, size, position, material) to audio-haptic properties (pitch, amplitude, direction, timbre). We contribute data-driven models for determining cross-modal mappings of visual features to audio and haptic features, and a computational approach to automatically generate audio-haptic feedback for objects in the user's environment. SonoHaptics provides global feedback that is unique to each object in the scene, and local feedback to amplify differences between nearby objects. Our comparative evaluation shows that SonoHaptics enables accurate object identification and selection in a cluttered scene without visual feedback.
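To illustrate the kind of cross-modal mapping the abstract describes, the sketch below maps normalized visual features of an object to audio-haptic parameters. This is a minimal, hypothetical example assuming common correspondences from the perception literature (lighter color → higher pitch, larger size → louder amplitude, rougher material → noisier timbre); it is not the paper's actual data-driven model, and the function name, ranges, and feature encoding are assumptions for illustration only.

```python
def crossmodal_map(color_lightness, size, material_roughness,
                   pitch_range=(220.0, 880.0),
                   amp_range=(0.2, 1.0)):
    """Map normalized visual features (each in 0..1) to audio-haptic parameters.

    Assumed correspondences (illustrative, not from the paper):
      lighter color    -> higher pitch (Hz, linear within pitch_range)
      larger size      -> higher amplitude (linear within amp_range)
      rougher material -> noisier timbre (0 = pure tone, 1 = noise-like)
    """
    p_lo, p_hi = pitch_range
    pitch_hz = p_lo + color_lightness * (p_hi - p_lo)

    a_lo, a_hi = amp_range
    amplitude = a_lo + size * (a_hi - a_lo)

    # Timbre noisiness taken directly from surface roughness.
    timbre_noise = material_roughness

    return {"pitch_hz": pitch_hz,
            "amplitude": amplitude,
            "timbre_noise": timbre_noise}

# Example: a small, bright, smooth object gets a high, quiet, clean tone.
params = crossmodal_map(color_lightness=0.9, size=0.2, material_roughness=0.1)
```

In a full system, such per-object parameters would drive both an audio synthesizer and a haptic actuator; the paper's "local feedback" idea of amplifying differences between nearby objects would correspond to stretching these mappings over only the candidate objects near the gaze point rather than the whole scene.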
Pages: 19