SonoHaptics: An Audio-Haptic Cursor for Gaze-Based Object Selection in XR

Cited by: 0
Authors
Cho, Hyunsung [1 ,3 ]
Sendhilnathan, Naveen [1 ]
Nebeling, Michael [1 ,4 ]
Wang, Tianyi [1 ]
Padmanabhan, Purnima [2 ]
Browder, Jonathan [1 ]
Lindlbauer, David [3 ]
Jonker, Tanya [1 ]
Todi, Kashyap [1 ]
Affiliations
[1] Meta Inc., Reality Labs Research, Redmond, WA 98052 USA
[2] Meta Inc., Reality Labs Research, Burlingame, CA USA
[3] Carnegie Mellon Univ, Pittsburgh, PA 15213 USA
[4] Univ Michigan, Ann Arbor, MI USA
Keywords
Extended Reality; Sonification; Haptics; Multimodal Feedback; Computational Interaction; Gaze-based Selection; PITCH;
DOI
10.1145/3654777.3676384
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
We introduce SonoHaptics, an audio-haptic cursor for gaze-based 3D object selection. SonoHaptics addresses challenges around providing accurate visual feedback during gaze-based selection in Extended Reality (XR), e.g., lack of world-locked displays in no- or limited-display smart glasses and visual inconsistencies. To enable users to distinguish objects without visual feedback, SonoHaptics employs the concept of cross-modal correspondence in human perception to map visual features of objects (color, size, position, material) to audio-haptic properties (pitch, amplitude, direction, timbre). We contribute data-driven models for determining cross-modal mappings of visual features to audio and haptic features, and a computational approach to automatically generate audio-haptic feedback for objects in the user's environment. SonoHaptics provides global feedback that is unique to each object in the scene, and local feedback to amplify differences between nearby objects. Our comparative evaluation shows that SonoHaptics enables accurate object identification and selection in a cluttered scene without visual feedback.
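To make the abstract's cross-modal mapping concrete, here is a minimal illustrative sketch of the idea: each object's visual features (color, size, position, material) are mapped to audio-haptic cue parameters (pitch, amplitude, direction, timbre). The feature ranges, mapping functions, and names below are illustrative assumptions for exposition, not the paper's actual data-driven models.

```python
# Hypothetical sketch of the cross-modal mapping concept described in the
# abstract; all ranges and formulas here are assumed, not from the paper.
from dataclasses import dataclass


@dataclass
class SceneObject:
    hue: float       # color hue, normalized to [0, 1]
    size: float      # apparent size, normalized to [0, 1]
    azimuth: float   # horizontal position in degrees, [-90, 90]
    material: str    # e.g. "metal", "wood", "glass"


# Assumed timbre lookup: each material class selects a waveform / haptic texture.
TIMBRE = {"metal": "square", "wood": "triangle", "glass": "sine"}


def audio_haptic_cue(obj: SceneObject) -> dict:
    """Map one object's visual features to audio-haptic cue parameters."""
    pitch_hz = 220.0 * 2 ** (obj.hue * 2)      # hue -> pitch across two octaves
    amplitude = 0.2 + 0.8 * obj.size           # larger objects -> stronger cue
    pan = obj.azimuth / 90.0                   # position -> spatial direction
    timbre = TIMBRE.get(obj.material, "sine")  # material -> timbre / texture
    return {"pitch_hz": pitch_hz, "amplitude": amplitude,
            "pan": pan, "timbre": timbre}


cue = audio_haptic_cue(SceneObject(hue=0.5, size=0.5, azimuth=45.0, material="wood"))
```

Because each feature maps to a distinct audio-haptic dimension, two objects that differ in any visual feature produce distinguishable cues, which is the intuition behind the paper's per-object "global feedback".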
Pages: 19
Related Papers
35 records in total
  • [1] Gaze-Based Object Segmentation
    Shi, Ran
    Ngan, King Ngi
    Li, Hongliang
    IEEE SIGNAL PROCESSING LETTERS, 2017, 24 (10) : 1493 - 1497
  • [2] Gaze-based Cursor Control Impairs Performance in Divided Attention
    Rill, Robert Adrian
    Farago, Kinga Bettina
    ACTA CYBERNETICA, 2018, 23 (04): : 1071 - 1087
  • [3] An Audio-Haptic Interface Concept Based on Depth Information
    Devallez, Delphine
    Rocchesso, Davide
    Fontana, Federico
    HAPTIC AND AUDIO INTERACTION DESIGN, 2008, 5270 : 102 - +
  • [4] Gaze-based Object Detection in the Wild
    Weber, Daniel
    Fuhl, Wolfgang
    Zell, Andreas
    Kasneci, Enkelejda
    2022 SIXTH IEEE INTERNATIONAL CONFERENCE ON ROBOTIC COMPUTING, IRC, 2022, : 62 - 66
  • [5] A passive BCI for monitoring the intentionality of the gaze-based moving object selection
    Zhao, Darisy G.
    Vasilyev, Anatoly N.
    Kozyrskiy, Bogdan L.
    Melnichuk, Eugeny V.
    Isachenko, Andrey V.
    Velichkovsky, Boris M.
    Shishkin, Sergei L.
    JOURNAL OF NEURAL ENGINEERING, 2021, 18 (02)
  • [6] An Adaptive Model of Gaze-based Selection
    Chen, Xiuli
    Acharya, Aditya
    Oulasvirta, Antti
    Howes, Andrew
    CHI '21: PROCEEDINGS OF THE 2021 CHI CONFERENCE ON HUMAN FACTORS IN COMPUTING SYSTEMS, 2021,
  • [7] Dwell time preferences for gaze-based object selection of different object types vary with age
    Remijn, Gerard
    Paulus, Yesaya Tommy
    PERCEPTION, 2022, 51 : 155 - 155
  • [8] Optimum object selection methods for spontaneous gaze-based interaction with linear and circular trajectories
    Nurlatifa, Hafzatin
    Hartanto, Rudy
    Ataka, Ahmad
    Wibirama, Sunu
    RESULTS IN ENGINEERING, 2024, 21
  • [9] Conflicting Audio-haptic Feedback in Physically Based Simulation of Walking Sounds
    Turchet, Luca
    Serafin, Stefania
    Dimitrov, Smilen
    Nordahl, Rolf
    HAPTIC AND AUDIO INTERACTION DESIGN, 2010, 6306 : 97 - 106
  • [10] Suggesting Gaze-based Selection for Surveillance Applications
    Hild, Jutta
    Peinsipp-Byma, Elisabeth
    Voit, Michael
    Beyerer, Juergen
    2019 16TH IEEE INTERNATIONAL CONFERENCE ON ADVANCED VIDEO AND SIGNAL BASED SURVEILLANCE (AVSS), 2019,