A Low-Cost Multi-Modal Auditory-Visual-Tactile Framework for Remote Touch

Cited by: 10
Authors
Sanfilippo, Filippo [1 ,2 ]
Pacchierotti, Claudio [3 ]
Affiliations
[1] Univ Agder UiA, Dept Engn Sci, Jon Lilletuns Vei 9, N-4879 Grimstad, Norway
[2] Oslo Metropolitan Univ OsloMet, Dept Mech Elect & Chem Engn, POB 4 St Olavs Plass, N-0130 Oslo, Norway
[3] Univ Rennes, CNRS, Inria, IRISA, Campus Univ Beaulieu, F-35042 Rennes, France
Keywords
Wearable Haptics; Human-Computer Interaction; Multimodality;
DOI
10.1109/ICICT50521.2020.00040
Chinese Library Classification (CLC)
TP301 [Theory, Methods];
Subject Classification Code
081202;
Abstract
Haptic technology for human augmentation provides gains in ability across different applications, whether the aim is to turn "disabilities" into "abilities" or "abilities" into "super-abilities". Commercially available devices are generally expensive and tailored to specific applications and hardware. To give researchers a haptic feedback system that is economical, customisable, and fast to fabricate, our group developed a low-cost immersive haptic, audio, and visual experience built from commercial off-the-shelf (COTS) components. It is composed of a vibrotactile glove, a Leap Motion sensor, and a head-mounted display, integrated to provide compelling immersive sensations. This paper advances the system to a higher technology readiness level (TRL), making it modular and reliable. To demonstrate its potential, we present two human subject studies in Virtual Reality that evaluate the capability of the system to provide (i) guidance during simulated drone operations and (ii) contact haptic feedback during interaction with virtual objects. Results show that the proposed haptic-enabled framework improves performance and the illusion of presence.
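As a rough illustration of the contact feedback described in the abstract, the sketch below (a minimal, hypothetical Python example, not the authors' implementation; the function name, contact threshold, and 8-bit drive level are assumptions) maps the distance between a tracked fingertip and a virtual surface to a vibration-motor intensity, the kind of loop a vibrotactile glove driven by hand-tracking data would run.

# Minimal sketch (hypothetical): map fingertip-to-surface distance to a
# vibrotactile drive level, as a glove-style contact-feedback loop might do.

def vibration_intensity(distance_m: float,
                        contact_threshold_m: float = 0.02,
                        max_intensity: int = 255) -> int:
    """Return an 8-bit motor drive level for a given fingertip distance.

    Outside the contact threshold the motor is off; inside it, the
    intensity ramps up linearly as the fingertip approaches and
    penetrates the virtual surface.
    """
    if distance_m >= contact_threshold_m:
        return 0
    ratio = 1.0 - max(distance_m, 0.0) / contact_threshold_m
    return round(ratio * max_intensity)


if __name__ == "__main__":
    # Example distances reported by a hand tracker (e.g. Leap Motion frames),
    # expressed as metres to the nearest virtual object.
    for d in (0.05, 0.015, 0.0):
        print(f"distance {d:.3f} m -> drive level {vibration_intensity(d)}")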
Pages: 213-218
Page count: 6
Related papers
50 records in total
  • [1] A Multi-modal Auditory-Visual-Tactile e-Learning Framework
    Sanfilippo, Filippo
    Blazauskas, Tomas
    Girdziuna, Martynas
    Janonis, Airidas
    Kiudys, Eligijus
    Salvietti, Gionata
    INTELLIGENT TECHNOLOGIES AND APPLICATIONS, 2022, 1616 : 119 - 131
  • [2] Low-cost multi-modal microscope using Raspberry Pi
    Watanabe, Wataru
    Maruyama, Ryoji
    Arimoto, Hidenobu
    Tamada, Yosuke
    OPTIK, 2020, 212
  • [3] Accurate Pipeline Blockage Detection with Low-Cost Multi-Modal Sensing
    Zhang, Chengjie
    Heidemann, John
    2014 IEEE 11TH INTERNATIONAL CONFERENCE ON MOBILE AD HOC AND SENSOR SYSTEMS (MASS), 2014, : 172 - 180
  • [4] On the Feasibility of Low-Cost Wearable Sensors for Multi-Modal Biometric Verification
    Blasco, Jorge
    Peris-Lopez, Pedro
    SENSORS, 2018, 18 (09)
  • [5] A Low-Cost Multi-Modal Sensor Network for the Monitoring of Honeybee Colonies/Hives
    Howard, Donald
    Duran, Olga
    Hunter, Gordon
    INTELLIGENT ENVIRONMENTS 2018, 2018, 23 : 69 - 78
  • [6] MULTI-MODAL REMOTE SENSING DATA FUSION FRAMEWORK
    Ghaffar, M. A. A.
    Vu, T. T.
    Maul, T. H.
FOSS4G-EUROPE 2017 - ACADEMIC TRACK, 2017, 42-4 (W2): 85 - 89
  • [7] Tactile Codec with Visual Assistance in Multi-modal Communication for Digital Health
    Chen, Mingkai
    Tan, Xinmeng
    Han, Huiyan
    Wang, Lei
MOBILE NETWORKS & APPLICATIONS, 2024, 29 (03): 634 - 646
  • [8] Multi-Modal Atmospheric Data Collection using Low-Cost Particulate Matter Sensors
    Nails, Kaleb
    Smith, Leah
    Liebergall, Erik
    El-Sayed, Marwa
    Adkins, Kevin
    Compere, Marc
    AIAA AVIATION FORUM AND ASCEND 2024, 2024,
  • [9] "Filiform hallucinations": Multi-modal tactile and visual hallucinations in Parkinson's disease
    Perez-Perez, J.
    Pagonabarraga, J.
    Kulisevsky, J.
    MOVEMENT DISORDERS, 2013, 28 : S314 - S314
  • [10] Multi-modal robotic visual-tactile localisation and detection of surface cracks
    Palermo, Francesca
    Rincon-Ardila, Liz
    Oh, Changjae
    Althoefer, Kaspar
    Poslad, Stefan
    Venture, Gentiane
    Farkhatdinov, Ildar
    2021 IEEE 17TH INTERNATIONAL CONFERENCE ON AUTOMATION SCIENCE AND ENGINEERING (CASE), 2021, : 1806 - 1811