Implementation of multi-modal interface for VR application

Cited by: 2
Authors
Machidori, Yushi [1 ]
Takayama, Ko [1 ]
Sugita, Kaoru [2 ]
Affiliations
[1] Fukuoka Inst Technol, Grad Sch Engn, Fukuoka, Japan
[2] Fukuoka Inst Technol, Dept Informat & Commun Engn, Fukuoka, Japan
Keywords
Virtual Reality (VR); multi-modal interface; hand tracking; gesture interface; voice recognition;
DOI
10.1109/icawst.2019.8923551
Chinese Library Classification
TM [Electrical Engineering]; TN [Electronic Technology, Communication Technology];
Discipline Classification Code
0808; 0809;
Abstract
Recently, several Head Mounted Displays (HMDs) have been released for consumers. A general VR system provides a virtual experience of a virtual world according to the user's responses and is organized into three types of components: an input system, an output system, and a simulation system. The input system uses devices such as a controller, a mouse, a keyboard, and a head-tracking device. These devices are operated physically in the real world, but they are invisible in the virtual world while the HMD is worn. In this paper, we introduce a multi-modal interface for a VR application that supports both a voice interface and a gesture interface on a general HMD. We also discuss a prototype system built from low-cost devices: the HMD, a gesture input device, a general PC, and a USB microphone.
Pages: 370 - 373
Number of pages: 4
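The abstract only outlines the prototype at a high level (gesture and voice input merged into the input system of a VR application); the paper's actual implementation is not reproduced here. Purely as an illustration, the following Python sketch shows one way events from a gesture device and a speech recognizer could be merged into a single command stream consumed by a simulation loop. Every class, command name, and timing value in it is hypothetical and not taken from the paper.

# Hypothetical sketch of a multi-modal input layer for a VR application.
# It only illustrates merging gesture and voice events into one command
# stream; it is not the authors' implementation.

import queue
import threading
import time
from dataclasses import dataclass


@dataclass
class InputEvent:
    modality: str   # "gesture" or "voice"
    command: str    # e.g. "grab", "release", "menu"
    timestamp: float


class GestureSource(threading.Thread):
    """Stands in for a low-cost hand-tracking / gesture input device."""

    def __init__(self, out_queue: queue.Queue):
        super().__init__(daemon=True)
        self.out_queue = out_queue

    def run(self):
        for command in ["grab", "release"]:      # placeholder gestures
            time.sleep(0.5)
            self.out_queue.put(InputEvent("gesture", command, time.time()))


class VoiceSource(threading.Thread):
    """Stands in for speech recognition on a USB microphone."""

    def __init__(self, out_queue: queue.Queue):
        super().__init__(daemon=True)
        self.out_queue = out_queue

    def run(self):
        for command in ["menu", "select"]:       # placeholder voice commands
            time.sleep(0.7)
            self.out_queue.put(InputEvent("voice", command, time.time()))


def simulation_loop(events: queue.Queue, duration: float = 2.0):
    """Consumes the merged event stream the way a simulation system might."""
    end = time.time() + duration
    while time.time() < end:
        try:
            event = events.get(timeout=0.1)
        except queue.Empty:
            continue
        # A real VR application would update the scene here instead of printing.
        print(f"[{event.modality}] {event.command}")


if __name__ == "__main__":
    merged = queue.Queue()
    GestureSource(merged).start()
    VoiceSource(merged).start()
    simulation_loop(merged)

In this arrangement each modality runs as an independent producer, so adding or removing an input device does not change the simulation loop, which sees only a single queue of timestamped commands.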
Related Papers
50 records in total
  • [1] An implementation of multi-modal game interface based on PDAs
    Lee, Kue-Bum
    Kim, Jung-Hyun
    Hong, Kwang-Seok
    [J]. SERA 2007: 5TH ACIS INTERNATIONAL CONFERENCE ON SOFTWARE ENGINEERING RESEARCH, MANAGEMENT, AND APPLICATIONS, PROCEEDINGS, 2007, : 759 - +
  • [2] Implementation of ActiveCube for multi-modal interaction
    Itoh, Y
    Kitamura, Y
    Kawai, M
    Kishino, F
    [J]. HUMAN-COMPUTER INTERACTION - INTERACT'01, 2001, : 682 - 683
  • [3] Development of a multi-modal personal authentication interface
    Kim, Sung-Phil
    Kang, Jae-Hwan
    Jo, Young Chang
    Oakley, Ian
    [J]. 2017 ASIA-PACIFIC SIGNAL AND INFORMATION PROCESSING ASSOCIATION ANNUAL SUMMIT AND CONFERENCE (APSIPA ASC 2017), 2017, : 712 - 715
  • [4] Design and implementation of a multi-modal user interface of the virtual world database system (VWDB)
    Masunaga, Y
    Watanabe, C
    [J]. SEVENTH INTERNATIONAL CONFERENCE ON DATABASE SYSTEMS FOR ADVANCED APPLICATIONS, PROCEEDINGS, 2001, : 294 - 301
  • [5] TouchVR: a Wearable Haptic Interface for VR Aimed at Delivering Multi-modal Stimuli at the User's Palm
    Trinitatova, Daria
    Tsetserukou, Dzmitry
    [J]. SA'19: SIGGRAPH ASIA 2019 XR, 2019, : 42 - 43
  • [6] Multi-Modal Supervision Interface Concept for Marine Systems
    Nad, Dula
    Miskovic, Nikola
    Omerdic, Edin
    [J]. OCEANS 2019 - MARSEILLE, 2019,
  • [7] VIRTUAL REALITY INTERFACE DESIGN FOR MULTI-MODAL TELEOPERATION
    Kadavasal, Muthukkumar S.
    Oliver, James H.
    [J]. WINVR2009: PROCEEDINGS OF THE ASME/AFM WORLD CONFERENCE ON INNOVATIVE VIRTUAL REALITY - 2009, 2009, : 169 - 174
  • [8] A Group Recommendation System with a Multi-modal User Interface
    Yukawa, Masataka
    Hayashi, Yugo
    Ogawa, Hitoshi
    Kryssanov, Victor V.
    [J]. 6TH INTERNATIONAL CONFERENCE ON SOFT COMPUTING AND INTELLIGENT SYSTEMS, AND THE 13TH INTERNATIONAL SYMPOSIUM ON ADVANCED INTELLIGENT SYSTEMS, 2012, : 2158 - 2163
  • [9] Enabling Multi-modal Conversational Interface for Clinical Imaging
    Dayanandan, Kailas
    Lall, Brejesh
    [J]. EXTENDED ABSTRACTS OF THE 2024 CHI CONFERENCE ON HUMAN FACTORS IN COMPUTING SYSTEMS, CHI 2024, 2024,
  • [10] A multi-modal haptic interface for virtual reality and robotics
    Folgheraiter, Michele
    Gini, Giuseppina
    Vercesi, Dario
    [J]. JOURNAL OF INTELLIGENT & ROBOTIC SYSTEMS, 2008, 52 (3-4) : 465 - 488