An interactive VR system based on full-body tracking and gesture recognition

Times Cited: 0
Authors
Zeng, Xia [1 ]
Sang, Xinzhu [1 ]
Chen, Duo [1 ]
Wang, Peng [1 ]
Guo, Nan [1 ]
Yan, Binbin [1 ]
Wang, Kuiru [1 ]
Affiliations
[1] Beijing Univ Posts & Telecommun, State Key Lab Informat Photon & Opt Commun, POB 72, Beijing 100876, Peoples R China
Source
Funding
US National Science Foundation;
Keywords
virtual reality; human-centered interaction; full-body tracking; gesture recognition; Microsoft Kinect; Unity3D; natural user interface; walking-in-place;
DOI
10.1117/12.2247808
CLC number
O43 [Optics];
Subject classification code
070207; 0803;
Abstract
Most current virtual reality (VR) interactions are realized with hand-held input devices, which leads to a low degree of presence. Other solutions use sensors such as Leap Motion to recognize users' gestures and enable more natural interaction, but navigation in these systems remains a problem: because only part of the user's body is represented in the synthetic environment, they fail to map actual walking to virtual walking. We therefore propose a system in which users walk around the virtual environment as a humanoid model, selecting menu items and manipulating virtual objects with natural hand gestures. Using a Kinect depth camera, the system tracks the user's joints and maps them to a full virtual body that follows the tracked user's movements. The movements of the feet are detected to determine whether the user is in a walking state, so that the model's walking in the virtual world can be started and stopped by means of animation control in the Unity engine. This method frees the user's hands compared with the traditional navigation approach based on a hand-held device. We use point-cloud data obtained from the Kinect depth camera to recognize user gestures such as swiping, pressing, and manipulating virtual objects. Combining full-body tracking and gesture recognition with Kinect, we realize an interactive VR system in the Unity engine with a high degree of presence.
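The walking-in-place mechanic described in the abstract can be sketched in Unity C#. This is a minimal illustration under stated assumptions, not the authors' implementation: the foot-joint heights are assumed to be fed in each frame from the Kinect skeleton stream, and the component name, the IsWalking animator parameter, and all thresholds are hypothetical.

```csharp
using UnityEngine;

// Hypothetical sketch of walking-in-place detection. Foot heights are
// assumed to be supplied each frame from the Kinect skeleton stream;
// the thresholds are illustrative, not taken from the paper.
public class WalkInPlaceDetector : MonoBehaviour
{
    public Animator avatarAnimator;      // Animator on the humanoid model
    public float liftThreshold = 0.05f;  // metres a foot must rise to count as a step
    public float stopTimeout = 0.6f;     // seconds without a step before walking stops

    // Updated externally from the tracked foot joints each frame.
    public float leftFootY;
    public float rightFootY;

    private float baseLeftY, baseRightY; // resting foot heights
    private float lastStepTime = float.NegativeInfinity;

    void Start()
    {
        // Capture the resting foot heights, assuming the user stands still at startup.
        baseLeftY = leftFootY;
        baseRightY = rightFootY;
    }

    void Update()
    {
        // Register a step when either foot rises past the lift threshold.
        bool stepping = (leftFootY - baseLeftY) > liftThreshold
                     || (rightFootY - baseRightY) > liftThreshold;
        if (stepping)
            lastStepTime = Time.time;

        // Start the walk animation on a step; stop it after a quiet period.
        avatarAnimator.SetBool("IsWalking", Time.time - lastStepTime < stopTimeout);
    }
}
```

Driving the animation through an Animator boolean matches the abstract's description of activating and stopping the model's walk via animation control in the Unity engine.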
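A gesture such as the swipe mentioned in the abstract can be sketched similarly. Note the substitution: the paper recognizes gestures from Kinect point-cloud data, whereas the stand-in below detects a swipe from the tracked hand joint's velocity, a deliberately simpler technique; all names and thresholds are assumptions.

```csharp
using UnityEngine;

// Hypothetical swipe detector: classifies a fast, mostly horizontal hand
// movement as a left or right swipe. The hand position is assumed to be
// fed each frame from the Kinect hand joint.
public class SwipeDetector : MonoBehaviour
{
    public float minSwipeSpeed = 1.2f; // lateral hand speed (m/s) to count as a swipe
    public Vector3 handPosition;       // updated externally from the tracked hand joint

    private Vector3 lastHandPosition;

    void Start()
    {
        lastHandPosition = handPosition; // avoid a spurious velocity spike on frame one
    }

    void Update()
    {
        Vector3 velocity = (handPosition - lastHandPosition) / Time.deltaTime;
        lastHandPosition = handPosition;

        // A swipe is dominated by horizontal motion above the speed threshold.
        if (Mathf.Abs(velocity.x) > minSwipeSpeed &&
            Mathf.Abs(velocity.x) > Mathf.Abs(velocity.y))
        {
            Debug.Log(velocity.x > 0f ? "Swipe right" : "Swipe left");
        }
    }
}
```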
Pages: 7
Related papers
50 records in total
  • [21] Cyclops: Wearable and Single-Piece Full-Body Gesture Input Devices
    Chan, Liwei
    Hsieh, Chi-Hao
    Chen, Yi-Ling
    Yang, Shuo
    Huang, Da-Yuan
    Liang, Rong-Hao
    Chen, Bing-Yu
    CHI 2015: PROCEEDINGS OF THE 33RD ANNUAL CHI CONFERENCE ON HUMAN FACTORS IN COMPUTING SYSTEMS, 2015, : 3001 - 3010
  • [22] Insights on the Impact of Physical Impairments in Full-Body Motion Gesture Elicitation Studies
    Altakrouri, Bashar
    Burmeister, Daniel
    Boldt, Dennis
    Schrader, Andreas
    PROCEEDINGS OF THE NORDICHI '16: THE 9TH NORDIC CONFERENCE ON HUMAN-COMPUTER INTERACTION - GAME CHANGING DESIGN, 2016,
  • [23] A Study for the Identification of a Full-Body Gesture Language for Enabling Natural User Interaction
    Cespedes-Hernandez, David
    Gonzalez-Calleros, Juan Manuel
    HUMAN-COMPUTER INTERACTION, HCI-COLLAB 2019, 2019, 1114 : 42 - 56
  • [24] Affordable Personalized, Immersive VR Motor Rehabilitation System with Full Body Tracking
    Adolf, Jindrich
    Dolezal, Jaromir
    Lhotska, Lenka
    PHEALTH 2019, 2019, 261 : 75 - 81
  • [25] Gesture recognition using laser-based tracking system
    Perrin, S
    Cassinelli, A
    Ishikawa, M
    SIXTH IEEE INTERNATIONAL CONFERENCE ON AUTOMATIC FACE AND GESTURE RECOGNITION, PROCEEDINGS, 2004, : 541 - 546
  • [26] Integrating Head and Full-Body Tracking for Embodiment in Virtual Characters
    Borland, David
    2013 IEEE VIRTUAL REALITY CONFERENCE (VR), 2013, : 81 - 82
  • [27] An Adaptive Superpixel Based Hand Gesture Tracking and Recognition System
    Zhu, Hong-Min
    Pun, Chi-Man
    SCIENTIFIC WORLD JOURNAL, 2014,
  • [28] Interaction Design of Full-Body Interactive Play Experiences for Children with Autism
    Crowell, Ciera
    PROCEEDINGS OF THE 2018 ANNUAL SYMPOSIUM ON COMPUTER-HUMAN INTERACTION IN PLAY COMPANION EXTENDED ABSTRACTS (CHI PLAY 2018), 2018, : 11 - 15
  • [29] A Full-Body Layered Deformable Model for Automatic Model-Based Gait Recognition
    Lu, Haiping
    Plataniotis, Konstantinos N.
    Venetsanopoulos, Anastasios N.
    EURASIP Journal on Advances in Signal Processing, 2008
  • [30] Visual-tactile fusion gait recognition based on full-body gait model
    Li, Y.
    Ji, W.
    Dai, S.
    Harbin Gongye Daxue Xuebao/Journal of Harbin Institute of Technology, 2022, 54 (01): : 88 - 95