Precision position tracking in virtual reality environments using sensor networks

Cited by: 3
Authors: Gulez, Tauseef [1]; Kavakli, Manolya [1]
Institution: [1] Macquarie Univ, Dept Comp, Interact Syst & Virtual Reality Res Grp, Sydney, NSW 2109, Australia
DOI: 10.1109/ISIE.2007.4374914
CLC number: TP [automation technology, computer technology]
Discipline code: 0812
Abstract:
In an immersive, interactive virtual reality (VR) environment, a real human can be incorporated into a virtual 3D scene to navigate a robotic device within that scene. This has useful applications in rehabilitation. The non-destructive nature of VR makes it an ideal testbed for many applications and a prime candidate for use in rehabilitation robotics simulation. The key challenge is to accurately localise the movement of the object in reality and map it to its corresponding position in the 3D VR scene. To solve the localisation problem we have formed an online-mode vision sensor network, which tracks the object's real Euclidean position and sends the information back to the VR scene. A precision position tracking (PPT) system has been installed to track the object. We have previously presented a solution to the sensor relevance establishment problem ([10], [11]), in which the most relevant sensing action is obtained from a group of sensors. In this paper we apply the same technique to the VR system. The problem can be broken down into two steps. In step one, the relevant sensor type is discovered based upon the IEEE 1451.4 Transducer Electronic Data Sheet (TEDS) description model. TEDS is used to discover the sensor types, their geographical locations, and additional information such as uncertainty, measurement functions, and the information fusion rules necessary to fuse multi-sensor data. In step two, the most useful sensor information is selected using the Kullback-Leibler Divergence (KLD) method. In this study we conduct two experiments that address the localisation problem. In the first experiment, a 3D VR environment is created using the real-time distributed robotics software Player/Stage/Gazebo, and a simulated PPT camera system is used to localise a simulated autonomous mobile robot within the 3D environment. In the second experiment, a real user is placed in a cave-like 3D VR environment and a real PPT camera system is used to localise the user's physical actions in reality. The physical actions of the real user are then used to control the robotic device in VR.
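The abstract's second step (KLD-based selection of the most useful sensor) can be illustrated with a minimal sketch, given below. It assumes a discrete belief over the object's position: each candidate sensor's predicted measurement likelihood is folded into that belief with a Bayes update, and the sensor whose update diverges most from the prior (the largest KL divergence, i.e. the largest information gain) is selected. The position grid, the camera names, and the likelihood shapes are illustrative assumptions only; the paper's actual formulation follows the authors' earlier work in [10], [11].

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    # D(p || q) between two discrete distributions defined on the same grid.
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

def select_sensor(prior, likelihoods):
    # Pick the sensor whose predicted observation, folded into the belief
    # over the object's position via a Bayes update, yields the largest
    # KL divergence from the prior (largest information gain).
    best_sensor, best_gain = None, -np.inf
    for sensor, likelihood in likelihoods.items():
        posterior = prior * np.asarray(likelihood, dtype=float)
        posterior = posterior / posterior.sum()
        gain = kl_divergence(posterior, prior)
        if gain > best_gain:
            best_sensor, best_gain = sensor, gain
    return best_sensor, best_gain

# Toy 1-D position grid and three hypothetical PPT cameras (illustrative only).
prior = np.full(10, 0.1)                         # uniform belief over 10 position cells
likelihoods = {
    "camera_A": [1, 1, 1, 1, 1, 1, 1, 1, 1, 1],  # uninformative: no gain
    "camera_B": [0, 0, 0, 1, 4, 4, 1, 0, 0, 0],  # moderately peaked
    "camera_C": [0, 0, 0, 0, 9, 1, 0, 0, 0, 0],  # sharply peaked: largest gain
}
print(select_sensor(prior, likelihoods))
```

On this toy input the rule picks camera_C, whose sharply peaked likelihood collapses the uniform prior the most, while the uninformative camera_A yields a near-zero gain.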
Pages: 1997-2003
Number of pages: 7
Related papers (50 in total):
  • [11] Virtual reality in puppet game using depth sensor of gesture recognition and tracking
    Yohannes, Ervin
    Shih, Timothy K.
    Utaminingrum, Fitri
    [J]. Journal of Computers (Taiwan), 2020, 31 (05) : 89 - 98
  • [12] Understanding of Virtual Reality with Visual Sensor Networks
    Hou, Pengcheng
    [J]. 2018 3RD INTERNATIONAL CONFERENCE ON COMMUNICATION, IMAGE AND SIGNAL PROCESSING, 2019, 1169
  • [13] Enhanced location tracking in sensor fusion-assisted virtual reality micro-manipulation environments
    Prada, John David Prieto
    Im, Jintaek
    Oh, Hyondong
    Song, Cheol
    [J]. PLOS ONE, 2021, 16 (12):
  • [14] Using Deep Learning to Increase Eye-Tracking Robustness, Accuracy, and Precision in Virtual Reality
    Barkevich, Kevin
    Bailey, Reynold
    Diaz, Gabriel J.
    [J]. PROCEEDINGS OF THE ACM ON COMPUTER GRAPHICS AND INTERACTIVE TECHNIQUES, 2024, 7 (02)
  • [15] THE POWER OF SIGHT: USING EYE TRACKING TO ASSESS LEARNING EXPERIENCE (LX) IN VIRTUAL REALITY ENVIRONMENTS
    Luis Soler, Jose
    Ferreira Cavalcanti, Janaina
    Contero, Manuel
    Alcaniz, Mariano
    [J]. INTED2017: 11TH INTERNATIONAL TECHNOLOGY, EDUCATION AND DEVELOPMENT CONFERENCE, 2017 : 8684 - 8689
  • [16] Mediated presence: virtual reality, mixed environments and social networks
    Anna Spagnolli
    Matthew Lombard
    Luciano Gamberini
    [J]. Virtual Reality, 2009, 13 : 137 - 139
  • [17] Mediated presence: virtual reality, mixed environments and social networks
    Spagnolli, Anna
    Lombard, Matthew
    Gamberini, Luciano
    [J]. VIRTUAL REALITY, 2009, 13 (03) : 137 - 139
  • [18] Virtual reality and mixed reality for virtual learning environments
    Pan, ZG
    Cheok, AD
    Yang, HW
    Zhu, JJ
    Shi, JY
    [J]. COMPUTERS & GRAPHICS-UK, 2006, 30 (01) : 20 - 28
  • [19] Extending Virtual Reality Display Wall Environments Using Augmented Reality
    Nishimoto, Arthur
    Johnson, Andrew
    [J]. ACM CONFERENCE ON SPATIAL USER INTERACTION (SUI 2019), 2019,
  • [20] Evaluation of Adaptability to Unfamiliar Environments Using Virtual Reality
    Isezaki, Takashi
    Watanabe, Tomoki
    [J]. NTT Technical Review, 2021, 19 (03) : 49 - 52