Environment-Independent VR Development

Cited by: 0
Authors: Kreylos, Oliver [1]
Affiliation: [1] Univ Calif Davis, WM Keck Ctr Act Visualizat Earth Sci, Davis, CA 95616 USA
Funding: National Science Foundation (USA)
Keywords: (none listed)
DOI: not available
Chinese Library Classification: TP18 [Artificial Intelligence Theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract
Vrui (Virtual Reality User Interface) is a C++ development toolkit for highly interactive, high-performance VR applications, aimed at producing completely environment-independent software. Vrui not only hides differences between display systems and multi-pipe rendering approaches, but also decouples applications from the input devices available in any given environment. Instead of referencing input devices directly, e.g., by name, Vrui applications work with an intermediate tool layer that expresses interaction with input devices at a higher semantic level. This allows environment integrators to provide tools that map the available input devices to semantic events such as selection, location, dragging, navigation, and menu selection in the most efficient and intuitive way possible. As a result, Vrui applications run effectively on widely different VR environments, ranging from desktop systems with only keyboard and mouse to fully immersive multi-screen systems with multiple 6-DOF input devices. On a desktop, Vrui applications do not run in a "simulator" mode that is mostly useful for debugging; they are fully usable and look and feel similar to native desktop applications.
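To illustrate the tool-layer concept described in the abstract, the following is a minimal, hypothetical C++ sketch. It does not use the real Vrui API; the ToolLayer class, ToolEvent enum, and Pose struct are invented names that only show how an application can react to semantic events (selection, navigation, etc.) while the surrounding environment decides which physical device triggers them.

// Minimal, hypothetical sketch (not the actual Vrui API) of the tool-layer idea:
// the application reacts to semantic events (select, drag, navigate) and never
// names concrete input devices; the environment binds devices to tools.

#include <functional>
#include <map>
#include <iostream>

struct Pose { float pos[3]; float quat[4]; };   // 6-DOF state delivered by a tool

// Semantic events a tool can emit, independent of the physical device behind it.
enum class ToolEvent { Select, DragStart, Drag, DragEnd, Navigate, MenuSelect };

class ToolLayer {
public:
    using Handler = std::function<void(const Pose&)>;

    // The application registers what should happen for each semantic event.
    void bind(ToolEvent e, Handler h) { handlers[e] = std::move(h); }

    // The environment integrator (or a configuration file) maps a concrete
    // device button/tracker to a semantic event and forwards its state here.
    void emit(ToolEvent e, const Pose& p) {
        auto it = handlers.find(e);
        if (it != handlers.end()) it->second(p);
    }

private:
    std::map<ToolEvent, Handler> handlers;
};

int main() {
    ToolLayer tools;

    // Application code: purely semantic, no device names anywhere.
    tools.bind(ToolEvent::Select, [](const Pose& p) {
        std::cout << "pick object near (" << p.pos[0] << ", "
                  << p.pos[1] << ", " << p.pos[2] << ")\n";
    });
    tools.bind(ToolEvent::Navigate, [](const Pose&) {
        std::cout << "update navigation transform\n";
    });

    // Environment side: on a desktop this could be a mouse click at a ray-cast
    // position; in a CAVE it could be a wand button with a 6-DOF tracker pose.
    Pose wand{{0.2f, 1.1f, -0.5f}, {0.0f, 0.0f, 0.0f, 1.0f}};
    tools.emit(ToolEvent::Select, wand);
    tools.emit(ToolEvent::Navigate, wand);
}

In this sketch the desktop and the immersive environment feed the same ToolLayer, which is the sense in which the application itself stays environment-independent.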
Pages: 901 - 912 (12 pages)
Related Papers (50 records in total; entries [31]-[39] shown)
• [31] mmASL: Environment-Independent ASL Gesture Recognition Using 60 GHz Millimeter-wave Signals. Santhalingam, Panneer Selvam; Hosain, Al Amin; Zhang, Ding; Pathak, Parth; Rangwala, Huzefa; Kushalnagar, Raja. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies (IMWUT), 2020, 4(1).
• [32] Environment-independent textile fiber identification using Wi-Fi channel state information. Zhang, Huihui; Gu, Lin. Textile Research Journal, 2024, 94(11-12): 1316 - 1333.
• [33] Real-time Identification of Rogue WiFi Connections Using Environment-Independent Physical Features. Liu, Pengfei; Yang, Panlong; Song, Wen-Zhan; Yan, Yubo; Li, Xiang-Yang. IEEE Conference on Computer Communications (IEEE INFOCOM 2019), 2019: 190 - 198.
• [34] A Hybrid Image Augmentation Technique for User- and Environment-Independent Hand Gesture Recognition Based on Deep Learning. Awaluddin, Baiti-Ahmad; Chao, Chun-Tang; Chiou, Juing-Shian. Mathematics, 2024, 12(9).
• [35] Flycon: Real-time Environment-independent Multi-view Human Pose Estimation with Aerial Vehicles. Nageli, Tobias; Oberholzer, Samuel; Pluss, Silvan; Alonso-Mora, Javier; Hilliges, Otmar. SIGGRAPH Asia '18: SIGGRAPH Asia 2018 Technical Papers, 2018.
• [36] Flycon: Real-time Environment-independent Multi-view Human Pose Estimation with Aerial Vehicles. Nageli, Tobias; Oberholzer, Samuel; Pluss, Silvan; Alonso-Mora, Javier; Hilliges, Otmar. ACM Transactions on Graphics, 2018, 37(6).
• [37] Poster: MobiEar - Building an Environment-independent Acoustic Sensing Platform for the Deaf using Deep Learning. Liu, Sicong; Du, Junzhao. MobiSys'16: Companion Publication of the 14th Annual International Conference on Mobile Systems, Applications, and Services, 2016: 50 - 50.
• [38] Development of Intelligent Workout Environment for VR Devices. Aslanyan, Minas. 2024 IEEE Global Engineering Education Conference (EDUCON 2024), 2024.
• [39] Towards Environment-Independent Activity Recognition Using Wi-Fi CSI with an Encoder-Decoder Network. Sugimoto, Yu; Rizk, Hamada; Uchiyama, Akira; Yamaguchi, Hirozumi. Proceedings of the 2023 8th Workshop on Body-Centric Computing Systems (BodySys 2023), 2023: 13 - 18.