Supporting interactive data exploration for GIS planning tasks with a multi-modal virtual environment

Cited by: 1
Authors
Harding, C [1 ]
Newcomb, M [1 ]
Affiliation
[1] Iowa State Univ, VRAC, Human Comp Interact Program, Ames, IA 50011 USA
DOI
10.1109/HAVE.2004.1391886
CLC Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
We are investigating how combining 3D stereovision, touch and sound into a multi-modal Virtual Environment can be used to improve interaction with and analysis of spatial (3D) data. We specifically focus on providing tools for planning new structures (such as schools or pipelines) within a system of spatial constraints. The planning task is typically performed within a GIS (Geographic Information System), where it is called suitability analysis. Our proof-of-concept Virtual Environment uses 3D stereo, force-feedback and interactive sound. We use a set of typical GIS raster and vector data and drape the data on a touchable Digital Elevation Model (3D terrain). In addition, the system lets the user configure a planning environment where the relative importance of each GIS layer can be expressed via force and/or via sound. For example, when digitizing a path, the user could model proximity to objects such as roads or houses as repulsion (with different intensities, depending on the object's importance) and hear land-cover values as sound (pitch). We can then substitute multiple layers of potentially cluttering 2D maps with a combination of vision, force (gravity, friction) and sound (pitch, tempo, timbre) and facilitate the fusion of data streams from different sensory modalities. With the help of students in an ISU GIS class, we intend to formally evaluate this setup and compare it with the traditional GIS suitability analysis.
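The layer-to-modality mapping described in the abstract — proximity to important objects rendered as repulsive force, land-cover values rendered as pitch — can be sketched as two simple transfer functions. This is a hypothetical illustration of the general technique, not the paper's implementation; the function names, the linear force falloff, and the parameter ranges are all assumptions.

```python
def repulsion_force(distance_m, importance, max_force=1.0, falloff_m=50.0):
    """Repulsive force magnitude felt near an object (e.g. a road or house).

    Force decays linearly from max_force at the object to zero at
    falloff_m; `importance` in [0, 1] scales it so that more important
    constraints push back harder. (Hypothetical mapping.)
    """
    if distance_m >= falloff_m:
        return 0.0
    return max_force * importance * (1.0 - distance_m / falloff_m)


def landcover_pitch(value, v_min=0.0, v_max=255.0, f_low=220.0, f_high=880.0):
    """Map a land-cover raster value to a pitch in Hz.

    Uses a logarithmic (equal-ratio) scale so equal steps in the data
    sound like equal musical intervals. (Hypothetical mapping.)
    """
    t = (value - v_min) / (v_max - v_min)
    return f_low * (f_high / f_low) ** t


# Example: digitizing a path 25 m from a high-importance road while
# crossing a land-cover cell with value 128.
force = repulsion_force(25.0, importance=1.0)   # strong push-back
pitch = landcover_pitch(128.0)                  # mid-range tone in Hz
```

In a real system these scalars would drive a haptic device's force output and an audio synthesizer's oscillator each frame; the point of the sketch is only that each GIS layer reduces to a per-sample value-to-parameter function.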
Pages: 81 - 86
Page count: 6