SUPPORTING NAVIGATION OF OUTDOOR SHOPPING COMPLEXES FOR VISUALLY-IMPAIRED USERS THROUGH MULTI-MODAL DATA FUSION

Cited by: 0
Authors:
Paladugu, Archana [1 ]
Chandakkar, Parag S. [1 ]
Zhang, Peng [1 ]
Li, Baoxin [1 ]
Affiliations:
[1] Arizona State Univ, Tempe, AZ 85287 USA
Keywords:
Visual impairment; Outdoor navigation; Touch interface; GPS; Design; User study; Registration
DOI:
not available
CLC classification:
TP31 [Computer Software]
Discipline codes:
081202; 0835
Abstract:
Outdoor shopping complexes (OSCs) are extremely difficult for people with visual impairment to navigate. Existing GPS devices are mostly designed for roadside navigation and seldom transition well into an OSC-like setting. We report our study of the challenges faced by blind users in navigating OSCs, conducted through the development of a new mobile application named iExplore. We first report an exploratory study aimed at deriving specific design principles for the system by learning the unique challenges of the problem. We then present a methodology for deriving the information needed to build iExplore, followed by experimental validation of the technology by a group of visually impaired users in a local outdoor shopping center. User feedback and other performance metrics collected from the experiments suggest that iExplore, while at a very early stage, has the potential to fill a practical gap in existing assistive technologies for the visually impaired.
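The paper itself does not reproduce its navigation algorithms in this record, but an app like iExplore must at minimum turn a pair of GPS fixes (the user's position and a destination, e.g. a store entrance) into a distance and a compass heading that can be spoken aloud. As a hedged illustration only (standard haversine and initial-bearing formulas, not iExplore's actual implementation), that core computation looks like:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS fixes."""
    R = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial compass bearing (0-360 deg, 0 = north) from fix 1 to fix 2."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dl = math.radians(lon2 - lon1)
    x = math.sin(dl) * math.cos(p2)
    y = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
    return (math.degrees(math.atan2(x, y)) + 360.0) % 360.0
```

In practice the distance and bearing would be combined with the phone's compass and rendered through a touch or audio interface; the abstract's point is that consumer GPS error (several meters) is tolerable on a roadside but problematic at the storefront scale of an OSC.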
Pages: 7