A Multi-Sensor Fusion System for Improving Indoor Mobility of the Visually Impaired

Times Cited: 0
Authors
Zhao, Yu [1 ]
Huang, Ran [1 ]
Hu, Biao [1 ]
Affiliations
[1] Beijing Univ Chem Technol, Coll Informat Sci & Technol, Beijing 100029, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Assistive navigation; semantic SLAM; visually impaired; BLIND PEOPLE; NAVIGATION; FRAMEWORK;
DOI
10.1109/cac48633.2019.8996578
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
Independent movement in an unknown indoor environment is a challenging task for the visually impaired. In this paper, by exploiting the connectivity of the corridor (room doors and stairs are all connected by the corridor), we propose an assistive navigation system that helps visually impaired users navigate in corridor environments. Based on semantic simultaneous localization and mapping (SLAM), the corridor area is identified and represented in a semantic map. Semantic path planning is then performed according to the lowest-energy-cost principle while taking safety into consideration. The YOLO neural network is employed to detect and identify common indoor landmarks such as toilets, EXIT signs, and staircases, and the system gives voice feedback about objects along its line of sight during movement. This interaction enhances the user's perception of surrounding objects and places, improving travel decisions. A TurtleBot2 robot equipped with a laptop, an RPLIDAR A2, and a Microsoft Kinect V1 is used to validate the localization, mapping, and navigation modules, while the perception module uses a ZED stereo camera to capture objects and landmarks along the line of sight. The software modules of the system are implemented in the Robot Operating System (ROS) and tested in our lab building.
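The abstract describes the planning step (lowest energy cost with safety taken into account) only at a high level and does not give the cost function. The following Python sketch is purely illustrative: it assumes a hypothetical corridor-connectivity graph whose edge weight is travel distance plus a made-up safety penalty, and resolves the route with Dijkstra's algorithm. All node names, weights, and the plan_path helper are assumptions for illustration, not the authors' implementation.

# Minimal sketch (not the authors' code): semantic path planning over a
# corridor-connectivity graph. Nodes are semantic landmarks (rooms, corridor
# junctions, toilets, EXIT, staircases); edge weights combine travel distance
# with a hypothetical safety penalty, and Dijkstra's algorithm returns the
# lowest-cost route.
import heapq

def plan_path(graph, start, goal):
    """graph: {node: [(neighbor, distance, safety_penalty), ...]}"""
    # Priority queue of (accumulated cost, node, path so far).
    queue = [(0.0, start, [start])]
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for neighbor, distance, safety_penalty in graph.get(node, []):
            if neighbor not in visited:
                # Energy cost = distance plus a penalty for unsafe segments
                # (e.g. staircases); the additive weighting is an assumption.
                edge_cost = distance + safety_penalty
                heapq.heappush(queue, (cost + edge_cost, neighbor, path + [neighbor]))
    return float("inf"), []

# Toy corridor map: room -> corridor junctions -> destination landmarks.
corridor_graph = {
    "Room_305": [("Corridor_A", 4.0, 0.0)],
    "Corridor_A": [("Corridor_B", 10.0, 0.0), ("Staircase", 6.0, 5.0)],
    "Corridor_B": [("Toilet", 3.0, 0.0), ("EXIT", 8.0, 0.0)],
    "Staircase": [("EXIT", 4.0, 5.0)],
}

if __name__ == "__main__":
    cost, route = plan_path(corridor_graph, "Room_305", "EXIT")
    print(f"Lowest-cost route ({cost:.1f}): {' -> '.join(route)}")

Running this toy example prints a route through Corridor_B rather than the staircase, because the staircase edges carry the extra safety penalty; this is how a safety term can steer planning away from hazardous segments while still minimizing overall cost.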
Pages: 2950 - 2955
Number of Pages: 6