Reducing the minimum range of a RGB-depth sensor to aid navigation in visually impaired individuals

Cited: 15
Authors
Yang, Kailun [1 ]
Wang, Kaiwei [1 ]
Chen, Hao [1 ]
Bai, Jian [1 ]
Affiliation
[1] Zhejiang Univ, Coll Opt Sci & Engn, State Key Lab Modern Opt Instrumentat, Hangzhou 310027, Zhejiang, Peoples R China
Keywords
AWARENESS; VISION;
DOI
10.1364/AO.57.002809
Chinese Library Classification
O43 [Optics];
Subject classification codes
070207; 0803;
Abstract
The introduction of RGB-depth (RGB-D) sensors has brought revolutionary potential to the field of navigational assistance for the visually impaired. However, RGB-D sensors are limited by a minimum detectable distance of about 800 mm. This paper proposes an effective approach to decrease the minimum range for navigational assistance based on the RealSense R200 RGB-D sensor. A large-scale stereo matching between the two infrared (IR) images and a cross-modal stereo matching between one IR image and the RGB image are incorporated for short-range depth acquisition. The minimum range reduction is critical not only for avoiding obstacles up close, but also for enhancing traversability awareness. Overall, the minimum detectable distance of the RealSense is reduced from 650 mm to 60 mm with qualified accuracy. A traversable line is generated and fed back to visually impaired individuals through stereo sound. The usefulness and reliability of the approach are demonstrated by a comprehensive set of experiments and field tests in real-world scenarios involving real visually impaired participants. (C) 2018 Optical Society of America
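The short-range limit described in the abstract follows directly from stereo geometry: depth Z = f·B/d, where f is the focal length in pixels, B the baseline, and d the disparity. Close objects produce large disparities, so a fixed disparity search range caps how near the sensor can measure; this is why the paper's "large-scale" stereo matching is needed to push the minimum range down. A minimal sketch of this relation, using illustrative focal length and baseline values (assumed for demonstration, not the R200's exact calibration):

```python
# Relation between a stereo matcher's disparity search range and the
# closest distance it can measure: Z = f * B / d.

def max_disparity(focal_px, baseline_mm, z_min_mm):
    """Disparity (pixels) produced by an object at distance z_min_mm."""
    return focal_px * baseline_mm / z_min_mm

def min_range(focal_px, baseline_mm, d_max_px):
    """Closest measurable distance (mm) given a disparity search range."""
    return focal_px * baseline_mm / d_max_px

# Illustrative (assumed) parameters: focal length 600 px, IR baseline 70 mm.
f, b = 600.0, 70.0

# A typical 64-pixel search range leaves a minimum range of ~650 mm:
print(min_range(f, b, 64))        # -> 656.25
# Seeing down to 60 mm requires searching disparities up to ~700 px,
# hence the need for large-scale stereo matching:
print(max_disparity(f, b, 60.0))  # -> 700.0
```

The design point: halving the minimum range doubles the required disparity search range (and the matching cost), which is the trade-off the proposed approach addresses.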
Pages: 2809-2819 (11 pages)
Related papers (50 total)
  • [1] Fusion of Millimeter wave Radar and RGB-Depth sensors for assisted navigation of the visually impaired
    Long, Ningbo
    Wang, Kaiwei
    Cheng, Ruiqi
    Yang, Kailun
    Bai, Jian
MILLIMETRE WAVE AND TERAHERTZ SENSORS AND TECHNOLOGY XI, 2018, 10800
  • [2] Navigation Assistance for the Visually Impaired Using RGB-D Sensor With Range Expansion
    Aladren, A.
    Lopez-Nicolas, G.
    Puig, Luis
    Guerrero, Josechu J.
IEEE SYSTEMS JOURNAL, 2016, 10(03): 922-932
  • [3] Assisting the visually impaired: multitarget warning through millimeter wave radar and RGB-depth sensors
    Long, Ningbo
    Wang, Kaiwei
    Cheng, Ruiqi
    Yang, Kailun
    Hu, Weijian
    Bai, Jian
JOURNAL OF ELECTRONIC IMAGING, 2019, 28(01)
  • [4] A Navigation System for the Visually Impaired: A Fusion of Vision and Depth Sensor
    Kanwal, Nadia
    Bostanci, Erkan
    Currie, Keith
    Clark, Adrian F.
APPLIED BIONICS AND BIOMECHANICS, 2015, 2015
  • [5] Depth-Aided Robust Localization Approach for Relative Navigation using RGB-Depth Camera and LiDAR Sensor
    Song, Ha-Ryong
    Choi, Won-sub
    Kim, Hae-dong
2014 INTERNATIONAL CONFERENCE ON CONTROL, AUTOMATION AND INFORMATION SCIENCES (ICCAIS 2014), 2014: 105-110
  • [6] An Indoor Navigation Aid for the Visually Impaired
    Zhang, He
    Ye, Cang
2016 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND BIOMIMETICS (ROBIO), 2016: 467-472
  • [7] NAVI: Navigation Aid for the Visually Impaired
    Sharma, Tarun
    Apoorva, J. H. M.
    Lakshmanan, Ramananathan
    Gogia, Prakruti
    Kondapaka, Manoj
2016 IEEE INTERNATIONAL CONFERENCE ON COMPUTING, COMMUNICATION AND AUTOMATION (ICCCA), 2016: 971-976
  • [8] Unifying obstacle detection, recognition, and fusion based on millimeter wave radar and RGB-depth sensors for the visually impaired
    Long, Ningbo
    Wang, Kaiwei
    Cheng, Ruiqi
    Hu, Weijian
    Yang, Kailun
REVIEW OF SCIENTIFIC INSTRUMENTS, 2019, 90(04)
  • [9] Interactive sonification of U-depth images in a navigation aid for the visually impaired
    Skulimowski, Piotr
    Owczarek, Mateusz
    Radecki, Andrzej
    Bujacz, Michal
    Rzeszotarski, Dariusz
    Strumillo, Pawel
JOURNAL ON MULTIMODAL USER INTERFACES, 2019, 13(03): 219-230