Indoor navigation system for human based on multi-site vibrotactile feedback

Cited by: 0
|
Authors
Shi S. [1 ]
Song G. [1 ]
Liu S. [2 ]
Wei Z. [1 ]
Li S. [1 ]
Song A. [1 ]
Affiliations
[1] School of Instrument Science and Engineering, Southeast University, Nanjing
[2] Electric Power Dispatching and Control Center, Jiangsu Electric Power Company, Nanjing
Keywords
Autonomous navigation robot; Human body tracking; Human-robot cooperative navigation algorithm; Multi-site tactile feedback
DOI
10.3969/j.issn.1001-0505.2019.01.015
Abstract
To improve the performance of traditional navigation systems for humans, a navigation system based on multi-site vibrotactile feedback is designed. The system is built on an autonomous mobile navigation robot. A human-robot cooperative navigation algorithm that gives full consideration to the user's autonomy and comfort is applied, allowing the user to freely decide his/her own walking speed. Statistical methods are used to study the vibrotactile perception ability of different body parts, and the fingers and wrists are found to be well suited to perceiving vibrotactile signals. Therefore, a multi-site vibrotactile device is developed as the human-robot interface. Together, this interface and the human-robot cooperative navigation algorithm enable the system to provide autonomous navigation for people. The experimental results of the system performance validation show that the navigation system based on multi-site vibrotactile feedback has good anti-jamming capability. The mean formation error is 0.23 m and the navigation speed is 0.13 m/s, outperforming both the kinaesthetic-feedback navigation system and the one based on single-site vibrotactile feedback. © 2019, Editorial Department of Journal of Southeast University. All rights reserved.
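The abstract only describes the system's behaviour, not the algorithm itself. The Python sketch below shows one plausible way such behaviour could be realised, assuming a simple distance-keeping formation, a left/right finger-or-wrist cue mapping, and a proportional following law; all names (select_vibration_site, robot_speed_command, mean_formation_error, SITES), thresholds, and gains are hypothetical and are not taken from the paper.

```python
import math

# Minimal sketch of the ideas summarised in the abstract. Site layout,
# thresholds, gains, and function names are illustrative assumptions.

SITES = ("left_wrist", "left_finger", "right_finger", "right_wrist")

def select_vibration_site(heading_error_rad, dead_zone_deg=5.0):
    """Map the angular error between the planned direction and the user's
    heading to one of the finger/wrist actuators (assumed cue mapping)."""
    deg = math.degrees(heading_error_rad)
    if abs(deg) < dead_zone_deg:
        return None                        # on course: no vibration cue
    side = "left" if deg < 0 else "right"
    part = "wrist" if abs(deg) > 30.0 else "finger"  # wrist for large corrections
    return f"{side}_{part}"

def robot_speed_command(distance_to_user, desired_distance, gain=0.8, v_max=0.5):
    """Proportional following law: the robot speeds up or slows down so that
    the user, not the robot, sets the walking pace (one plausible reading of
    'human-robot cooperative navigation')."""
    v = gain * (distance_to_user - desired_distance)
    return max(0.0, min(v_max, v))

def mean_formation_error(human_xy, robot_xy, desired_distance):
    """Mean absolute deviation of the human-robot distance from the desired
    following distance, one plausible definition of 'formation error'."""
    errors = [abs(math.hypot(hx - rx, hy - ry) - desired_distance)
              for (hx, hy), (rx, ry) in zip(human_xy, robot_xy)]
    return sum(errors) / len(errors)

# Example: a user drifting slightly to the right of the planned path
print(select_vibration_site(math.radians(12)))          # -> "right_finger"
print(robot_speed_command(1.4, desired_distance=1.0))   # -> ~0.32 m/s
```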
Pages: 101-109
Number of pages: 8