A Vision Dynamics Learning Approach to Robotic Navigation in Unstructured Environments

Cited by: 1
Authors
Ginerica, Cosmin [1 ]
Zaha, Mihai [1 ]
Floroian, Laura [1 ]
Cojocaru, Dorian [2 ]
Grigorescu, Sorin [1 ]
Affiliations
[1] Transilvania Univ Brasov, Robot Vis & Control Lab ROVIS, Brasov 500036, Romania
[2] Univ Craiova, Dept Automat Control, Elect & Mechatron, Craiova 200585, Romania
Keywords
deep learning; planning; recurrent neural network; robotics;
DOI
10.3390/robotics13010015
Chinese Library Classification
TP24 [Robotics]
Subject Classification Codes
080202; 1405
Abstract
Autonomous legged navigation in unstructured environments is still an open problem, requiring an intelligent agent to detect and react to potential obstacles in its vicinity. These obstacles range from vehicles, pedestrians, and immovable objects in structured environments, such as highway or city navigation, to unpredictable static and dynamic obstacles in unstructured environments, such as a forest road. The latter scenario is usually more difficult to handle due to its higher unpredictability. In this paper, we propose a vision dynamics approach to the path planning and navigation problem for a quadruped robot navigating in an unstructured environment, specifically on a forest road. Our vision dynamics approach is based on a recurrent neural network that uses an RGB-D sensor as its data source, constructing sequences of previous depth observations and predicting future observations over a finite time span. We compare our approach with other state-of-the-art obstacle-driven path planning algorithms and perform ablation studies to analyze the impact of architectural changes to our model components, demonstrating that our approach achieves superior performance in generating collision-free trajectories for the intelligent agent.
Pages: 15
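
The abstract describes the core technique at a high level: a recurrent network consumes a sequence of past depth observations and predicts future depth observations over a finite horizon, which a planner can then check for collisions. The sketch below illustrates what such a depth-dynamics predictor could look like in PyTorch. It is an assumption-based illustration only: the class name `DepthDynamicsRNN`, the 64x64 frame resolution, the latent size, the LSTM core, and the five-step horizon are all invented for this example and are not the authors' published architecture.

```python
# A minimal sketch of a depth-dynamics predictor in the spirit of the abstract:
# an RNN that encodes past depth frames and rolls forward to predict future
# frames over a finite horizon. All architectural details are illustrative.
import torch
import torch.nn as nn


class DepthDynamicsRNN(nn.Module):
    def __init__(self, latent_dim: int = 128, horizon: int = 5):
        super().__init__()
        self.horizon = horizon
        # Convolutional encoder: one 64x64 depth frame -> latent vector.
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, 4, stride=2, padding=1), nn.ReLU(),   # 64 -> 32
            nn.Conv2d(16, 32, 4, stride=2, padding=1), nn.ReLU(),  # 32 -> 16
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),  # 16 -> 8
            nn.Flatten(),
            nn.Linear(64 * 8 * 8, latent_dim),
        )
        # Recurrent core over the sequence of latent observations.
        self.rnn = nn.LSTM(latent_dim, latent_dim, batch_first=True)
        # Decoder: latent vector -> one 64x64 depth frame.
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 64 * 8 * 8),
            nn.Unflatten(1, (64, 8, 8)),
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 16, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(16, 1, 4, stride=2, padding=1),
        )

    def forward(self, past_depth: torch.Tensor) -> torch.Tensor:
        # past_depth: (batch, seq_len, 1, 64, 64) past depth observations.
        b, t, c, h, w = past_depth.shape
        z = self.encoder(past_depth.reshape(b * t, c, h, w)).reshape(b, t, -1)
        _, (hidden, cell) = self.rnn(z)  # summarize the observed past
        # Roll the RNN forward, feeding back its own latent predictions.
        frames, step = [], z[:, -1:, :]
        for _ in range(self.horizon):
            step, (hidden, cell) = self.rnn(step, (hidden, cell))
            frames.append(self.decoder(step.squeeze(1)))
        return torch.stack(frames, dim=1)  # (batch, horizon, 1, 64, 64)


# Usage: predict 5 future depth frames from 8 past frames.
model = DepthDynamicsRNN()
past = torch.randn(2, 8, 1, 64, 64)
future = model(past)
print(future.shape)  # torch.Size([2, 5, 1, 64, 64])
```

In a pipeline like the one the abstract outlines, candidate trajectories would be scored against the predicted depth frames and a collision-free one selected; that planning step is omitted from this sketch.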