Real-time deep learning approach to visual servo control and grasp detection for autonomous robotic manipulation

Cited by: 34
Authors
Ribeiro, Eduardo Godinho [1]
Mendes, Raul de Queiroz [1]
Grassi Jr, Valdir [1]
Affiliations
[1] Univ Sao Paulo, Sao Carlos Sch Engn, Dept Elect & Comp Engn, Sao Paulo, Brazil
Funding
Sao Paulo Research Foundation (FAPESP), Brazil
Keywords
Robotic grasping; Visual servoing; Real-time; Deep learning; 7DoF robot; CAMERA
DOI
10.1016/j.robot.2021.103757
Chinese Library Classification
TP [Automation and computer technology]
Discipline code
0812
Abstract
Robots still cannot perform everyday manipulation tasks, such as grasping, with the same dexterity as humans. To explore the potential of supervised deep learning for robotic grasping in unstructured and dynamic environments, this work addresses the visual perception phase of the task: processing visual data to obtain the location of the object to be grasped, its pose, and the points at which the robot's gripper must make contact to ensure a stable grasp. For this, the Cornell Grasping Dataset (CGD) is used to train a Convolutional Neural Network (CNN) that handles these three stages simultaneously. In other words, given an image of the robot's workspace containing an object, the network predicts a grasp rectangle that encodes the position, orientation, and opening of the robot's parallel gripper the instant before closing. In addition to this network, which runs in real time, a second network is designed to handle situations in which the object moves in the environment. This second convolutional network is trained to perform visual servo control, ensuring that the object remains in the robot's field of view. It predicts the proportional linear and angular velocities the camera must have to keep the object in the image processed by the grasp network. The dataset used for training was automatically generated by a Kinova Gen3 robotic manipulator with seven Degrees of Freedom (DoF). The robot is also used to evaluate real-time applicability and obtain practical results from the designed algorithms. Moreover, the offline results obtained on test sets are analyzed and discussed with regard to efficiency and processing speed. The developed controller achieves millimeter accuracy in the final position for a target object seen for the first time.
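The grasp rectangle mentioned in the abstract is a common planar parameterization of a parallel-gripper grasp: a centre point, an orientation angle, and an opening width. The following is a minimal illustrative sketch of that representation; the function name and conventions are assumptions for exposition, not the paper's actual network outputs.

```python
import math

def rectangle_to_grasp(x, y, theta, width):
    """Convert a predicted grasp rectangle (centre, orientation, opening)
    into a planar parallel-gripper command. Names and conventions are
    illustrative only; the paper's CNN may use different outputs.
    """
    return {
        "centre": (x, y),           # grasp point in image coordinates
        "angle": theta % math.pi,   # a parallel gripper is symmetric under 180 deg
        "opening": width,           # gripper opening the instant before closing
    }
```

Reducing the angle modulo pi reflects the 180-degree symmetry of a two-fingered parallel gripper, a convention widely used with the Cornell Grasping Dataset.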
To the best of our knowledge, no other work in the literature achieves such precision with a controller learned from scratch. This work thus presents a new system for autonomous robotic manipulation that generalizes to different objects and runs at high processing speed, which allows its application in real robotic systems. (C) 2021 Elsevier B.V. All rights reserved.
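The visual servo network described in the abstract maps what the camera sees to proportional velocity commands that keep the object in view. A classical hand-tuned proportional law illustrates the idea the network learns; the gains and saturation limits below are assumptions for this sketch, not values from the paper (which replaces this mapping with a CNN trained on robot-generated data).

```python
def servo_command(err_px, err_py, gain=0.002, max_v=0.1):
    """Minimal proportional visual-servo sketch: map the object's pixel
    offset from the image centre to saturated camera linear velocities,
    driving the offset toward zero so the object stays in the field of
    view. Gains and limits are illustrative assumptions.
    """
    vx = max(-max_v, min(max_v, gain * err_px))  # lateral velocity (m/s)
    vy = max(-max_v, min(max_v, gain * err_py))  # vertical velocity (m/s)
    return vx, vy
```

Saturating the commands bounds the camera motion regardless of how far off-centre the object is, a standard safeguard in servo loops on real manipulators.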
Pages: 24