Pose Estimation of Robot End-Effector using a CNN-Based Cascade Estimator

Citations: 0
Authors
Ortega, Kevin D. [1 ]
Sepulveda, Jorge I. [1 ]
Hernandez, Byron [2 ]
Holguin, German A. [1 ,3 ]
Medeiros, Henry [2 ]
Affiliations
[1] Univ Tecnol Pereira, Dept Elect Engn, Pereira, Colombia
[2] Univ Florida, Dept Ag & Bio Engn, Gainesville, FL USA
[3] Marquette Univ, Dept Elect & Comp Engn, Milwaukee, WI 53233 USA
Keywords
Computer Vision; Pose Estimation; Robotic Manipulation; Neural Networks; Industry 4.0
DOI
10.1109/CCAC58200.2023.10333441
CLC Number
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
Robotics has become an essential component of virtually every industry sector, including modern assembly, manufacturing, agricultural processes, and even retail operations. Ensuring the safety of human collaborators working alongside robots is of utmost importance, and accurate estimation of the robot's end-effector pose is critical for achieving this goal. In this paper, we present a method to estimate the end-effector pose of an industrial serial manipulator without relying on the robot's encoders. Our method uses depth cameras in the robot workspace, whose images feed a convolutional neural network and a cascade estimator that determine the 3D coordinates of every joint sequentially. We studied three variations of the method: the first estimates all joints independently, the second estimates each joint from the estimate of the immediately preceding joint, and the third conditions each estimate on all previously estimated joints. Our experimental setup employed a UR5 6-DOF robot arm in the ROS-Gazebo simulation ecosystem. The results show that the third variation performs best, demonstrating the effectiveness of this methodology in predicting the pose of a manipulator using only computer vision. This approach is an enabling technology for many Industry 4.0 applications, improving both human safety and manufacturing efficiency.
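The record itself contains no code, so the following is only a minimal sketch, under stated assumptions, of the cascade idea described in the abstract: a PyTorch rendering of the third variation, in which each joint's 3D position is regressed from depth-image features together with all previously estimated joints. The class names (FeatureBackbone, JointHead, CascadeEstimator), layer sizes, backbone architecture, and input resolution are illustrative assumptions, not the authors' implementation.

# Minimal sketch (assumption-based, not the authors' code): cascade estimator
# corresponding to the third variation described in the abstract.
import torch
import torch.nn as nn


class FeatureBackbone(nn.Module):
    # Small CNN that maps a single-channel depth image to a flat feature vector.
    def __init__(self, feat_dim=128):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4),
        )
        self.fc = nn.Linear(32 * 4 * 4, feat_dim)

    def forward(self, depth):                      # depth: (B, 1, H, W)
        return self.fc(self.conv(depth).flatten(1))


class JointHead(nn.Module):
    # Regresses one joint's (x, y, z) from image features plus prior joint estimates.
    def __init__(self, feat_dim, num_prev_joints):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(feat_dim + 3 * num_prev_joints, 64), nn.ReLU(),
            nn.Linear(64, 3),
        )

    def forward(self, feats, prev_joints):         # prev_joints: (B, 3*k), k may be 0
        return self.mlp(torch.cat([feats, prev_joints], dim=1))


class CascadeEstimator(nn.Module):
    # Estimates all joint positions sequentially; each head sees every earlier estimate.
    def __init__(self, num_joints=6, feat_dim=128):
        super().__init__()
        self.backbone = FeatureBackbone(feat_dim)
        self.heads = nn.ModuleList(JointHead(feat_dim, k) for k in range(num_joints))

    def forward(self, depth):
        feats = self.backbone(depth)
        prev = depth.new_zeros(depth.shape[0], 0)  # no prior joints for the first head
        joints = []
        for head in self.heads:
            xyz = head(feats, prev)                # (B, 3) estimate for this joint
            joints.append(xyz)
            prev = torch.cat([prev, xyz], dim=1)   # pass ALL earlier estimates forward
        return torch.stack(joints, dim=1)          # (B, num_joints, 3)


# Example: four 128x128 depth frames -> a (4, 6, 3) tensor of joint coordinates.
poses = CascadeEstimator()(torch.rand(4, 1, 128, 128))

Dropping the prev input would reduce this to the first variation (independent joints), while passing only the most recent xyz would correspond to the second; per the abstract, the cascade over all previous joints performed best.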
Pages: 85-90
Page count: 6
Related Papers
50 records in total
  • [1] Single Image based Camera Calibration and Pose Estimation of the End-effector of a Robot
    Boby, R. A.
    Saha, S. K.
    [J]. 2016 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA), 2016, : 2435 - 2440
  • [2] Real-Time Robot End-Effector Pose Estimation with Deep Network
    Cheng, Hu
    Wang, Yingying
    Meng, Max Q-H
    [J]. 2020 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2020, : 10921 - 10926
  • [3] Pose Planning for the End-effector of Robot in the Welding of Intersecting Pipes
    Liu Yu
    Zhao Jing
    Lu Zhenyang
    Chen Shujun
    [J]. CHINESE JOURNAL OF MECHANICAL ENGINEERING, 2011, 24 (02) : 264 - 270
  • [5] Modeling of an external force estimator for an end-effector of a robot by neural networks
    Junplod, Goragod
    Kornmaneesang, Woraphrut
    Chen, Shyh-Leh
    Wongsa, Sarawan
    [J]. JOURNAL OF THE CHINESE INSTITUTE OF ENGINEERS, 2023, 46 (08) : 895 - 904
  • [6] Set-point control of robot end-effector pose using dual quaternion feedback
    Pham, H. L.
    Adorno, B. V.
    Perdereau, V.
    Fraisse, P.
    [J]. ROBOTICS AND COMPUTER-INTEGRATED MANUFACTURING, 2018, 52 : 100 - 110
  • [7] Pose estimation method for a simultaneous three-fingered end-effector
    Fan S.
    Wu J.
    Jin M.
    Fan C.
    Liu H.
    [J]. Harbin Gongcheng Daxue Xuebao/Journal of Harbin Engineering University, 2019, 40 (02): 359 - 364
  • [9] Robot End-Effector Mounted Camera Pose Optimization in Object Detection-Based Tasks
    Roveda, Loris
    Maroni, Marco
    Mazzuchelli, Lorenzo
    Praolini, Loris
    Shahid, Asad Ali
    Bucca, Giuseppe
    Piga, Dario
    [J]. JOURNAL OF INTELLIGENT & ROBOTIC SYSTEMS, 2022, 104 (01)
  • [10] Handling Object Symmetries in CNN-based Pose Estimation
    Richter-Klug, Jesse
    Frese, Udo
    [J]. 2021 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA 2021), 2021, : 13850 - 13856