Object pose estimation method for robotic arm grasping

Cited by: 0
Authors
Huang C. [1 ]
Hou S. [1 ]
Affiliations
[1] Department of Automation, Harbin University of Science and Technology, Harbin, Heilongjiang, China
Source
Journal of Intelligent and Fuzzy Systems
Keywords
attention mechanism; convolutional neural network; feature fusion; planar grasp; pose estimation
DOI
10.3233/JIFS-234351
Abstract
To address target detection in planar grasping tasks, a position and attitude estimation method based on YOLO-Pose is proposed, which detects the three-dimensional position of the spacecraft's center point and its two-dimensional in-plane attitude in real time. First, the network weights are trained through transfer learning, and the number of key points is optimized by analyzing the shape characteristics of the spacecraft to improve the representation of pose information. Second, the CBAM (Convolutional Block Attention Module) mechanism, which combines channel and spatial attention, is integrated into the C3 module of the backbone network to improve the accuracy of pose estimation. Furthermore, the Wing loss function is used to mitigate random offsets in the predicted key points, and incorporating the bi-directional feature pyramid network (BiFPN) structure into the neck network further improves detection accuracy. Experimental results show that the average precision of the optimized algorithm is improved over the baseline, while its average detection speed still meets the speed and accuracy requirements of the actual grasping task, demonstrating practical application value. © 2024 – IOS Press. All rights reserved.
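As a concrete illustration of the key-point regression term mentioned in the abstract, the sketch below implements the standard Wing loss (Feng et al., CVPR 2018), the loss the abstract cites for suppressing random key-point offsets. This is a minimal sketch: the parameter values w and eps and the toy coordinates are illustrative assumptions, not the values tuned in this paper.

import numpy as np

def wing_loss(pred, target, w=10.0, eps=2.0):
    """Standard Wing loss for key-point regression, applied per coordinate.

    For small residuals (|x| < w) it behaves like a scaled logarithm, which
    amplifies the gradient on small and medium errors; for large residuals
    it falls back to L1. w and eps here are illustrative defaults.
    """
    x = np.abs(pred - target)
    # C shifts the linear branch so the two pieces join continuously at |x| = w.
    C = w - w * np.log(1.0 + w / eps)
    per_coord = np.where(x < w, w * np.log(1.0 + x / eps), x - C)
    return per_coord.mean()

# Toy usage: predicted vs. ground-truth key-point coordinates in pixels.
pred = np.array([[120.3, 64.1], [88.0, 92.5]])
target = np.array([[121.0, 63.0], [85.0, 95.0]])
print(wing_loss(pred, target))

In a YOLO-Pose-style pipeline, a loss of this form would typically replace the default key-point regression loss, leaving the detection and objectness losses unchanged.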
Pages: 10787 - 10803
Page count: 16
Related papers
50 records in total
  • [1] Pose estimation for autonomous grasping with a robotic arm system
    Chen, Chia-Hung
    Huang, Han-Pang
    JOURNAL OF THE CHINESE INSTITUTE OF ENGINEERS, 2013, 36 (05) : 638 - 646
  • [2] Pose estimation for autonomous robotic grasping
    Monroe, SE
    Rollins, JM
    Juday, RD
    HYBRID IMAGE AND SIGNAL PROCESSING VIII, 2002, 4735 : 22 - 32
  • [3] Object Pose Estimation for Robotic Grasping based on Multi-view Keypoint Detection
    Hu, Zheyuan
    Hou, Renluan
    Niu, Jianwei
    Yu, Xiaolong
    Ren, Tao
    Li, Qingfeng
    19TH IEEE INTERNATIONAL SYMPOSIUM ON PARALLEL AND DISTRIBUTED PROCESSING WITH APPLICATIONS (ISPA/BDCLOUD/SOCIALCOM/SUSTAINCOM 2021), 2021: 1295 - 1302
  • [4] Vision-based robotic grasping from object localization, object pose estimation to grasp estimation for parallel grippers: a review
    Du, Guoguang
    Wang, Kai
    Lian, Shiguo
    Zhao, Kaiyong
    ARTIFICIAL INTELLIGENCE REVIEW, 2021, 54 (03) : 1677 - 1734
  • [5] Robotic grasping method with 6D pose estimation and point cloud fusion
    Ma, Haofei
    Wang, Gongcheng
    Bai, Hua
    Xia, Zhiyu
    Wang, Weidong
    Du, Zhijiang
    INTERNATIONAL JOURNAL OF ADVANCED MANUFACTURING TECHNOLOGY, 2024: 5603 - 5613
  • [6] Brain Mechanisms for Robotic Object Pose Estimation
    Chinellato, Eris
    Grzyb, Beata J.
    del Pobil, Angel P.
    2008 IEEE INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, VOLS 1-8, 2008: 3268 - 3275
  • [7] Object Pose Estimation with Point Cloud Data for Robot Grasping
    Wu, Xingfang
    Qu, Weiming
    Zhang, Tao
    Luo, Dingsheng
    PROCEEDINGS OF 2022 IEEE INTERNATIONAL CONFERENCE ON MECHATRONICS AND AUTOMATION (IEEE ICMA 2022), 2022: 1069 - 1074
  • [8] A Practical Robotic Grasping Method by Using 6-D Pose Estimation With Protective Correction
    Zhang, Hui
    Liang, Zhicong
    Li, Chen
    Zhong, Hang
    Liu, Li
    Zhao, Chenyang
    Wang, Yaonan
    Wu, Q. M. Jonathan
    IEEE TRANSACTIONS ON INDUSTRIAL ELECTRONICS, 2022, 69 (04) : 3876 - 3886
  • [9] Deep instance segmentation and 6D object pose estimation in cluttered scenes for robotic autonomous grasping
    Wu, Yongxiang
    Fu, Yili
    Wang, Shuguo
    INDUSTRIAL ROBOT-THE INTERNATIONAL JOURNAL OF ROBOTICS RESEARCH AND APPLICATION, 2020, 47 (04) : 593 - 606