Graph-based Pose Estimation of Texture-less Surgical Tools for Autonomous Robot Control

Cited by: 5
Authors
Xu, Haozheng [1]
Runciman, Mark [1]
Cartucho, Joao [1]
Xu, Chi [1]
Giannarou, Stamatia [1]
Affiliations
[1] Imperial Coll London, Dept Surg & Canc, Hamlyn Ctr Robot Surg, London, England
Funding
UK Engineering and Physical Sciences Research Council (EPSRC)
DOI
10.1109/ICRA48891.2023.10160287
CLC Number
TP [Automation technology, computer technology]
Discipline Code
0812
Abstract
In Robot-assisted Minimally Invasive Surgery (RMIS), the estimation of the pose of surgical tools is crucial for applications such as surgical navigation, visual servoing, autonomous robotic task execution and augmented reality. A plethora of hardware-based and vision-based methods have been proposed in the literature. However, direct application of these methods to RMIS has significant limitations due to partial tool visibility, occlusions and changes in the surgical scene. In this work, a novel keypoint-graph-based network is proposed to estimate the pose of texture-less cylindrical surgical tools of small diameter. To deal with the challenges in RMIS, a keypoint-based object representation is used and, for the first time, temporal information is combined with spatial information in the keypoint graph representation for keypoint refinement. Finally, a stable and accurate tool pose is computed using a PnP solver. Our performance evaluation study has shown that the proposed method is able to accurately predict the pose of a texture-less robotic shaft with an ADD-S score of over 98%. The method outperforms state-of-the-art pose estimation models under challenging conditions such as object occlusion and changes in the lighting of the scene.
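The abstract reports accuracy with the ADD-S score, the symmetric variant of the average-distance metric commonly used to evaluate 6D pose estimates of rotationally symmetric objects such as a cylindrical shaft: instead of pairing corresponding model points, each point under the predicted pose is matched to the closest point under the ground-truth pose. A minimal numpy sketch of both metrics follows; the cylinder model points and the poses here are made up purely for illustration and are not from the paper:

```python
import numpy as np

def add_metric(model_pts, R_pred, t_pred, R_gt, t_gt):
    # ADD: mean distance between CORRESPONDING model points under the
    # predicted and ground-truth poses.
    pred = model_pts @ R_pred.T + t_pred
    gt = model_pts @ R_gt.T + t_gt
    return np.linalg.norm(pred - gt, axis=-1).mean()

def add_s_metric(model_pts, R_pred, t_pred, R_gt, t_gt):
    # ADD-S: mean distance from each predicted point to the CLOSEST
    # ground-truth point, which discounts rotations about a symmetry axis.
    pred = model_pts @ R_pred.T + t_pred
    gt = model_pts @ R_gt.T + t_gt
    d = np.linalg.norm(pred[:, None, :] - gt[None, :, :], axis=-1)
    return d.min(axis=1).mean()

# Hypothetical model: rings of points sampled on a unit-radius cylinder.
theta = np.deg2rad(np.arange(360))
ring = np.stack([np.cos(theta), np.sin(theta)], axis=1)
pts = np.concatenate(
    [np.hstack([ring, np.full((360, 1), z)]) for z in (0.0, 0.5, 1.0)])

# A 10-degree rotation about the cylinder's own axis leaves the tool
# visually unchanged; ADD penalises it, ADD-S correctly does not.
a = np.deg2rad(10.0)
Rz = np.array([[np.cos(a), -np.sin(a), 0.0],
               [np.sin(a),  np.cos(a), 0.0],
               [0.0,        0.0,       1.0]])
t0 = np.zeros(3)
print(add_metric(pts, Rz, t0, np.eye(3), t0))    # nonzero
print(add_s_metric(pts, Rz, t0, np.eye(3), t0))  # ~0
```

In benchmarks such as this, a pose is typically counted as correct when its ADD-S distance falls below a fraction (often 10%) of the object diameter, and the reported score is the fraction of frames passing that test.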
Pages: 2731-2737 (7 pages)
Related Papers (showing [31]-[40] of 50)
  • [31] Contour model based homography estimation of texture-less planar objects in uncalibrated images
    Zhang, Yueqiang
    Zhou, Langming
    Shang, Yang
    Zhang, Xiaohu
    Yu, Qifeng
    PATTERN RECOGNITION, 2016, 52 : 375 - 383
  • [32] A graph-based exploration strategy of indoor environments by an autonomous mobile robot
    Hsu, JYJ
    Hwang, LS
    1998 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, VOLS 1-4, 1998, : 1262 - 1268
  • [33] Graph-Based Inverse Optimal Control for Robot Manipulation
    Byravan, Arunkumar
    Monfort, Mathew
    Ziebart, Brian
    Boots, Byron
    Fox, Dieter
    PROCEEDINGS OF THE TWENTY-FOURTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE (IJCAI), 2015, : 1874 - 1880
  • [34] G-GOP: Generative Pose Estimation of Reflective Texture-Less Metal Parts With Global-Observation-Point Priors
    He, Zaixing
    Chao, Yue
    Wu, Mengtian
    Hu, Yilong
    Zhao, Xinyue
    IEEE-ASME TRANSACTIONS ON MECHATRONICS, 2024, 29 (01) : 154 - 165
  • [35] IPPE-PCR: a novel 6D pose estimation method based on point cloud repair for texture-less and occluded industrial parts
    Qin, Wei
    Hu, Qing
    Zhuang, Zilong
    Huang, Haozhe
    Zhu, Xiaodan
    Han, Lin
    JOURNAL OF INTELLIGENT MANUFACTURING, 2023, 34 (06) : 2797 - 2807
  • [37] AHPPEBot: Autonomous Robot for Tomato Harvesting based on Phenotyping and Pose Estimation
    Li, Xingxu
    Ma, Nan
    Han, Yiheng
    Yang, Shun
    Zheng, Siyi
    2024 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA 2024), 2024, : 18150 - 18156
  • [38] 3D Hand Pose Estimation via Graph-Based Reasoning
    Song, Jae-Hun
    Kang, Suk-Ju
    IEEE ACCESS, 2021, 9 : 35824 - 35833
  • [39] CAD-BASED VIEWPOINT ESTIMATION OF TEXTURE-LESS OBJECT FOR PURPOSIVE PERCEPTION USING DOMAIN ADAPTATION
    Gu, Changjian
    Gu, Chaochen
    Wu, Kaijie
    Zhang, Liangjun
    Guan, Xinping
    INTERNATIONAL JOURNAL OF ROBOTICS & AUTOMATION, 2019, 34 (06): : 599 - 609
  • [40] Human-to-Robot Handover Control of an Autonomous Mobile Robot Based on Hand-Masked Object Pose Estimation
    Huang, Yu-Yun
    Song, Kai-Tai
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2024, 9 (09): : 7851 - 7858