Self-Initialization and Recovery for Uninterrupted Tracking in Vision-Guided Micromanipulation

Cited: 0
Authors
Yang, Liangjing [1 ]
Paranawithana, Ishara [2 ]
Youcef-Toumi, Kamal [1 ]
Tan, U-Xuan [2 ]
Affiliations
[1] MIT, Dept Mech Engn, Cambridge, MA 02139 USA
[2] Singapore Univ Technol & Design, Pillar Engn Prod Dev, Singapore, Singapore
Keywords
CALIBRATION; MICROSCOPE
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
In this paper, we propose a workflow algorithm for timely tracking of the tool tip during cell manipulation, using a template-based approach augmented with low-level feature detection. This addresses the adverse influences of tool-cell interaction on template-based tracking while maintaining an efficient track-servo framework, a consideration that is important in developing autonomous robotic vision-guided micromanipulators. Our method enables autonomous vision-guided micromanipulation without manual intervention, even during tool-cell interaction. This is done by decomposing the process into four scenarios, each operating in its respective mode. The self-initializing mode is first used to localize and focus a region of interest (ROI) in which the tip lies. Once in focus, the tip is manipulated using a unified visual track-servo template-based approach. A reinitialization mechanism is triggered to prevent tracking from being interrupted by partial cell occlusion of the tracking ROI; this mechanism combines the self-initializing concept with motion cues and low-level feature detection to localize the needle tip. Following reinitialization, tracking of the needle tip is recovered by a mechanism that updates the base template. This adaptive approach ensures uninterrupted tracking even when the cell is interacting with the tool and undergoing deformation. Results demonstrate that, with the newly incorporated mechanisms, the localization error improved from more than 50% to less than 10% of the specimen size. When there is no specimen in the scene, the new workflow shows no adverse effect on localization across 270 tracked frames. By incorporating reinitialization and recovery into this workflow algorithm, we hope to take a first step towards an uncalibrated, autonomous vision-guided micromanipulation process.
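The four-scenario workflow described in the abstract can be sketched as a small state machine. This is an illustrative reading of the abstract only, not the authors' implementation: the mode names and the boolean triggers (`tip_in_focus`, `occlusion_detected`, `template_updated`) are assumed stand-ins for the paper's focus, occlusion, and template-update checks.

```python
from enum import Enum, auto

class Mode(Enum):
    SELF_INIT = auto()    # localize and focus the ROI in which the tip lies
    TRACK_SERVO = auto()  # unified template-based visual track-servo
    REINIT = auto()       # motion cue + low-level features relocalize the tip
    RECOVERY = auto()     # update the base template, then resume tracking

def next_mode(mode, tip_in_focus, occlusion_detected, template_updated):
    """One transition of the hypothesized four-mode workflow.

    The three boolean inputs are hypothetical detection results; the
    paper's actual switching criteria are not given in the abstract.
    """
    if mode is Mode.SELF_INIT:
        # Stay in self-initialization until the tip ROI is in focus.
        return Mode.TRACK_SERVO if tip_in_focus else Mode.SELF_INIT
    if mode is Mode.TRACK_SERVO:
        # Partial cell occlusion of the tracking ROI triggers reinitialization.
        return Mode.REINIT if occlusion_detected else Mode.TRACK_SERVO
    if mode is Mode.REINIT:
        # After relocalizing the needle tip, recover via a template update.
        return Mode.RECOVERY
    # RECOVERY: resume track-servo once the base template is refreshed.
    return Mode.TRACK_SERVO if template_updated else Mode.RECOVERY
```

Under this reading, an occlusion during tracking cycles the system through REINIT and RECOVERY and back to TRACK_SERVO, which is how tracking continues uninterrupted through tool-cell interaction.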
Pages: 1127-1133 (7 pages)