Real-time 3D work-piece tracking with monocular camera based on static and dynamic model libraries

Cited by: 4
Authors
Zhu, Wenjun [1 ]
Wang, Peng [1 ]
Li, Rui [1 ]
Nie, Xiangli [1 ]
Affiliations
[1] Chinese Acad Sci, Inst Automat, Beijing, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
3D tracking; Automation assembly; EFFICIENT;
DOI
10.1108/AA-02-2017-018
CLC number
TP [Automation and computer technology]
Subject classification code
0812
Abstract
Purpose - This paper proposes a novel real-time three-dimensional (3D) model-based work-piece tracking method with a monocular camera for high-precision assembly. Real-time tracking of 3D work-pieces is increasingly important for industrial tasks such as work-piece grasping and assembly, especially in complex environments.
Design/methodology/approach - A three-step method is proposed: an offline static global library generation process, an online dynamic local library updating and selection process, and a 3D work-piece localization process. In the offline process, computer-aided design (CAD) models of the work-piece are used to generate a set of discrete two-dimensional (2D) hierarchical view matching libraries. In the online process, the previous 3D location of the work-piece is used to predict its next location range, and a matching library containing a small number of 2D hierarchical views is selected from the dynamic local library. The work-piece is then localized with high precision at real-time speed in the 3D localization process.
Findings - The method is suitable for texture-less work-pieces in industrial applications.
Originality/value - The small size of the selected library enables real-time matching. Experimental results demonstrate the high accuracy and efficiency of the proposed method.
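The three-step pipeline in the abstract can be sketched in outline form. This is a minimal illustrative Python sketch, not the authors' implementation: the function names are assumptions, each "view" is reduced to its (azimuth, elevation) pose label, and nearest-pose distance stands in for the paper's hierarchical 2D view matching.

```python
import math

# Hypothetical sketch of the three-step pipeline described in the abstract.
# All names and data layouts are illustrative assumptions.

def build_static_library(step_deg=10):
    """Offline step: discretize the viewing sphere into template poses.

    A real system would render a 2D view of the CAD model for each
    pose; here each 'view' is just its (azimuth, elevation) label.
    """
    return [(az, el)
            for az in range(0, 360, step_deg)
            for el in range(-90, 91, step_deg)]

def select_local_library(static_lib, predicted_pose, radius_deg=20):
    """Online step: keep only templates near the pose predicted from
    the previous frame, shrinking the matching library."""
    paz, pel = predicted_pose
    return [(az, el) for az, el in static_lib
            if abs(az - paz) <= radius_deg and abs(el - pel) <= radius_deg]

def localize(local_lib, observed_pose):
    """Localization step: pick the template closest to the observation
    (a stand-in for the paper's hierarchical 2D view matching)."""
    return min(local_lib,
               key=lambda p: math.hypot(p[0] - observed_pose[0],
                                        p[1] - observed_pose[1]))
```

The point of the structure is that `localize` only searches the small local subset, which is what makes per-frame matching real-time; the full static library is consulted only to build that subset.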
Pages: 219-229 (11 pages)
Related papers (50 records)
  • [1] Real-time 3D Model-based Tracking of Work-piece with Monocular Camera
    Zhu, Wenjun
    Wang, Peng
    Li, Fudong
    Su, Jianhua
    Qiao, Hong
    [J]. 2015 IEEE/SICE INTERNATIONAL SYMPOSIUM ON SYSTEM INTEGRATION (SII), 2015, : 777 - 782
  • [2] Real-Time 3D Pedestrian Tracking with Monocular Camera
    Xiao, Peng
    Yan, Fei
    Chi, Jiannan
    Wang, Zhiliang
    [J]. WIRELESS COMMUNICATIONS & MOBILE COMPUTING, 2022, 2022
  • [3] Real Time 3D Facial Movement Tracking Using a Monocular Camera
    Dong, Yanchao
    Wang, Yanming
    Yue, Jiguang
    Hu, Zhencheng
    [J]. SENSORS, 2016, 16 (08)
  • [4] Real-time Monocular 3D People Localization and Tracking on Embedded System
    Zhu, Yipeng
    Wang, Tao
    Zhu, Shiqiang
    [J]. 2021 6TH IEEE INTERNATIONAL CONFERENCE ON ADVANCED ROBOTICS AND MECHATRONICS (ICARM 2021), 2021, : 797 - 802
  • [5] REAL-TIME TRACKING WITH AN EMBEDDED 3D CAMERA WITH FPGA PROCESSING
    Muscoloni, Alessandro
    Mattoccia, Stefano
    [J]. 2014 INTERNATIONAL CONFERENCE ON 3D IMAGING (IC3D), 2014,
  • [6] Robust, Real-Time 3D Face Tracking from a Monocular View
    Liao, Wei-Kai
    Fidaleo, Douglas
    Medioni, Gerard
    [J]. EURASIP JOURNAL ON IMAGE AND VIDEO PROCESSING, 2010,
  • [8] Real-Time 3D Reconstruction Method Based on Monocular Vision
    Jia, Qingyu
    Chang, Liang
    Qiang, Baohua
    Zhang, Shihao
    Xie, Wu
    Yang, Xianyi
    Sun, Yangchang
    Yang, Minghao
    [J]. SENSORS, 2021, 21 (17)
  • [9] Real-Time 3D Object Detection and Tracking in Monocular Images of Cluttered Environment
    Du, Guoguang
    Wang, Kai
    Nan, Yibing
    Lian, Shiguo
    [J]. IMAGE AND GRAPHICS, ICIG 2019, PT II, 2019, 11902 : 119 - 130
  • [10] GANerated Hands for Real-Time 3D Hand Tracking from Monocular RGB
    Mueller, Franziska
    Bernard, Florian
    Sotnychenko, Oleksandr
    Mehta, Dushyant
    Sridhar, Srinath
    Casas, Dan
    Theobalt, Christian
    [J]. 2018 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2018, : 49 - 59