Multiple Object Tracking for Occluded Particles

Cited by: 0
Authors
Qian, Yifei [1 ]
Ji, Ru [1 ]
Duan, Yuping [1 ]
Yang, Runhuai [1 ]
Affiliation
[1] Anhui Med Univ, Dept Biomed Engn, Hefei 230022, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Target tracking; Filtering algorithms; Detection algorithms; Image edge detection; Fluorescence; Uncertainty; Morphological operations; Microscopic image; multiple particles tracking; target occlusion; global data association;
DOI
10.1109/ACCESS.2020.3047099
CLC Number
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
The precise detection and tracking of multiple particles under a microscope are of great significance for research on the individual and cluster behavior of dynamic bacteria and subcellular structures. However, existing detection algorithms cannot separate occluded particles from each other, and most tracking algorithms that address occlusion involve several uncertainties. In this paper, a two-step detection algorithm based on threshold segmentation and the morphological opening operation is developed to identify non-fluorescently labeled particles under a microscope; it can separate targets that are in slight contact with each other. Moreover, we propose a novel association algorithm that exploits the strengths of both the global shortest-path algorithm and the Hungarian algorithm, updates online in real time, and accounts for occlusion among particles. By utilizing multi-frame information, the proposed approach achieves a temporally optimal match and a spatially optimal solution. The method can track occluded particles and outperforms the global shortest-path algorithm and the Hungarian algorithm used individually. It was successfully applied to six real image sequences with a maximum of 23 to 55 particles per frame, as well as to a synthetic sequence and a fluorescently labeled sequence. The results of the comparison experiments demonstrate that the proposed algorithm is practical and achieves real-time tracking.
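To make the pipeline described in the abstract concrete, here is a minimal Python sketch of its two stages, using scikit-image and SciPy. The function names (detect_particles, associate), the Otsu threshold, the structuring-element radius, and the gating distance are illustrative assumptions rather than the authors' settings, and the paper's global shortest-path, multi-frame component is not reproduced; only the threshold-plus-opening detection and a per-frame Hungarian assignment are shown.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment
from skimage import filters, measure, morphology


def detect_particles(frame, open_radius=2, min_area=5):
    """Two-step detection sketch: threshold segmentation, then a
    morphological opening to split slightly touching particles,
    then connected-component labeling to extract centroids."""
    mask = frame > filters.threshold_otsu(frame)                   # step 1: threshold segmentation
    mask = morphology.opening(mask, morphology.disk(open_radius))  # step 2: morphological opening
    labels = measure.label(mask)
    return np.array([r.centroid for r in measure.regionprops(labels)
                     if r.area >= min_area])                       # (N, 2) row/col centroids


def associate(prev_pts, curr_pts, max_dist=15.0):
    """Frame-to-frame data association via the Hungarian algorithm on a
    Euclidean-distance cost matrix; pairs beyond the gate are rejected,
    so an occluded particle simply stays unmatched in that frame."""
    if len(prev_pts) == 0 or len(curr_pts) == 0:
        return []
    cost = np.linalg.norm(prev_pts[:, None, :] - curr_pts[None, :, :], axis=2)
    rows, cols = linear_sum_assignment(cost)
    return [(r, c) for r, c in zip(rows, cols) if cost[r, c] <= max_dist]
```

In this sketch, detections left unmatched by the gated assignment (for example, a particle hidden by occlusion) would have to be handled by the multi-frame, global shortest-path stage that the paper describes; the code above only illustrates the per-frame building blocks.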
Pages: 1524-1532
Number of pages: 9