Rethinking Bipartite Graph Matching in Realtime Multi-object Tracking

Cited: 0
Authors
Zou, Zhuojun [1 ,2 ]
Hao, Jie [1 ,3 ]
Shu, Lin [1 ,3 ]
Affiliations
[1] Chinese Acad Sci, Inst Automat, Beijing, Peoples R China
[2] Univ Chinese Acad Sci, Sch Artificial Intelligence, Beijing, Peoples R China
[3] Guangdong Inst Artificial Intelligence & Adv Comp, Guangzhou, Peoples R China
Keywords
Bipartite Graph Matching; Multi-object Tracking; Tracking-by-detection;
DOI
10.1109/CACML55074.2022.00124
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
Data association is a crucial part of the tracking-by-detection framework. Although many works on constructing the matching cost between trajectories and detections have been proposed in the community, few researchers pay attention to improving the efficiency of bipartite graph matching in realtime multi-object tracking. In this paper, we start from the optimal solution of integer linear programming, explore the best application of bipartite graph matching to the tracking task, and simultaneously evaluate the rationality of the cost matrix. First, we analyze the defects of the bipartite graph matching process in some multi-object tracking methods and establish a criterion for the similarity measure between trajectories and detections. Then we design two weight matrices for multi-object tracking by applying our criterion. In addition, a novel tracking process is proposed to handle the visual-information-free scenario. Our method improves the accuracy of the graph-matching-based approach at a very fast running speed (3000+ FPS). Comprehensive experiments on MOT benchmarks demonstrate that the proposed approach achieves state-of-the-art performance among methods without visual information. Moreover, the efficient matching process can also be assembled into approaches with appearance information to replace cascade matching.
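The abstract describes data association as a bipartite matching between existing trajectories and new detections under a cost matrix. As a minimal illustration of that general setup (not the paper's specific weight matrices or criterion), the sketch below builds a cost matrix from 1 − IoU of bounding boxes, solves the assignment by brute-force enumeration, and rejects matches below a gating threshold; the function names and the `gate` parameter are illustrative assumptions.

```python
from itertools import permutations

def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def associate(tracks, dets, gate=0.3):
    """Min-cost bipartite matching with cost = 1 - IoU.

    Brute-force enumeration for clarity; real trackers use the
    Hungarian algorithm (O(n^3)) instead. Requires
    len(tracks) <= len(dets). Matched pairs whose IoU falls
    below `gate` are rejected after assignment.
    """
    cost = [[1.0 - iou(t, d) for d in dets] for t in tracks]
    best, best_cost = None, float("inf")
    for perm in permutations(range(len(dets)), len(tracks)):
        c = sum(cost[i][j] for i, j in enumerate(perm))
        if c < best_cost:
            best, best_cost = perm, c
    return [(i, j) for i, j in enumerate(best) if cost[i][j] <= 1.0 - gate]

tracks = [(0, 0, 10, 10), (20, 20, 30, 30)]
dets = [(21, 21, 31, 31), (1, 1, 11, 11)]
print(associate(tracks, dets))  # → [(0, 1), (1, 0)]
```

Each returned pair `(i, j)` assigns track `i` to detection `j`; unmatched detections would spawn new tracks and unmatched tracks would age out, which is the standard tracking-by-detection loop the paper's matching step plugs into.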
Pages: 713-718
Page count: 6