Occlusion-related graph convolutional neural network for multi-object tracking

Cited by: 0
|
Authors
Zhang, Yubo [1 ]
Zheng, Liying [1 ]
Huang, Qingming [1 ,2 ]
Affiliations
[1] Harbin Engn Univ, Sch Comp Sci & Technol, Harbin 150001, Peoples R China
[2] Univ Chinese Acad Sci, Sch Comp Sci & Technol, Beijing 100019, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Multi-object tracking; Graph convolutional neural networks; Dense scene; Occlusion;
DOI
10.1016/j.imavis.2024.105317
Chinese Library Classification (CLC) number
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Multi-Object Tracking (MOT) has recently been improved by Graph Convolutional Neural Networks (GCNNs) owing to their strong ability to characterize interactive features. However, GCNNs assign a smaller proportion to a node's own features as its number of neighbors grows, which makes it difficult to distinguish objects with similar neighbors, a situation common in dense scenes. This paper designs an Occlusion-Related GCNN (OR-GCNN), on top of which an interactive similarity module is built. Specifically, the interactive similarity module first uses learnable weights to compute the edge weights between tracklets and detected objects, balancing appearance cosine similarity and Intersection over Union (IoU). The module then determines the proportion of node features with the help of an occlusion weight produced by a Multi-Layer Perceptron (MLP). These occlusion weights, the edge weights, and the node features are then fed into our OR-GCNN to obtain interactive features. Finally, by integrating the interactive similarity into a common MOT framework such as BoT-SORT, one obtains a tracker that effectively alleviates occlusion issues in dense MOT tasks. Experimental results on the MOT16 and MOT17 benchmarks show that our model achieves MOTA scores of 80.6 and 81.1 and HOTA scores of 65.3 and 65.1, respectively, outperforming state-of-the-art trackers including ByteTrack, BoT-SORT, GCNNMatch, GNMOT, and GSM.
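The abstract describes a three-step pipeline: learnable weights balance appearance cosine similarity against IoU to form tracklet-detection edge weights, an MLP predicts an occlusion weight that controls how much of each node's own feature is preserved, and the OR-GCNN aggregates neighbor features with these weights before matching. The PyTorch sketch below illustrates that idea under stated assumptions; the module name InteractiveSimilarity, the occlusion_mlp layer sizes, the sigmoid-balanced alpha coefficient, and the exact aggregation rule are hypothetical illustrations, not the authors' implementation.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F


    def box_iou(boxes_a, boxes_b):
        """Pairwise IoU between two sets of boxes in (x1, y1, x2, y2) format."""
        area_a = (boxes_a[:, 2] - boxes_a[:, 0]) * (boxes_a[:, 3] - boxes_a[:, 1])
        area_b = (boxes_b[:, 2] - boxes_b[:, 0]) * (boxes_b[:, 3] - boxes_b[:, 1])
        lt = torch.max(boxes_a[:, None, :2], boxes_b[None, :, :2])  # intersection top-left
        rb = torch.min(boxes_a[:, None, 2:], boxes_b[None, :, 2:])  # intersection bottom-right
        wh = (rb - lt).clamp(min=0)
        inter = wh[..., 0] * wh[..., 1]
        return inter / (area_a[:, None] + area_b[None, :] - inter + 1e-6)


    class InteractiveSimilarity(nn.Module):
        # Hypothetical sketch: edge weights balance cosine similarity and IoU with a
        # learnable coefficient; an MLP predicts per-node occlusion weights deciding
        # how much of a node's own feature is kept during aggregation.
        def __init__(self, feat_dim=128):
            super().__init__()
            self.alpha = nn.Parameter(torch.tensor(0.0))  # learnable balance (sigmoid -> 0.5 at init)
            self.occlusion_mlp = nn.Sequential(
                nn.Linear(feat_dim, 64), nn.ReLU(), nn.Linear(64, 1), nn.Sigmoid()
            )
            self.project = nn.Linear(feat_dim, feat_dim)  # neighbor-message transformation

        def forward(self, trk_feat, trk_box, det_feat, det_box):
            # Edge weights between tracklets (rows) and detections (columns).
            cos = F.normalize(trk_feat, dim=1) @ F.normalize(det_feat, dim=1).t()
            iou = box_iou(trk_box, det_box)
            a = torch.sigmoid(self.alpha)
            edge_w = (a * cos + (1.0 - a) * iou).clamp(min=0.0)

            # Occlusion weights in (0, 1): how much of each node's own feature to keep,
            # so that many neighbors cannot dilute an occluded node's identity.
            occ_trk = self.occlusion_mlp(trk_feat)  # (num_trk, 1)
            occ_det = self.occlusion_mlp(det_feat)  # (num_det, 1)

            # Occlusion-related graph convolution: self term scaled by the occlusion
            # weight, neighbor term averaged with the learned edge weights.
            w_td = edge_w / (edge_w.sum(dim=1, keepdim=True) + 1e-6)
            w_dt = edge_w.t() / (edge_w.t().sum(dim=1, keepdim=True) + 1e-6)
            trk_inter = occ_trk * trk_feat + (1 - occ_trk) * self.project(w_td @ det_feat)
            det_inter = occ_det * det_feat + (1 - occ_det) * self.project(w_dt @ trk_feat)

            # Association scores from the interactive features, e.g. for the matching
            # stage of a BoT-SORT-style tracker.
            return F.normalize(trk_inter, dim=1) @ F.normalize(det_inter, dim=1).t()

As a usage sketch, with trk_feat of shape (N, 128), det_feat of shape (M, 128), and boxes of shapes (N, 4) and (M, 4), the module returns an (N, M) score matrix that could augment the IoU/appearance cost used in the data-association step of a BoT-SORT-style tracker.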
Pages: 12