Object-Preserving Siamese Network for Single-Object Tracking on Point Clouds

Citations: 0
Authors
Zhao, Kaijie [1 ]
Zhao, Haitao [1 ]
Wang, Zhongze [1 ]
Peng, Jingchao [1 ]
Hu, Zhengwei [1 ]
Affiliations
[1] East China Univ Sci & Technol, Sch Informat Sci & Engn, Automat Dept, Shanghai 200237, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Three-dimensional displays; Search problems; Feature extraction; Point cloud compression; Location awareness; Task analysis; Deep learning; Object tracking; Siamese network; point clouds; autonomous driving; deep learning;
DOI
10.1109/TMM.2023.3306490
CLC Classification Number
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812;
Abstract
Undoubtedly, the object is the primary factor in 3D single-object tracking (SOT) tasks. However, prior Siamese-based trackers overlook the adverse effects resulting from randomly dropped object points during backbone sampling, hindering the prediction of accurate bounding boxes (BBoxes). Therefore, developing an approach that maximizes the preservation of object points and their object-aware features is of the utmost significance. To address this, we propose an object-preserving Siamese network (OPSNet) that can effectively maintain object integrity and boost tracking performance. First, an object-highlighting module amplifies the object-aware features and extracts discriminative features from the template and search area. Next, object-preserving sampling selects object candidates, obtains object-preserving search area seeds, and discards background points that have less impact on tracking. Finally, an object localization network accurately locates 3D BBoxes based on the object-preserving search area seeds. Extensive experiments demonstrate that the performance of OPSNet exceeds the state-of-the-art performance, achieving success gains of ~9.4% and ~2.5% on the KITTI and Waymo Open datasets, respectively.
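The object-preserving sampling step described in the abstract amounts to a score-driven top-k selection: keep the points most likely to belong to the object and discard low-scoring background points. The following is a minimal sketch of that idea only; the function name, the toy data, and the assumption that per-point object-awareness scores are already available (in OPSNet they come from the learned object-highlighting features) are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def object_preserving_sampling(points, scores, k):
    """Keep the k points with the highest object-awareness scores
    as search-area seeds, discarding likely-background points.

    points: (N, 3) array of search-area point coordinates
    scores: (N,) per-point object-awareness scores (assumed given here;
            in the paper these are produced by the network itself)
    k:      number of seeds to retain
    """
    # Sort indices by score in descending order and take the top k.
    top_idx = np.argsort(scores)[::-1][:k]
    return points[top_idx], scores[top_idx]

# Toy example: 6 points, of which 2 have high object-awareness scores.
pts = np.array([[0, 0, 0], [1, 0, 0], [2, 0, 0],
                [3, 0, 0], [4, 0, 0], [5, 0, 0]], dtype=float)
sc = np.array([0.1, 0.9, 0.2, 0.8, 0.05, 0.3])
seeds, seed_scores = object_preserving_sampling(pts, sc, k=2)
# seeds are the two points with scores 0.9 and 0.8
print(seeds)
```

Contrast this with the random sampling the abstract criticizes (e.g. uniform or farthest-point sampling in the backbone), which may drop exactly the sparse object points the tracker needs.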
Pages: 3007 - 3017
Page count: 11
Related Papers
50 records in total
  • [31] Learning Dynamic Siamese Network for Visual Object Tracking
    Guo, Qing
    Feng, Wei
    Zhou, Ce
    Huang, Rui
    Wan, Liang
    Wang, Song
    [J]. 2017 IEEE INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV), 2017, : 1781 - 1789
  • [32] Context-aware Siamese network for object tracking
    Zhang, Jianwei
    Wang, Jingchao
    Zhang, Huanlong
    Miao, Mengen
    Wu, Di
    [J]. IET IMAGE PROCESSING, 2023, 17 (01) : 215 - 226
  • [33] Advances in object tracking algorithm based on siamese network
    Jin G.
    Xue Y.
    Tan L.
    Xu J.
    [J]. Xi Tong Gong Cheng Yu Dian Zi Ji Shu/Systems Engineering and Electronics, 2022, 44 (06): : 1805 - 1822
  • [34] Learning to Match Using Siamese Network for Object Tracking
    Li, Chaopeng
    Lu, Hong
    Jiao, Jian
    Zhang, Wenqiang
    [J]. ADVANCES IN MULTIMEDIA INFORMATION PROCESSING, PT III, 2018, 11166 : 719 - 729
  • [35] EFTrack: A Lightweight Siamese Network for Aerial Object Tracking
    Zhang, Wenqi
    Yao, Yuan
    Liu, Xincheng
    Kou, Kai
    Yang, Gang
    [J]. 2023 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, ICRA, 2023, : 3275 - 3281
  • [36] SiamMN: Siamese modulation network for visual object tracking
    Fu, Li-hua
    Ding, Yu
    Du, Yu-bin
    Zhang, Bo
    Wang, Lu-yuan
    Wang, Dan
    [J]. MULTIMEDIA TOOLS AND APPLICATIONS, 2020, 79 (43-44) : 32623 - 32641
  • [37] Visual Object Tracking by Hierarchical Attention Siamese Network
    Shen, Jianbing
    Tang, Xin
    Dong, Xingping
    Shao, Ling
    [J]. IEEE TRANSACTIONS ON CYBERNETICS, 2020, 50 (07) : 3068 - 3080
  • [39] Siamese network with transformer and saliency encoder for object tracking
    Liu, Lei
    Kong, Guangqian
    Duan, Xun
    Long, Huiyun
    Wu, Yun
    [J]. APPLIED INTELLIGENCE, 2023, 53 (02) : 2265 - 2279
  • [40] Template-Refine Network for Siamese Object Tracking
    Lu, Xiaofeng
    Li, Gaoxiang
    Yan, Zhaoyu
    Teng, Lin
    [J]. IEEJ TRANSACTIONS ON ELECTRICAL AND ELECTRONIC ENGINEERING, 2024, 19 (10) : 1652 - 1660