Towards Real-Time On-Drone Pedestrian Tracking in 4K Inputs

Cited: 2
Authors
Oh, Chanyoung [1 ]
Lee, Moonsoo [2 ]
Lim, Chaedeok [2 ]
Affiliations
[1] Kongju Natl Univ, Dept Software, Cheonan 31080, South Korea
[2] Elect & Telecommun Res Inst, Air Mobil Res Div, Daejeon 34129, South Korea
Funding
National Research Foundation of Singapore
Keywords
on-device deep learning; unmanned aerial vehicle (UAV); drone; real-time object tracking
DOI
10.3390/drones7100623
Chinese Library Classification (CLC)
TP7 [Remote Sensing Technology]
Discipline Codes
081102; 0816; 081602; 083002; 1404
Abstract
Over the past several years, significant progress has been made in object tracking, but tracking objects in high-resolution images captured from drones remains challenging. Such images usually contain very small objects, and the drone's movement causes rapid scene changes. In addition, the computing power of on-drone mission computers is often insufficient for real-time deep learning-based object tracking. This paper presents a real-time on-drone pedestrian tracker that takes 4K aerial images as input. The proposed tracker effectively hides the long latency of deep learning-based detection (e.g., YOLO) by exploiting both the CPU and the GPU of the mission computer. We also propose techniques to minimize detection loss in drone-captured images, including tracker-assisted confidence boosting and an ensemble for identity association. In experiments on real-world inputs captured by drones at a height of 50 m, the proposed method, running on an NVIDIA Jetson TX2, demonstrates its efficacy by achieving real-time detection and tracking on 4K video streams.
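
The latency-hiding idea described in the abstract (a lightweight tracker running on the CPU every frame while the GPU-bound detector works asynchronously, with tracks refreshed whenever a detection finishes) can be sketched as follows. This is a minimal illustration built on Python's standard threading and queue modules, not the paper's implementation; run_detector, update_tracks, and reinit_tracks are hypothetical stubs standing in for a YOLO-style detector and a lightweight CPU tracker.

import queue
import threading
import time

# All three functions below are hypothetical stubs for illustration only;
# they stand in for a GPU-bound detector (e.g., a YOLO model) and a
# lightweight CPU tracker, not the paper's actual components.

def run_detector(frame):
    """Simulate a slow, GPU-bound detection call."""
    time.sleep(0.25)                       # pretend detection latency
    return [("person", (100, 100, 40, 80))]

def update_tracks(tracks, frame):
    """Simulate a fast per-frame CPU tracking step."""
    return tracks                          # dummy: boxes unchanged

def reinit_tracks(tracks, detections):
    """Simulate fusing fresh detections into the current track set."""
    return [box for _, box in detections]

def detector_worker(frame_q, det_q):
    """Background thread: run detection on whichever frame arrives."""
    while True:
        frame = frame_q.get()
        if frame is None:                  # sentinel -> shut down
            break
        det_q.put(run_detector(frame))

def track_stream(frames):
    frame_q = queue.Queue(maxsize=1)       # at most one frame pending detection
    det_q = queue.Queue()
    threading.Thread(target=detector_worker,
                     args=(frame_q, det_q), daemon=True).start()

    tracks = []
    for frame in frames:
        if frame_q.empty():                # feed the detector only when it is idle
            frame_q.put(frame)
        tracks = update_tracks(tracks, frame)   # CPU tracking on every frame
        try:                               # pick up a detection result if one is ready
            tracks = reinit_tracks(tracks, det_q.get_nowait())
        except queue.Empty:
            pass
        yield tracks

if __name__ == "__main__":
    for i, tracks in enumerate(track_stream(f"frame_{i}" for i in range(20))):
        print(i, tracks)
    # The daemon detector thread is left to exit with the process.

In this sketch the CPU-side tracker produces boxes at frame rate while each detection result arrives several frames late; a real system would have to compensate for that delay when merging detections into tracks, which is the kind of issue the paper's tracker-assisted confidence boosting and identity-association ensemble are aimed at.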
Pages: 14
Related Papers
50 records in total (showing [21]-[30])
  • [21] Occlusion Robust Object Detection and Tracking on a Real-time Drone
    Kim, Taeyeon
    Wee, Inhwan
    Shim, David Hyunchul
    2019 19TH INTERNATIONAL CONFERENCE ON CONTROL, AUTOMATION AND SYSTEMS (ICCAS 2019), 2019, : 1627 - 1631
  • [22] 4K4D: Real-Time 4D View Synthesis at 4K Resolution
    Xu, Zhen
    Peng, Sida
    Lin, Haotong
    He, Guangzhao
    Sun, Jiaming
    Shen, Yujun
    Bao, Hujun
    Zhou, Xiaowei
    2024 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2024, : 20029 - 20040
  • [23] Automatic pedestrian detection and tracking for real-time video surveillance
    Yang, HD
    Sin, BK
    Lee, SW
    AUDIO-BASED AND VIDEO-BASED BIOMETRIC PERSON AUTHENTICATION, PROCEEDINGS, 2003, 2688 : 242 - 250
  • [24] AdaPT: Real-time Adaptive Pedestrian Tracking for crowded scenes
    Bera, Aniket
    Galoppo, Nico
    Sharlet, Dillon
    Lake, Adam
    Manocha, Dinesh
    2014 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA), 2014, : 1801 - 1808
  • [25] A Real-Time Pedestrian Tracking Method Based on Enhanced Tracktor
    Song, Huawei
    Li, Jian
    Wan, Fangjie
    Hong, Jiahao
    PROCEEDINGS OF 2023 7TH INTERNATIONAL CONFERENCE ON ELECTRONIC INFORMATION TECHNOLOGY AND COMPUTER ENGINEERING, EITCE 2023, 2023, : 812 - 817
  • [26] Real-Time Infrastructureless Indoor Tracking for Pedestrian Using a Smartphone
    Yang, Zhe
    Pan, Yun
    Tian, Qinglin
    Huan, Ruohong
    IEEE SENSORS JOURNAL, 2019, 19 (22) : 10782 - 10795
  • [27] Real-Time Multiple Pedestrian Tracking Based on Object Identification
    Kim, Dohun
    Kim, Heegwang
    Shin, Jungsup
    Mok, Yeongheon
    Paik, Joonki
    2019 IEEE 9TH INTERNATIONAL CONFERENCE ON CONSUMER ELECTRONICS (ICCE-BERLIN), 2019, : 363 - 365
  • [28] Towards reliable real-time multiview tracking
    Li, Y
    Hilton, A
    Illingworth, J
    2001 IEEE WORKSHOP ON MULTI-OBJECT TRACKING, PROCEEDINGS, 2001, : 43 - 50
  • [29] FPGA IP for Real-time 4K HDR Image Decoding in VR Devices
    Kamalavasan, Kamlakannan
    Natheesan, Ratnasegar
    Pradeep, Kathirgamaraja
    Gowthaman, Sivakumaran
    Aravinth, Sivakaneshan
    Pasqual, Ajith
    2019 IEEE 10TH LATIN AMERICAN SYMPOSIUM ON CIRCUITS & SYSTEMS (LASCAS), 2019, : 101 - 104
  • [30] 4K Real-Time and Parallel Software Video Decoder for Multilayer HEVC Extensions
    Hamidouche, Wassim
    Raulet, Mickael
    Deforges, Olivier
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 2016, 26 (01) : 169 - 180