Counting Vehicles with Deep Learning in Onboard UAV Imagery

Cited by: 26
Authors
Amato, Giuseppe [1 ]
Ciampi, Luca [1 ]
Falchi, Fabrizio [1 ]
Gennaro, Claudio [1 ]
Affiliations
[1] CNR, Inst Informat Sci & Technol, Pisa, Italy
Keywords
Object Counting; Deep Learning; Convolutional Neural Networks; Onboard Embedded Processing; Real-time Vehicle Detection; Drones; UAV;
DOI
10.1109/iscc47284.2019.8969620
CLC Number
TP18 (Artificial Intelligence Theory);
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
The integration of mobile and ubiquitous computing with deep learning methods is a promising emerging trend that aims to move the processing task closer to the data source rather than bringing the data to a central node. The advantages of this approach range from bandwidth reduction and high scalability to high reliability, to name a few. In this paper, we propose a real-time deep learning approach to automatically detect and count vehicles in videos taken from a UAV (Unmanned Aerial Vehicle). Our solution relies on a convolutional neural network-based model, fine-tuned to the specific application domain, that precisely localizes vehicle instances using a regression approach, straight from image pixels to bounding box coordinates, reasoning globally about the image when making predictions and implicitly encoding contextual information. A comprehensive experimental evaluation on real-world datasets shows that our approach achieves state-of-the-art performance. Furthermore, our solution achieves real-time performance, running at 4 Frames Per Second on an NVIDIA Jetson TX2 board, showing the potential of this approach for real-time processing on UAVs.
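The abstract describes counting by detection: a single-shot model regresses bounding boxes directly from image pixels, and the vehicle count is the number of confident, non-duplicate boxes. A minimal sketch of that final counting step is shown below; the detector itself is mocked, and the box format, confidence threshold, and IoU threshold are illustrative assumptions, not the authors' actual implementation.

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union else 0.0

def count_vehicles(detections, conf_thresh=0.5, iou_thresh=0.45):
    """Greedy non-maximum suppression over (box, score) pairs, then count.

    Boxes below conf_thresh are discarded; a box overlapping an already
    kept box by more than iou_thresh is treated as a duplicate detection.
    """
    kept = []
    for box, score in sorted(detections, key=lambda d: -d[1]):
        if score < conf_thresh:
            continue
        if all(iou(box, k) < iou_thresh for k in kept):
            kept.append(box)
    return len(kept)

# Example: three raw detections, two of which overlap heavily and thus
# correspond to the same vehicle.
dets = [((10, 10, 50, 50), 0.9),
        ((12, 11, 52, 49), 0.7),   # duplicate of the first vehicle
        ((100, 80, 140, 120), 0.8)]
print(count_vehicles(dets))  # prints 2
```

On constrained hardware such as the Jetson TX2 mentioned in the abstract, this post-processing step is negligible; the real-time budget is dominated by the network's forward pass.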
Pages: 563 - 568
Page count: 6
Related Papers
(50 total)
  • [31] Saliency Detection and Deep Learning-Based Wildfire Identification in UAV Imagery
    Zhao, Yi
    Ma, Jiale
    Li, Xiaohui
    Zhang, Jie
    SENSORS, 2018, 18 (03)
  • [32] Development of phenotyping system using low altitude UAV imagery and deep learning
    Lyu, Suxing
    Noguchi, Noboru
    Ospina, Ricardo
    Kishima, Yuji
    INTERNATIONAL JOURNAL OF AGRICULTURAL AND BIOLOGICAL ENGINEERING, 2021, 14 (01) : 207 - 215
  • [33] Improving the mapping of coastal invasive species using UAV imagery and deep learning
    Cruz, Charmaine
    McGuinness, Kevin
    Perrin, Philip M.
    O'Connell, Jerome
    Martin, James R.
    Connolly, John
    INTERNATIONAL JOURNAL OF REMOTE SENSING, 2023, 44 (18) : 5713 - 5735
  • [34] A deep reinforcement learning-based approach to onboard trajectory generation for hypersonic vehicles
    Bao, C. Y.
    Zhou, X.
    Wang, P.
    He, R. Z.
    Tang, G. J.
    AERONAUTICAL JOURNAL, 2023, 127 (1315): 1638 - 1658
  • [35] UAV Onboard STAR-RIS Service Enhancement Mechanism Based on Deep Reinforcement Learning
    Yan, Junjie
    Xu, Yichen
    Yuan, Haohao
    Xue, Chunhua
    SENSORS, 2025, 25 (06)
  • [36] Plot-Level Maize Early Stage Stand Counting and Spacing Detection Using Advanced Deep Learning Algorithms Based on UAV Imagery
    Wang, Biwen
    Zhou, Jing
    Costa, Martin
    Kaeppler, Shawn M.
    Zhang, Zhou
    AGRONOMY-BASEL, 2023, 13 (07)
  • [37] Cotton Seedling Detection and Counting Based on UAV Multispectral Images and Deep Learning Methods
    Feng, Yingxiang
    Chen, Wei
    Ma, Yiru
    Zhang, Ze
    Gao, Pan
    Lv, Xin
    REMOTE SENSING, 2023, 15 (10)
  • [38] AUTOMATIC ROAD CRACK RECOGNITION BASED ON DEEP LEARNING NETWORKS FROM UAV IMAGERY
    Samadzadegan, F.
    Javan, F. Dadrass
    Hasanlou, M.
    Gholamshahi, M.
    Mahini, F. Ashtari
    ISPRS GEOSPATIAL CONFERENCE 2022, JOINT 6TH SENSORS AND MODELS IN PHOTOGRAMMETRY AND REMOTE SENSING, SMPR/4TH GEOSPATIAL INFORMATION RESEARCH, GIRESEARCH CONFERENCES, VOL. 10-4, 2023, : 685 - 690
  • [39] Identification of pathogens in corn using near-infrared UAV imagery and deep learning
    Antolinez Garcia, Alfonso
    Caceres Campana, Jorge W.
    PRECISION AGRICULTURE, 2023, 24 (02) : 783 - 806
  • [40] A deep learning based approach for semantic segmentation of small fires from UAV imagery
    Saxena, Vishu
    Jain, Yash
    Mittal, Sparsh
    REMOTE SENSING LETTERS, 2025, 16 (03) : 277 - 289