Vehicle and pedestrian video-tracking with classification based on deep convolutional neural networks

Cited: 4
Authors
Forero, Alejandro [1 ]
Calderon, Francisco [2 ]
Affiliations
[1] Secretaria Dist Movilidad, Bogota, Colombia
[2] Pontificia Univ Javeriana, Secretaria Dist Movilidad, Bogota, Colombia
Keywords
image processing; video object tracking; video-tracking; object detection; vehicle counting
DOI
10.1109/stsiva.2019.8730234
CLC classification
TP301 [Theory and Methods]
Discipline code
081202
Abstract
In this article we propose an algorithm for the classification, tracking, and counting of vehicles and pedestrians in video sequences. The algorithm is divided into two parts: a classification algorithm based on convolutional neural networks, implemented using the You Only Look Once (YOLO) method, and a proposed algorithm for tracking regions of interest based on a well-defined taxonomy. For the classification stage, we train and evaluate performance with a set of more than 50,000 labels, which we make available for further use. The tracking algorithm is evaluated against manual counts in video sequences of different scenarios captured at the management center of the Secretaria Distrital de Movilidad of Bogota.
Pages: 5
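As an illustration of the two-stage pipeline described in the abstract (per-frame YOLO detection followed by tracking of the detected regions and per-class counting), below is a minimal Python sketch. It uses OpenCV's DNN module with stock Darknet YOLO files and a simple greedy IoU tracker; the file names, thresholds, and association rule are assumptions for illustration, not the authors' released implementation or taxonomy.

```python
import cv2

# Hypothetical Darknet model files; the paper's trained weights and the
# 50,000-label dataset are not reproduced here.
net = cv2.dnn.readNetFromDarknet("yolov3.cfg", "yolov3.weights")

def detect(frame, conf_thr=0.5):
    """Run the YOLO detector on one frame; return (class_id, box) pairs."""
    blob = cv2.dnn.blobFromImage(frame, 1 / 255.0, (416, 416), swapRB=True, crop=False)
    net.setInput(blob)
    outs = net.forward(net.getUnconnectedOutLayersNames())
    h, w = frame.shape[:2]
    dets = []
    for out in outs:
        for row in out:
            scores = row[5:]
            cls = int(scores.argmax())
            if scores[cls] > conf_thr:
                cx, cy, bw, bh = row[0] * w, row[1] * h, row[2] * w, row[3] * h
                dets.append((cls, (cx - bw / 2, cy - bh / 2, bw, bh)))
    return dets

def iou(a, b):
    """Intersection over union of two (x, y, w, h) boxes."""
    ix = max(0, min(a[0] + a[2], b[0] + b[2]) - max(a[0], b[0]))
    iy = max(0, min(a[1] + a[3], b[1] + b[3]) - max(a[1], b[1]))
    inter = ix * iy
    union = a[2] * a[3] + b[2] * b[3] - inter
    return inter / union if union > 0 else 0.0

tracks, counts, next_id = {}, {}, 0

def update(dets, iou_thr=0.3):
    """Greedy IoU association; every newly created track is counted once."""
    global next_id
    unmatched = list(dets)
    for tid, (cls, box) in list(tracks.items()):
        best = max((d for d in unmatched if d[0] == cls and iou(box, d[1]) >= iou_thr),
                   key=lambda d: iou(box, d[1]), default=None)
        if best is None:
            del tracks[tid]          # no match this frame: drop the track
        else:
            tracks[tid] = best       # update the track with the new box
            unmatched.remove(best)
    for cls, box in unmatched:       # unmatched detections start new tracks
        tracks[next_id] = (cls, box)
        counts[cls] = counts.get(cls, 0) + 1
        next_id += 1

cap = cv2.VideoCapture("traffic.mp4")  # hypothetical input sequence
while True:
    ok, frame = cap.read()
    if not ok:
        break
    update(detect(frame))
print(counts)  # per-class counts accumulated over the sequence
```

In practice a tracker of this kind would also keep tracks alive for a few missed frames and restrict counting to a defined region or counting line, which is the role the paper's taxonomy of regions of interest plays; the sketch above only shows the detection-association-count structure.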
Related papers
50 records in total
  • [1] Multispectral pedestrian detection based on deep convolutional neural networks
    Hou, Ya-Li
    Song, Yaoyao
    Hao, Xiaoli
    Shen, Yan
    Qian, Manyi
    Chen, Houjin
    INFRARED PHYSICS & TECHNOLOGY, 2018, 94 : 69 - 77
  • [2] Multispectral Pedestrian Detection Based on Deep Convolutional Neural Networks
    Hou, Ya-Li
    Song, Yaoyao
    Tao, Xiaoli
    Shen, Yan
    Qian, Manyi
    2017 IEEE INTERNATIONAL CONFERENCE ON SIGNAL PROCESSING, COMMUNICATIONS AND COMPUTING (ICSPCC), 2017,
  • [3] Combining Very Deep Convolutional Neural Networks and Recurrent Neural Networks for Video Classification
    Kiziltepe, Rukiye Savran
    Gan, John Q.
    Escobar, Juan Jose
    ADVANCES IN COMPUTATIONAL INTELLIGENCE, IWANN 2019, PT II, 2019, 11507 : 811 - 822
  • [4] Deep Convolutional Neural Networks for pedestrian detection
    Tome, D.
    Monti, F.
    Baroffio, L.
    Bondi, L.
    Tagliasacchi, M.
    Tubaro, S.
    SIGNAL PROCESSING-IMAGE COMMUNICATION, 2016, 47 : 482 - 489
  • [5] VEHICLE CLASSIFICATION BASED ON DEEP CONVOLUTIONAL NEURAL NETWORKS MODEL FOR TRAFFIC SURVEILLANCE SYSTEMS
    Zakria
    Cai, Jingye
    Deng, Jianhua
    Khokhar, Muhammad Saddam
    Aftab, Muhammad Umar
    2018 15TH INTERNATIONAL COMPUTER CONFERENCE ON WAVELET ACTIVE MEDIA TECHNOLOGY AND INFORMATION PROCESSING (ICCWAMTIP), 2018, : 224 - 227
  • [6] Categorical Vehicle Classification and Tracking using Deep Neural Networks
    Sharma, Deependra
    Jaffery, Zainul Abdin
    INTERNATIONAL JOURNAL OF ADVANCED COMPUTER SCIENCE AND APPLICATIONS, 2021, 12 (09) : 564 - 574
  • [7] Pedestrian Tracking Algorithm for Video Surveillance Based on Lightweight Convolutional Neural Network
    Wei, Honglei
    Zhai, Xianyi
    Wu, Hongda
    IEEE ACCESS, 2024, 12 : 24831 - 24842
  • [8] Vehicle-Pedestrian Classification with Road Context Recognition Using Convolutional Neural Networks
    Billones, Robert Kerwin C.
    Bandala, Argel A.
    Gan Lim, Laurence A.
    Culaba, Alvin B.
    Vicerra, Ryan Rhay P.
    Sybingco, Edwin
    Fillone, Alexis M.
    Dadios, Elmer P.
    2018 IEEE 10TH INTERNATIONAL CONFERENCE ON HUMANOID, NANOTECHNOLOGY, INFORMATION TECHNOLOGY, COMMUNICATION AND CONTROL, ENVIRONMENT AND MANAGEMENT (HNICEM), 2018,
  • [9] Classification of vehicle types using fused deep convolutional neural networks
    Qian, Zichen
    Zhao, Chihang
    Zhang, Bailing
    Lin, Shengmei
    Hua, Liru
    Li, Hao
    Ma, Xiaogang
    Ma, Teng
    Wang, Xinliang
    JOURNAL OF INTELLIGENT & FUZZY SYSTEMS, 2022, 42 (06) : 5125 - 5137
  • [10] VEHICLE ACCIDENT AND TRAFFIC CLASSIFICATION USING DEEP CONVOLUTIONAL NEURAL NETWORKS
    Kumeda, Bulbula
    Zhang Fengli
    Oluwasanmi, Ariyo
    Owusu, Forster
    Assefa, Maregu
    Amenu, Temesgen
    2019 16TH INTERNATIONAL COMPUTER CONFERENCE ON WAVELET ACTIVE MEDIA TECHNOLOGY AND INFORMATION PROCESSING (ICWAMTIP), 2019, : 323 - 328