Pedestrian detection in underground mines via parallel feature transfer network

Cited: 21
Authors
Wei, Xing [1]
Zhang, Haitao [1]
Liu, Shaofan [1]
Lu, Yang [1]
Affiliations
[1] Hefei Univ Technol, Hefei, Anhui, Peoples R China
Funding
National Key Research and Development Program of China;
Keywords
Pedestrian detection; Underground mine; Deep learning network; Parallel feature transfer; Gated unit; Unmanned driving;
DOI
10.1016/j.patcog.2020.107195
CLC Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Pedestrian detection has been one of the key technologies in computer vision for autonomous driving in underground mines. However, such pedestrian detection is easily affected by complex environmental factors, such as uneven light, dense dust and cable interference. Recently, pedestrian detection has been addressed as an object detection task, which has achieved significant advances within the framework of deep neural networks. In this paper, we propose a novel parallel feature transfer network based detector called PftNet, which achieves better efficiency than one-stage methods while maintaining accuracy comparable to that of two-stage methods. PftNet consists of two interconnected modules, i.e., the pedestrian identification module and the pedestrian location module. The former coarsely adjusts the locations and sizes of anchor boxes, filters out negative anchor boxes, and provides better initialization for the regression. The latter enables PftNet to adapt to objects of different scales and aspect ratios and further improves the regression accuracy. Meanwhile, a feature transfer block comprising gated units is designed to transmit pedestrian characteristics between the two modules. Extensive experiments on a self-annotated underground mine dataset as well as the INRIA and ETH datasets show that PftNet achieves state-of-the-art detection efficiency with high accuracy, which is significant for realizing unmanned driving systems in mines. (C) 2020 Elsevier Ltd. All rights reserved.
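Note: the abstract describes, but does not fully specify, the gated feature transfer block that links the identification and location modules. The following is a minimal PyTorch-style sketch of one plausible gated transfer mechanism, for illustration only; the class name GatedTransferBlock, the channel count, and the sigmoid-gated residual fusion are assumptions, not the paper's actual design.

    import torch
    import torch.nn as nn

    class GatedTransferBlock(nn.Module):
        """Illustrative gated unit: fuses a feature map from the identification
        module into the location module, weighted by a learned gate."""
        def __init__(self, channels: int):
            super().__init__()
            # 1x1 convolutions keep the spatial resolution unchanged
            self.transform = nn.Conv2d(channels, channels, kernel_size=1)
            self.gate = nn.Sequential(
                nn.Conv2d(channels, channels, kernel_size=1),
                nn.Sigmoid(),  # gate values in (0, 1) decide how much to pass through
            )

        def forward(self, ident_feat: torch.Tensor, loc_feat: torch.Tensor) -> torch.Tensor:
            g = self.gate(ident_feat)             # per-location, per-channel gate
            transferred = self.transform(ident_feat)
            return loc_feat + g * transferred     # gated residual transfer

    if __name__ == "__main__":
        block = GatedTransferBlock(channels=256)
        ident = torch.randn(1, 256, 40, 40)   # feature map from identification module
        loc = torch.randn(1, 256, 40, 40)     # feature map from location module
        out = block(ident, loc)
        print(out.shape)                      # torch.Size([1, 256, 40, 40])

In this sketch the gate lets the location module suppress or emphasize transferred identification features per position, which is one common way a "gated unit" between parallel branches is realized.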
Pages: 12
Related Papers
50 records in total
  • [11] Parallel Feature Network For Saliency Detection
    Fang, Zheng
    Cao, Tieyong
    Yang, Jibin
    Sun, Meng
    IEICE TRANSACTIONS ON FUNDAMENTALS OF ELECTRONICS COMMUNICATIONS AND COMPUTER SCIENCES, 2019, E102A (02) : 480 - 485
  • [12] A multispectral feature fusion network for robust pedestrian detection
    Song, Xiaoru
    Gao, Song
    Chen, Chaobo
    ALEXANDRIA ENGINEERING JOURNAL, 2021, 60 (01) : 73 - 85
  • [13] A Fast Pedestrian Detection via Modified HOG Feature
    Li Weixing
    Su Haijun
    Pan Feng
    Gao Qi
    Quan Bin
    2015 34TH CHINESE CONTROL CONFERENCE (CCC), 2015, : 3870 - 3873
  • [14] Lightweight network and parallel computing for fast pedestrian detection
    Wu, Jianpeng
    Men, Yao
    Chen, DeSheng
    INTERNATIONAL JOURNAL OF CIRCUIT THEORY AND APPLICATIONS, 2021, 49 (04) : 1040 - 1049
  • [15] Pedestrian Detection under Parallel Feature Fusion Based on Choquet Integral
    Yang, Rong
    Wang, Yun
    Xu, Ying
    Qiu, Li
    Li, Qiang
SYMMETRY-BASEL, 2021, 13 (02) : 1 - 17
  • [16] FEATURE FUSING OF FEATURE PYRAMID NETWORK FOR MULTI-SCALE PEDESTRIAN DETECTION
    Tesema, Fiseha B.
    Lin, Junpeng
    Ou, Jie
    Wu, Hong
    Zhu, William
    2018 15TH INTERNATIONAL COMPUTER CONFERENCE ON WAVELET ACTIVE MEDIA TECHNOLOGY AND INFORMATION PROCESSING (ICCWAMTIP), 2018, : 10 - 13
  • [17] Scale-Sensitive Feature Reassembly Network for Pedestrian Detection
    Yang, Xiaoting
    Liu, Qiong
    SENSORS, 2021, 21 (12)
  • [18] Lightweight Pedestrian Detection Based on Feature Multiplexed Residual Network
    Sha, Mengzhou
    Zeng, Kai
    Tao, Zhimin
    Wang, Zhifeng
    Liu, Quanjun
    ELECTRONICS, 2023, 12 (04)
  • [19] Parallel Feature Pyramid Network for Object Detection
    Kim, Seung-Wook
    Kook, Hyong-Keun
    Sun, Jee-Young
    Kang, Mun-Cheon
    Ko, Sung-Jea
    COMPUTER VISION - ECCV 2018, PT V, 2018, 11209 : 239 - 256
  • [20] Pedestrian Detection Using Regional Proposal Network with Feature Fusion
    Lv, Xiaogang
    Zhang, Xiaotao
    Jiang, Yinghua
    Zhang, Jianxin
    2018 EIGHTH INTERNATIONAL CONFERENCE ON IMAGE PROCESSING THEORY, TOOLS AND APPLICATIONS (IPTA), 2018, : 108 - 112