ALODAD: An Anchor-Free Lightweight Object Detector for Autonomous Driving

Cited: 17
Authors
Liang, Tianjiao
Bao, Hong
Pan, Weiguo [1 ]
Pan, Feng
Affiliations
[1] Beijing Union Univ, Beijing Key Lab Informat Serv Engn, Beijing 100101, Peoples R China
Source
IEEE ACCESS | 2022, Vol. 10
Funding
National Natural Science Foundation of China
Keywords
Feature extraction; Object detection; Convolution; Autonomous vehicles; Computational modeling; Location awareness; Neural networks; Autonomous driving; deep learning; lightweight; object detection; NEURAL-NETWORK;
DOI
10.1109/ACCESS.2022.3166923
CLC Classification
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
Vision-based object detection is an essential component of autonomous driving. Because vehicles typically have limited on-board computing resources, a small detection model is required; at the same time, high detection accuracy and real-time inference speed are needed to ensure driving safety. This paper proposes ALODAD, an anchor-free lightweight object detector for autonomous driving. ALODAD incorporates an attention scheme into the lightweight GhostNet backbone and builds an anchor-free detection framework, achieving lower computational cost and fewer parameters while maintaining high detection accuracy. Specifically, the lightweight backbone integrates a convolutional block attention module (CBAM) that emphasizes the informative features in traffic-scene images for accurate bounding-box prediction, and feature pyramids are constructed for multi-scale object detection. An intersection-over-union (IoU) branch is added to the decoupled detection head to rank the large number of candidate detections accurately. Data augmentation was applied during training to increase data diversity. Extensive experiments on benchmarks demonstrate that the proposed method outperforms the baseline, achieving higher detection accuracy while meeting the real-time requirements of autonomous driving. Compared with the YOLOv5 and RetinaNet models, ALODAD achieved 98.7% AP50 and 94.5% AP75 on the BCTSDB dataset.
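The abstract names two architectural pieces: CBAM-style attention inside the GhostNet backbone, and a decoupled anchor-free head with an extra IoU branch for ranking candidates. The following is a minimal PyTorch sketch of those two pieces, not the authors' released code; the channel width, the two-conv stacks, and the (l, t, r, b) box parameterization are assumptions chosen for brevity.

```python
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    """Channel attention: global avg+max pooling, shared MLP, sigmoid gate."""

    def __init__(self, channels, reduction=16):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, 1, bias=False),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1, bias=False),
        )

    def forward(self, x):
        avg = self.mlp(torch.mean(x, dim=(2, 3), keepdim=True))
        mx = self.mlp(torch.amax(x, dim=(2, 3), keepdim=True))
        return torch.sigmoid(avg + mx) * x


class SpatialAttention(nn.Module):
    """Spatial attention: pool across channels, 7x7 conv, sigmoid gate."""

    def __init__(self, kernel_size=7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2, bias=False)

    def forward(self, x):
        pooled = torch.cat(
            [x.mean(dim=1, keepdim=True), x.amax(dim=1, keepdim=True)], dim=1
        )
        return torch.sigmoid(self.conv(pooled)) * x


class CBAM(nn.Module):
    """Convolutional Block Attention Module: channel then spatial attention."""

    def __init__(self, channels):
        super().__init__()
        self.channel = ChannelAttention(channels)
        self.spatial = SpatialAttention()

    def forward(self, x):
        return self.spatial(self.channel(x))


class DecoupledHead(nn.Module):
    """Anchor-free decoupled head: separate classification and regression
    stacks, plus an IoU branch whose output re-ranks candidate boxes."""

    def __init__(self, channels, num_classes):
        super().__init__()

        def stack():
            return nn.Sequential(
                nn.Conv2d(channels, channels, 3, padding=1),
                nn.ReLU(inplace=True),
                nn.Conv2d(channels, channels, 3, padding=1),
                nn.ReLU(inplace=True),
            )

        self.cls_stack, self.reg_stack = stack(), stack()
        self.cls_out = nn.Conv2d(channels, num_classes, 1)  # class scores per location
        self.box_out = nn.Conv2d(channels, 4, 1)  # distances to box sides (l, t, r, b)
        self.iou_out = nn.Conv2d(channels, 1, 1)  # predicted localization quality (IoU)

    def forward(self, x):
        cls_feat, reg_feat = self.cls_stack(x), self.reg_stack(x)
        return self.cls_out(cls_feat), self.box_out(reg_feat), self.iou_out(reg_feat)


# One FPN level of an attention-refined feature map through the head.
feat = CBAM(64)(torch.randn(1, 64, 80, 80))
cls, box, iou = DecoupledHead(64, num_classes=3)(feat)
print(cls.shape, box.shape, iou.shape)  # (1,3,80,80) (1,4,80,80) (1,1,80,80)
```

At inference, a typical anchor-free pipeline multiplies the classification score by the predicted IoU before non-maximum suppression so that well-localized boxes rank higher; the IoU branch described in the abstract serves that ranking role.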
Pages: 40701-40714
Page count: 14
Related Papers
50 records in total
  • [1] AspectNet: Aspect-Aware Anchor-Free Detector for Autonomous Driving
    Liang, Tianjiao
    Bao, Hong
    Pan, Weiguo
    Fan, Xinyue
    Li, Han
    APPLIED SCIENCES-BASEL, 2022, 12 (12)
  • [2] L4Net: An anchor-free generic object detector with attention mechanism for autonomous driving
    Wu, Yanan
    Feng, Songhe
    Huang, Xiankai
    Wu, Zizhang
    IET COMPUTER VISION, 2021, 15 (01): 36-46
  • [3] An Anchor-Free Lightweight Object Detection Network
    Wang, Weina
    Gou, Yunyan
    IEEE ACCESS, 2023, 11: 110361-110374
  • [4] Anchor-Free Object Detection with Scale-Aware Networks for Autonomous Driving
    Piao, Zhengquan
    Wang, Junbo
    Tang, Linbo
    Zhao, Baojun
    Zhou, Shichao
    ELECTRONICS, 2022, 11 (20)
  • [5] A fully convolutional anchor-free object detector
    Zhang, Taoshan
    Li, Zheng
    Sun, Zhikuan
    Zhu, Lin
    VISUAL COMPUTER, 2023, 39 (02): 569-580
  • [6] FCOS: A Simple and Strong Anchor-Free Object Detector
    Tian, Zhi
    Shen, Chunhua
    Chen, Hao
    He, Tong
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2022, 44 (04): 1922-1933
  • [7] ElDet: An Anchor-Free General Ellipse Object Detector
    Wang, Tianhao
    Lu, Changsheng
    Shao, Ming
    Yuan, Xiaohui
    Xia, Siyu
    COMPUTER VISION - ACCV 2022, PT III, 2023, 13843: 223-238
  • [8] An anchor-free object detector with novel corner matching method
    Ma, Tingsong
    Tian, Wenhong
    Kuang, Ping
    Xie, Yuanlun
    KNOWLEDGE-BASED SYSTEMS, 2021, 224
  • [9] MAOD: An Efficient Anchor-Free Object Detector Based on MobileDet
    Chen, Dong
    Shen, Hao
    IEEE ACCESS, 2020, 8: 86564-86572