A feature aggregation network for multispectral pedestrian detection

Cited by: 0
Authors
Yan Gong
Lu Wang
Lisheng Xu
Affiliations
[1] Northeastern University,School of Computer Science and Engineering
[2] Northeastern University,College of Medicine and Biomedical Information Engineering
Source
Applied Intelligence | 2023 / Vol. 53
Keywords
Multispectral pedestrian detection; Feature aggregation; Saliency map; Attention mechanism;
DOI: Not available
Abstract
Pedestrian detection is an important task in many computer vision applications. Because multispectral pedestrian detection alleviates the difficulty of insufficient illumination at night, it has developed rapidly in recent years. However, effective color-thermal image fusion still requires further research. In this paper, we propose a Feature Aggregation Module (FAM) that adaptively captures the cross-channel and cross-dimension information interaction between the two modalities. In addition, we develop a Feature Aggregation Network (FANet) that embeds the proposed FAM into a two-stream network adapted from YOLOv5. FANet is compact (15 MB) and fast (8 ms per frame). Extensive experiments on the KAIST dataset show that the proposed method is effective for multispectral pedestrian detection, especially in the night-time condition, where the Miss Rate is only 8.91%. Moreover, we show that a saliency map computed from the thermal image can be incorporated into FANet to further improve detection accuracy. To verify the generalization ability of the FAM, we also conducted experiments on two person re-identification datasets, Market1501 and Duke. On both datasets, the performance of FAM compares favorably against existing feature fusion mechanisms.
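The abstract describes fusing color and thermal feature maps through an adaptive, attention-style aggregation. The sketch below is not the paper's FAM (whose interaction is learned with trainable layers); it is a simplified, hand-crafted stand-in that illustrates the general idea: derive per-channel weights from both modalities via global average pooling, then take a gated combination of the two feature maps. The function name `fuse_features` and the gating rule are assumptions for illustration only.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def fuse_features(color_feat, thermal_feat):
    """Illustrative cross-modal fusion for feature maps of shape (C, H, W).

    A per-channel gate is derived from global-average-pooled descriptors of
    both modalities, then the two maps are blended as a convex combination.
    """
    # Channel descriptors via global average pooling over spatial dims
    c_desc = color_feat.mean(axis=(1, 2))    # shape (C,)
    t_desc = thermal_feat.mean(axis=(1, 2))  # shape (C,)
    # Hand-crafted gate in (0, 1); the paper's FAM learns this interaction
    gate = sigmoid(c_desc - t_desc)[:, None, None]  # shape (C, 1, 1)
    # Per-channel convex combination of the two modalities
    return gate * color_feat + (1.0 - gate) * thermal_feat

rgb = np.random.rand(8, 16, 16)   # color-stream feature map
ir = np.random.rand(8, 16, 16)    # thermal-stream feature map
fused = fuse_features(rgb, ir)
print(fused.shape)  # (8, 16, 16)
```

Because the gate lies in (0, 1), each fused value stays between the corresponding color and thermal values; a learned module would instead let training decide how strongly each channel favors one modality (e.g. thermal channels at night).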
Pages: 22117-22131
Page count: 14
Related Papers
50 results total
  • [1] A feature aggregation network for multispectral pedestrian detection
    Gong, Yan
    Wang, Lu
    Xu, Lisheng
    APPLIED INTELLIGENCE, 2023, 53 (19) : 22117 - 22131
  • [2] A multispectral feature fusion network for robust pedestrian detection
    Song, Xiaoru
    Gao, Song
    Chen, Chaobo
    ALEXANDRIA ENGINEERING JOURNAL, 2021, 60 (01) : 73 - 85
  • [3] Trans-scale feature aggregation network for multiscale pedestrian detection
    Cao S.
    Zhang X.
    Ma J.
    Journal of Beijing University of Aeronautics and Astronautics, 2020, 46 (09): 1786 - 1796
  • [4] Improving multispectral pedestrian detection with scale-aware permutation attention and adjacent feature aggregation
    Zuo, Xin
    Wang, Zhi
    Shen, Jifeng
    Yang, Wankou
    IET COMPUTER VISION, 2023, 17 (07) : 726 - 738
  • [5] Adaptive spatial pixel-level feature fusion network for multispectral pedestrian detection
    Fu, Lei
    Gu, Wen-bin
    Ai, Yong-bao
    Li, Wei
    Wang, Dong
    INFRARED PHYSICS & TECHNOLOGY, 2021, 116
  • [6] Guided Attentive Feature Fusion for Multispectral Pedestrian Detection
    Zhang, Heng
    Fromont, Elisa
    Lefevre, Sebastien
    Avignon, Bruno
    2021 IEEE WINTER CONFERENCE ON APPLICATIONS OF COMPUTER VISION (WACV 2021), 2021, : 72 - 80
  • [8] Multispectral pedestrian detection based on feature complementation and enhancement
    Nie, Linzhen
    Lu, Meihe
    He, Zhiwei
    Hu, Jiachen
    Yin, Zhishuai
    IET INTELLIGENT TRANSPORT SYSTEMS, 2024, 18 (11) : 2166 - 2177
  • [9] Attentive Alignment Network for Multispectral Pedestrian Detection
    Chen, Nuo
    Xie, Jin
    Nie, Jing
    Cao, Jiale
    Shao, Zhuang
    Pang, Yanwei
    PROCEEDINGS OF THE 31ST ACM INTERNATIONAL CONFERENCE ON MULTIMEDIA, MM 2023, 2023, : 3787 - 3795
  • [10] Locality guided cross-modal feature aggregation and pixel-level fusion for multispectral pedestrian detection
    Cao, Yanpeng
    Luo, Xing
    Yang, Jiangxin
    Cao, Yanlong
    Yang, Michael Ying
    INFORMATION FUSION, 2022, 88 : 1 - 11