ReBiDet: An Enhanced Ship Detection Model Utilizing ReDet and Bi-Directional Feature Fusion

Cited by: 3
Authors
Yan, Zexin [1]
Li, Zhongbo [1]
Xie, Yongqiang [1]
Li, Chengyang [1,2]
Li, Shaonan [1]
Sun, Fangwei [1]
Affiliations
[1] Academy of Military Sciences, Institute of Systems Engineering, Beijing 100000, People's Republic of China
[2] Peking University, School of Computer Science, Beijing 100000, People's Republic of China
Source
APPLIED SCIENCES-BASEL, 2023, Vol. 13, Issue 12
Keywords
artificial intelligence; deep learning; remote sensing images; ship detection; bi-directional feature fusion; feature pyramid network; anchor size; K-means; sampler
DOI
10.3390/app13127080
CLC Number (Chinese Library Classification)
O6 [Chemistry];
Subject Classification Code
0703;
Abstract
To enhance ship detection accuracy in complex scenes with large variations in object scale, this study introduces three enhancements to ReDet, yielding a more powerful ship detection model called the rotation-equivariant bidirectional feature fusion detector (ReBiDet). First, the feature pyramid network (FPN) structure in ReDet is replaced with a rotation-equivariant bidirectional feature fusion feature pyramid network (ReBiFPN) to capture and enrich multiscale feature information more effectively. Second, K-means clustering is used to group the aspect ratios of the ground truth boxes in the dataset, and the anchor size settings are adjusted accordingly. Third, a difficult positive reinforcement learning (DPRL) sampler is employed instead of the random sampler to address the scale imbalance between objects and backgrounds in the dataset, enabling the model to prioritize challenging positive examples. Extensive experiments on the HRSC2016 and DOTA remote sensing image datasets validate the effectiveness of the proposed improvements in complex environments and in small object detection tasks. ReBiDet achieves state-of-the-art performance on remote sensing object detection tasks: compared with ReDet and other advanced models, it improves mAP by 3.20, 0.42, and 1.16 points on HRSC2016, DOTA-v1.0, and DOTA-v1.5, respectively, while adding only 0.82 million parameters.
Pages: 25
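
The abstract's first change replaces ReDet's FPN with a bidirectional (top-down plus bottom-up) fusion pyramid. As a rough illustration of what bidirectional fusion means, here is a minimal plain-PyTorch sketch; it omits the rotation-equivariant convolutions that distinguish ReBiFPN, and the module and parameter names are hypothetical, not the authors' code.

```python
# Generic sketch of bi-directional (top-down then bottom-up) feature
# fusion over a feature pyramid, in plain PyTorch. ReBiFPN additionally
# uses rotation-equivariant convolutions; this simplified module only
# illustrates the two fusion passes.
import torch
import torch.nn as nn
import torch.nn.functional as F

class BiDirectionalFusion(nn.Module):
    def __init__(self, channels: int, num_levels: int = 4):
        super().__init__()
        self.td_convs = nn.ModuleList(
            [nn.Conv2d(channels, channels, 3, padding=1) for _ in range(num_levels - 1)])
        self.bu_convs = nn.ModuleList(
            [nn.Conv2d(channels, channels, 3, padding=1) for _ in range(num_levels - 1)])

    def forward(self, feats):
        # feats: list of maps ordered fine -> coarse, all with `channels` channels.
        td = list(feats)
        # Top-down pass: upsample coarse semantics into finer levels.
        for i in range(len(td) - 2, -1, -1):
            up = F.interpolate(td[i + 1], size=td[i].shape[-2:], mode="nearest")
            td[i] = self.td_convs[i](td[i] + up)
        out = list(td)
        # Bottom-up pass: push fine localization detail back to coarser levels.
        for i in range(1, len(out)):
            down = F.interpolate(out[i - 1], size=out[i].shape[-2:], mode="nearest")
            out[i] = self.bu_convs[i - 1](out[i] + down)
        return out

# Usage: four pyramid levels with 256 channels each.
feats = [torch.randn(1, 256, s, s) for s in (64, 32, 16, 8)]
outs = BiDirectionalFusion(256)(feats)
```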
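The second change derives anchor settings from K-means clusters over ground-truth aspect ratios. A minimal sketch of that kind of clustering step, assuming scikit-learn and hypothetical function names (the paper's exact clustering setup is not given in this record):

```python
# Illustrative sketch of the anchor-setting step: cluster ground-truth
# box aspect ratios with K-means and use the cluster centers as anchor
# aspect ratios. Names and the scikit-learn dependency are assumptions.
import numpy as np
from sklearn.cluster import KMeans

def cluster_anchor_ratios(gt_wh: np.ndarray, n_clusters: int = 3) -> np.ndarray:
    """gt_wh: (N, 2) array of ground-truth (width, height) pairs.
    Returns the sorted cluster centers as candidate anchor aspect ratios."""
    ratios = (gt_wh[:, 0] / gt_wh[:, 1]).reshape(-1, 1)
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(ratios)
    return np.sort(km.cluster_centers_.ravel())

# Ships in remote sensing imagery are typically elongated, so the
# recovered ratios skew well above 1.
boxes = np.array([[100, 20], [80, 16], [60, 30], [40, 40], [120, 15]], dtype=float)
print(cluster_anchor_ratios(boxes))  # -> approximately [1.5, 5.0, 8.0]
```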
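The third change swaps the random sampler for a difficulty-aware one. The DPRL algorithm itself is not described in this record, so the sketch below shows only the generic idea of loss-weighted positive sampling, with hypothetical names throughout; it is one plausible reading, not the paper's method.

```python
# Illustration of "hard positive" sampling: instead of drawing positive
# proposals uniformly at random, draw them with probability proportional
# to their current loss, so difficult positives are revisited more often.
# This is a generic sketch, not the paper's exact DPRL sampler.
import torch

def sample_hard_positives(pos_losses: torch.Tensor, num_samples: int) -> torch.Tensor:
    """pos_losses: per-positive loss values, shape (P,); returns sampled indices."""
    if pos_losses.numel() <= num_samples:
        return torch.arange(pos_losses.numel())
    # torch.multinomial accepts unnormalized non-negative weights.
    weights = pos_losses.clamp(min=1e-6)
    return torch.multinomial(weights, num_samples, replacement=False)

# Usage: keep 2 of 5 positives, biased toward the high-loss ones.
losses = torch.tensor([0.1, 2.3, 0.05, 1.7, 0.4])
print(sample_hard_positives(losses, 2))
```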