Context-Aware DGCN-Based Ship Formation Recognition in Remote Sensing Images

Cited by: 1
Authors
Zhang, Tao [1 ]
Yang, Xiaogang [1 ]
Lu, Ruitao [1 ]
Xie, Xueli [1 ]
Wang, Siyu [1 ]
Su, Shuang [1 ]
Affiliations
[1] Rocket Force Univ Engn, Dept Automat Engn, Xian 710025, Shaanxi, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
remote sensing; arbitrary-oriented ship detection; ship formation recognition; key point estimation; Delaunay triangulation; context-aware dense graph convolution network;
DOI
10.3390/rs16183435
CLC Number
X [Environmental Science, Safety Science]
Discipline Code
08; 0830
Abstract
Ship detection and formation recognition in remote sensing images have attracted increasing attention. However, research remains challenging due to the arbitrary orientations, dense arrangements, and complex backgrounds of ships. To enhance the analysis of ship situations in channels, we model ships as key points and propose a context-aware DGCN-based ship formation recognition method. First, we develop a center-point-based ship detection subnetwork that employs depthwise separable convolution to reduce parameter redundancy and combines coordinate attention with an oriented response network to generate direction-invariant feature maps. The center point of each ship is predicted, and the offset, target scale, and angle are regressed to complete detection. Then, we cluster ships into groups based on the spatial similarity of their center points and apply Delaunay triangulation to establish the topological graph structure of each ship group. Finally, we design a context-aware Dense Graph Convolutional Network (DGCN) that operates on this graph structure to recognize formations. Experimental results on the HRSC2016 and SGF datasets demonstrate that the proposed method detects arbitrarily oriented ships and identifies formations, attaining state-of-the-art performance.
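To make the pipeline concrete, the sketch below illustrates the two graph-side steps the abstract names: building the ship-group topological graph from detected center points via Delaunay triangulation, and passing per-ship features through a densely connected graph-convolution stack. This is a minimal illustration assuming NumPy/SciPy only; the function names, feature dimensions, and dense-concatenation wiring are hypothetical, not the authors' published code.

```python
# Minimal sketch of the graph-side steps (NOT the authors' implementation):
# ship centers -> Delaunay graph -> densely connected graph convolutions.
import numpy as np
from scipy.spatial import Delaunay

def delaunay_adjacency(centers):
    """Connect every pair of ship center points that share an edge in the
    Delaunay triangulation, yielding a symmetric 0/1 adjacency matrix."""
    n = len(centers)
    adj = np.zeros((n, n))
    for simplex in Delaunay(centers).simplices:  # each simplex is a triangle
        for a in range(3):
            for b in range(a + 1, 3):
                i, j = simplex[a], simplex[b]
                adj[i, j] = adj[j, i] = 1.0
    return adj

def gcn_layer(adj, x, w):
    """One graph convolution: ReLU(D^-1/2 (A + I) D^-1/2 X W)."""
    a_hat = adj + np.eye(len(adj))                 # add self-loops
    d = np.diag(1.0 / np.sqrt(a_hat.sum(axis=1)))  # degree normalization
    return np.maximum(d @ a_hat @ d @ x @ w, 0.0)

def dense_gcn_forward(adj, x, weights):
    """Densely connected stack: every layer consumes the concatenation of the
    input and all previous layer outputs (assumed DGCN wiring)."""
    feats = [x]
    for w in weights:
        feats.append(gcn_layer(adj, np.concatenate(feats, axis=1), w))
    return np.concatenate(feats, axis=1)  # input to a formation classifier

# Toy usage: 12 detected ships with 2-D centers and 4-D per-ship features.
centers = np.random.rand(12, 2) * 100
x = np.random.rand(12, 4)
adj = delaunay_adjacency(centers)
weights = [np.random.randn(4, 8), np.random.randn(4 + 8, 8)]  # 4->8, 12->8
out = dense_gcn_forward(adj, x, weights)  # shape (12, 4 + 8 + 8)
```

In the paper, the DGCN additionally injects context-aware information, and a formation classifier would read out the stacked features (e.g., via pooling and a softmax head); the abstract does not specify those details, so they are omitted here.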
Pages: 24