Dual-path Processing Network for High-resolution Salient Object Detection

Cited by: 6
Authors
Wang, Jun [1 ,2 ]
Yang, Qingpeng [1 ]
Yang, Shangqin [1 ]
Chai, Xiuli [1 ,2 ]
Zhang, Wanjun [1 ,2 ]
Affiliations
[1] Henan Univ, Sch Artificial Intelligence, Kaifeng 475004, Peoples R China
[2] Henan Univ, Henan Key Lab Big Data Anal & Proc, Kaifeng 475004, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Deep learning; Salient object detection; High-resolution; Multiscale semantic feature; Boundary information;
DOI
10.1007/s10489-021-02971-6
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In the era of deep learning, salient object detection (SOD) has achieved outstanding performance. Nevertheless, little work has addressed the tension between high-quality saliency map segmentation and the computational cost of high-resolution input images (e.g., 1024x2048 pixels or more). To meet this challenge, we introduce a dual-path processing network (DPPNet) that detects and segments salient objects in high-resolution input images directly and efficiently; the network comprises a global context path and a spatial details path. Specifically, the global context path utilizes a multi-feature extraction and enhancement (MFEE) module to extract richer global multiscale semantic features with a large receptive field at a lower resolution. The spatial details path employs a boundary information guided (BIG) module to localize salient objects accurately and preserve local boundary information at a higher resolution. Guided by the BIG module, a feature fusion unit (FFU) is further employed to strengthen the spatial consistency of maps at different levels and boost the robustness of the network. Extensive evaluations on two high-resolution SOD datasets and four mainstream low-resolution SOD datasets indicate that the proposed method handles high-resolution input effectively and outperforms ten state-of-the-art comparison algorithms.
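The dual-path idea in the abstract (coarse global semantics at low resolution, boundary cues at full resolution, then fusion) can be illustrated with a minimal NumPy sketch. This is a hypothetical toy, not the authors' DPPNet: the `avg_pool` stand-in for the MFEE path, the gradient-magnitude stand-in for the BIG path, and the multiplicative fusion are all illustrative assumptions.

```python
import numpy as np

def avg_pool(x, k):
    """Downsample a 2-D array by k x k average pooling (stride k)."""
    h, w = x.shape
    return x[:h - h % k, :w - w % k].reshape(h // k, k, w // k, k).mean(axis=(1, 3))

def upsample(x, k):
    """Nearest-neighbour upsampling by factor k."""
    return np.repeat(np.repeat(x, k, axis=0), k, axis=1)

def dual_path_saliency(img, k=4):
    """Toy dual-path pass: a low-resolution global-context path and a
    full-resolution boundary path, fused back at input resolution."""
    # Global context path: coarse "semantics" at 1/k resolution.
    ctx = avg_pool(img, k)
    # Spatial details path: boundary magnitude at full resolution.
    gy, gx = np.gradient(img)
    boundary = np.hypot(gx, gy)
    # Fusion: restore context to full resolution, modulate by boundary strength.
    fused = upsample(ctx, k) * (1.0 + boundary)
    # Normalize to [0, 1] as a saliency-like map.
    return (fused - fused.min()) / (np.ptp(fused) + 1e-8)

img = np.zeros((64, 64))
img[20:44, 20:44] = 1.0           # a bright "salient object" on dark background
sal = dual_path_saliency(img)
print(sal.shape)                  # full input resolution is preserved
```

The key property the sketch demonstrates is that expensive context aggregation happens at 1/k of the input resolution while boundary information is computed once at full resolution, which is the efficiency argument the abstract makes for high-resolution inputs.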
Pages: 12034-12048
Number of pages: 15
Related Papers
50 records
  • [31] High-Resolution Polar Network for Object Detection in Remote Sensing Images
    He, Xu
    Ma, Shiping
    He, Linyuan
    Ru, Le
    IEEE GEOSCIENCE AND REMOTE SENSING LETTERS, 2022, 19
  • [33] Real-Time Dense Semantic Labeling with Dual-Path Framework for High-Resolution Remote Sensing Image
    Wang, Yuhao
    Chen, Chen
    Ding, Meng
    Li, Jiangyun
    REMOTE SENSING, 2019, 11 (24)
  • [34] Cross-scale resolution consistent network for salient object detection
    Huang, Xiaoyu
    Liu, Wei
    Li, Minghui
    Nie, Hangyu
    IET IMAGE PROCESSING, 2024, 18 (10) : 2788 - 2799
  • [35] Dual-branch mutual assistance network for salient object detection
    Yao, Zhaojian
    Wang, Luping
    INTERNATIONAL JOURNAL OF INTELLIGENT SYSTEMS, 2022, 37 (01) : 972 - 990
  • [36] Dual guidance enhanced network for light field salient object detection
    Liang, Yanhua
    Qin, Guihe
    Sun, Minghui
    Qin, Jun
    Yan, Jie
    Zhang, Zhonghan
    IMAGE AND VISION COMPUTING, 2022, 118
  • [37] Dual-Branch Feature Fusion Network for Salient Object Detection
    Song, Zhehan
    Xu, Zhihai
    Wang, Jing
    Feng, Huajun
    Li, Qi
    PHOTONICS, 2022, 9 (01)
  • [39] Progressive Dual-Attention Residual Network for Salient Object Detection
    Zhang, Liqian
    Zhang, Qing
    Zhao, Rui
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 2022, 32 (09) : 5902 - 5915
  • [40] SpermDet: Structure-Aware Network With Local Context Enhancement and Dual-Path Fusion for Object Detection in Sperm Images
    Zhang, Hongyu
    Hu, Zhujun
    Huang, Huaying
    Liu, Shuang
    Rao, Yunbo
    Wang, Qifei
    Ahmad, Naveed
    IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, 2025, 74