ESNet: An Efficient Framework for Superpixel Segmentation

Cited by: 5
Authors
Xu, Sen [1 ,2 ]
Wei, Shikui [1 ,2 ]
Ruan, Tao [3 ,4 ]
Zhao, Yao [1 ,2 ]
Affiliations
[1] Beijing Jiaotong Univ, Inst Informat Sci, Beijing 100044, Peoples R China
[2] Beijing Key Lab Adv Informat Sci & Network Technol, Beijing 100044, Peoples R China
[3] Beijing Jiaotong Univ, Sch Mech Elect & Control Engn, Beijing 100044, Peoples R China
[4] Beijing Jiaotong Univ, Frontiers Sci Ctr Smart High Speed Railway Syst, Beijing 100044, Peoples R China
Keywords
Feature extraction; Generators; Image segmentation; Computer architecture; Clustering algorithms; Task analysis; Classification algorithms; Superpixel segmentation; deep clustering
DOI
10.1109/TCSVT.2023.3347402
Chinese Library Classification
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology]
Discipline Codes
0808; 0809
Abstract
Superpixel segmentation divides an image into mid-level regions to reduce the number of computational primitives for subsequent tasks. Among existing deep superpixel algorithms, two-stage approaches perform better but have high computational complexity, whereas FCN-style approaches cannot extract image features specific to the superpixel task. To combine the advantages of both types of methods, we propose a carefully designed framework, termed Efficient Superpixel Network (ESNet), that explicitly enhances the network's ability to describe clustering-friendly features while preserving a simple network structure. Concretely, ESNet addresses two concerns. First, meaningful features must be constructed for effective superpixel clustering; we therefore propose the Pyramid-gradient Superpixel Generator (PSG), which decouples ESNet into two joint parts: the feature extractor and the superpixel generator. Second, the superpixel generator is designed efficiently: it performs multi-scale sampling of input images and can work independently by replacing the introduced feature extractor with two initial convolutional layers. Extensive experiments show that our framework achieves state-of-the-art performance on multiple datasets and is 5.3x smaller at inference than the best existing one-stage FCN-based methods.
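To make the decoupled design concrete, below is a minimal PyTorch sketch of the two-part structure the abstract describes: a small feature extractor feeding a superpixel generator that fuses a multi-scale pyramid and predicts, for each pixel, a 9-way soft assignment to the neighboring superpixel grid cells (the convention used by FCN-style superpixel networks). All class names, channel widths, the 3-level pyramid, and the choice to build the pyramid over features rather than raw input images are illustrative assumptions; this is not the authors' implementation of the Pyramid-gradient Superpixel Generator.

import torch
import torch.nn as nn
import torch.nn.functional as F

def conv_block(in_ch, out_ch):
    # 3x3 convolution + ReLU, the only building block this sketch uses
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
        nn.ReLU(inplace=True),
    )

class FeatureExtractor(nn.Module):
    # Stand-in extractor made of two convolutional layers, matching the
    # standalone mode the abstract mentions (generator + two initial convs).
    def __init__(self, in_ch=3, feat_ch=32):
        super().__init__()
        self.net = nn.Sequential(conv_block(in_ch, feat_ch),
                                 conv_block(feat_ch, feat_ch))

    def forward(self, x):
        return self.net(x)

class SuperpixelGenerator(nn.Module):
    # Hypothetical generator: builds a 3-level pyramid, processes each
    # level, fuses them, and emits a per-pixel 9-way soft assignment over
    # the surrounding superpixel grid cells.
    def __init__(self, feat_ch=32, scales=(1, 2, 4)):
        super().__init__()
        self.scales = scales
        self.branches = nn.ModuleList(conv_block(feat_ch, feat_ch)
                                      for _ in scales)
        self.head = nn.Conv2d(feat_ch * len(scales), 9,
                              kernel_size=3, padding=1)

    def forward(self, feats):
        h, w = feats.shape[-2:]
        pyramid = []
        for s, branch in zip(self.scales, self.branches):
            x = F.avg_pool2d(feats, s) if s > 1 else feats  # downsample level
            x = branch(x)
            if s > 1:  # upsample back so all levels can be concatenated
                x = F.interpolate(x, size=(h, w), mode='bilinear',
                                  align_corners=False)
            pyramid.append(x)
        fused = torch.cat(pyramid, dim=1)
        return F.softmax(self.head(fused), dim=1)  # soft association map

class ESNetSketch(nn.Module):
    # Decoupled two-part framework: feature extractor -> superpixel generator.
    def __init__(self):
        super().__init__()
        self.extractor = FeatureExtractor()
        self.generator = SuperpixelGenerator()

    def forward(self, img):
        return self.generator(self.extractor(img))

if __name__ == "__main__":
    assoc = ESNetSketch()(torch.randn(1, 3, 208, 208))  # dummy RGB image
    print(assoc.shape)  # -> torch.Size([1, 9, 208, 208])

As in other FCN-style superpixel methods, the 9-channel softmax output is a pixel-to-cell association map that would be post-processed (e.g., by argmax over cells) into hard superpixel labels; swapping FeatureExtractor for a larger backbone recovers the two-part configuration, while keeping it at two convolutions mirrors the independent mode described above.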
Pages: 5389-5399 (11 pages)