Multi-Level Cascade Sparse Representation Learning for Small Data Classification

Cited by: 0
|
Authors
Zhong, Wenyuan [1 ]
Li, Huaxiong [1 ]
Hu, Qinghua [2 ,3 ]
Gao, Yang [4 ]
Chen, Chunlin [1 ]
Affiliations
[1] Nanjing Univ, Dept Control Sci & Intelligence Engn, Nanjing 210093, Peoples R China
[2] Tianjin Univ, Sch Comp Sci & Technol, Tianjin 300072, Peoples R China
[3] Tianjin Univ, Tianjin Key Lab Cognit Comp & Applicat, Tianjin 300072, Peoples R China
[4] Nanjing Univ, Dept Comp Sci & Technol, Natl Key Lab Novel Software Technol, Nanjing 210023, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Deep cascade; sparse representation; face recognition; small data; FACE RECOGNITION; ILLUMINATION; REGRESSION; MODELS;
DOI
10.1109/TCSVT.2022.3222226
Chinese Library Classification (CLC)
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology];
Discipline codes
0808; 0809;
Abstract
Deep learning (DL) methods have recently captured much attention for image classification. However, such methods may yield suboptimal solutions on small-scale data due to the lack of training samples. Sparse representation stands out with its efficiency and interpretability, but its accuracy is less competitive. We develop a Multi-Level Cascade Sparse Representation (ML-CSR) learning method to combine both advantages when processing small-scale data. ML-CSR uses a pyramid structure to expand the training data size. It adopts two core modules, the Error-To-Feature (ETF) module and the Generate-Adaptive-Weight (GAW) module, to further improve precision. ML-CSR calculates inter-layer differences via the ETF module to increase sample diversity and obtains adaptive weights based on per-layer accuracy in the GAW module. This helps ML-CSR learn more discriminative features. State-of-the-art results on benchmark face databases validate the effectiveness of the proposed ML-CSR. Ablation experiments demonstrate that the proposed pyramid structure, ETF module, and GAW module each improve the performance of ML-CSR. The code is available at https://github.com/Zhongwenyuan98/ML-CSR.
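The abstract classifies via representation residuals, the building block that ML-CSR cascades. The sketch below is not the authors' ML-CSR implementation (see their repository for that); it is a minimal, hypothetical illustration of residual-based classification, using a ridge-regularized coder as a simple closed-form stand-in for the l1 sparse coder, with illustrative function names.

```python
import numpy as np

def represent(D, y, lam=0.01):
    """Code y over dictionary D (columns are training samples).

    Ridge-regularized least squares is used here as a closed-form
    stand-in for sparse coding: min_x ||y - D x||^2 + lam ||x||^2.
    """
    A = D.T @ D + lam * np.eye(D.shape[1])
    return np.linalg.solve(A, D.T @ y)

def classify(D, labels, y, lam=0.01):
    """Assign y to the class whose atoms best reconstruct it."""
    x = represent(D, y, lam)
    best_class, best_err = None, np.inf
    for c in np.unique(labels):
        mask = labels == c
        # Reconstruct y using only this class's atoms and coefficients.
        err = np.linalg.norm(y - D[:, mask] @ x[mask])
        if err < best_err:
            best_class, best_err = c, err
    return best_class
```

A cascade in the spirit of the abstract would then feed the reconstruction error of one layer as an additional feature to the next (ETF) and weight each layer's vote by its training accuracy (GAW); those steps are omitted here for brevity.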
Pages: 2451 - 2464
Page count: 14
Related Papers
50 records in total
  • [21] Multi-level sparse network lasso: Locally sparse learning with flexible sample clusters
    Fei, Luhuan
    Wang, Xinyi
    Wang, Jiankun
    Sun, Lu
    Zhang, Yuyao
    NEUROCOMPUTING, 2025, 635
  • [22] MRLR: Multi-level Representation Learning for Personalized Ranking in Recommendation
    Sun, Zhu
    Yang, Jie
    Zhang, Jie
    Bozzon, Alessandro
    Chen, Yu
    Xu, Chi
    PROCEEDINGS OF THE TWENTY-SIXTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2017, : 2807 - 2813
  • [23] Effects of a trophic cascade on a multi-level facilitation cascade
    Yakovis, Eugeniy
    Artemieva, Anna
    JOURNAL OF ANIMAL ECOLOGY, 2021, 90 (10) : 2462 - 2470
  • [24] Deep Sparse Representation Learning for Multi-class Image Classification
    Arya, Amit Soni
    Thakur, Shreyanshu
    Mukhopadhyay, Sushanta
    PATTERN RECOGNITION AND MACHINE INTELLIGENCE, PREMI 2023, 2023, 14301 : 218 - 227
  • [25] Multi-level Attentive Skin Lesion Learning for Melanoma Classification
    Wang, Xiaohong
    Huang, Weimin
    Lu, Zhongkang
    Huang, Su
    2021 43RD ANNUAL INTERNATIONAL CONFERENCE OF THE IEEE ENGINEERING IN MEDICINE & BIOLOGY SOCIETY (EMBC), 2021, : 3924 - 3927
  • [26] Multi-level graph learning network for hyperspectral image classification
    Wan, Sheng
    Pan, Shirui
    Zhong, Shengwei
    Yang, Jie
    Yang, Jian
    Zhan, Yibing
    Gong, Chen
    PATTERN RECOGNITION, 2022, 129
  • [27] Learning Multi-level Deep Representations for Image Emotion Classification
    Rao, Tianrong
    Li, Xiaoxu
    Xu, Min
    NEURAL PROCESSING LETTERS, 2020, 51 (03) : 2043 - 2061
  • [28] Discriminative learning by sparse representation for classification
    Zang, Fei
    Zhang, Jiangshe
    NEUROCOMPUTING, 2011, 74 (12-13) : 2176 - 2183
  • [29] Machine Learning Framework for Multi-Level Classification of Company Revenue
    Choi, Jung-Gu
    Ko, Inhwan
    Kim, Jeongjae
    Jeon, Yeseul
    Han, Sanghoon
    IEEE ACCESS, 2021, 9 : 96739 - 96750