Multi-Level Cascade Sparse Representation Learning for Small Data Classification

Times Cited: 0
Authors
Zhong, Wenyuan [1 ]
Li, Huaxiong [1 ]
Hu, Qinghua [2 ,3 ]
Gao, Yang [4 ]
Chen, Chunlin [1 ]
Affiliations
[1] Nanjing Univ, Dept Control Sci & Intelligence Engn, Nanjing 210093, Peoples R China
[2] Tianjin Univ, Sch Comp Sci & Technol, Tianjin 300072, Peoples R China
[3] Tianjin Univ, Tianjin Key Lab Cognit Comp & Applicat, Tianjin 300072, Peoples R China
[4] Nanjing Univ, Dept Comp Sci & Technol, Natl Key Lab Novel Software Technol, Nanjing 210023, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Deep cascade; sparse representation; face recognition; small data; FACE RECOGNITION; ILLUMINATION; REGRESSION; MODELS;
DOI
10.1109/TCSVT.2022.3222226
Chinese Library Classification
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology];
Discipline Classification Code
0808; 0809;
Abstract
Deep learning (DL) methods have recently attracted much attention for image classification. However, such methods may converge to suboptimal solutions on small-scale data because of the lack of training samples. Sparse representation stands out for its efficiency and interpretability, but its accuracy is less competitive. We develop a Multi-Level Cascade Sparse Representation (ML-CSR) learning method that combines both advantages when processing small-scale data. ML-CSR uses a pyramid structure to expand the training data size and adopts two core modules, the Error-To-Feature (ETF) module and the Generate-Adaptive-Weight (GAW) module, to further improve accuracy. ML-CSR calculates inter-layer differences in the ETF module to increase sample diversity and obtains adaptive weights based on per-layer accuracy in the GAW module, which helps it learn more discriminative features. State-of-the-art results on benchmark face databases validate the effectiveness of the proposed ML-CSR, and ablation experiments demonstrate that the pyramid structure, the ETF module, and the GAW module each improve its performance. The code is available at https://github.com/Zhongwenyuan98/ML-CSR.
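The sketch below illustrates, in simplified form, the cascade idea described in the abstract: each layer is a sparse-representation classifier, the reconstruction residual of one layer is reused as the input of the next (the ETF idea), and layer predictions are fused with weights proportional to per-layer accuracy (the GAW idea). This is a minimal reconstruction based only on the abstract; the function names, the two-layer setup, and the Lasso solver are illustrative assumptions, not the authors' released code (see the GitHub link above for that).

```python
# Minimal two-layer cascade sparse-representation sketch, assuming an
# SRC-style classifier per layer; all names and choices here are illustrative.
import numpy as np
from sklearn.linear_model import Lasso

def sparse_code(D, x, alpha=0.01):
    """l1-regularized sparse code of sample x over dictionary D (columns = training atoms)."""
    coder = Lasso(alpha=alpha, max_iter=5000)
    coder.fit(D, x)          # solves x ~= D @ code with an l1 penalty on code
    return coder.coef_

def src_layer(D, labels, X, alpha=0.01):
    """One sparse-representation classification layer.

    Returns predicted labels for the columns of X and the per-sample residual
    x - D @ code; the residual is what an ETF-style step feeds to the next layer.
    """
    classes = np.unique(labels)
    preds, residuals = [], []
    for x in X.T:
        code = sparse_code(D, x, alpha)
        class_err = [np.linalg.norm(x - D[:, labels == c] @ code[labels == c])
                     for c in classes]
        preds.append(classes[int(np.argmin(class_err))])
        residuals.append(x - D @ code)   # inter-layer difference (error-to-feature)
    return np.array(preds), np.array(residuals).T

def gaw_weights(layer_preds, y_val):
    """Adaptive fusion weights proportional to each layer's held-out accuracy."""
    acc = np.array([(p == y_val).mean() for p in layer_preds]) + 1e-12
    return acc / acc.sum()

def weighted_vote(layer_preds, weights, classes):
    """Fuse per-layer predictions by accuracy-weighted voting."""
    scores = np.zeros((layer_preds[0].size, classes.size))
    for preds, w in zip(layer_preds, weights):
        scores += w * (preds[:, None] == classes[None, :])
    return classes[np.argmax(scores, axis=1)]
```

In this simplified flow, a first dictionary is built from raw training features, a second dictionary from the training residuals produced by the first layer, and test samples are classified by both layers before the accuracy-weighted vote; the published method additionally uses a pyramid structure over multiple image scales, which is omitted here for brevity.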
Pages: 2451 - 2464
Number of pages: 14
Related Papers
50 records in total
  • [31] Specific Emitter Identification Based on Multi-Level Sparse Representation in Automatic Identification System
    Qian, Yunhan
    Qi, Jie
    Kuai, Xiaoyan
    Han, Guangjie
    Sun, Haixin
    Hong, Shaohua
    IEEE TRANSACTIONS ON INFORMATION FORENSICS AND SECURITY, 2021, 16 : 2872 - 2884
  • [32] Coupling importance sampling neural network for imbalanced data classification with multi-level learning bias
    Huang, Zhan ao
    Xiao, Wei
    Yang, Zhipeng
    Li, Xiaojie
    Wu, Xi
    NEUROCOMPUTING, 2025, 623
  • [33] Supervised learning of sparse context reconstruction coefficients for data representation and classification
    Liu, Xuejie
    Wang, Jingbin
    Yin, Ming
    Edwards, Benjamin
    Xu, Peijuan
    NEURAL COMPUTING & APPLICATIONS, 2017, 28 (01) : 135 - 143
  • [35] Multi-level classification of literacy of educators using PIAAC data
    Yalcin, Seher
    RESEARCH PAPERS IN EDUCATION, 2022, 37 (03) : 441 - 456
  • [36] Classification of medical data with a robust multi-level combination scheme
    Tsirogiannis, GL
    Frossyniotis, D
    Stoitsis, J
    Golemati, S
    Stafylopatis, A
    Nikita, KS
    2004 IEEE INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, VOLS 1-4, PROCEEDINGS, 2004, : 2483 - 2487
  • [37] Multi-level Connection Enhanced Representation Learning for Script Event Prediction
    Wang, Lihong
    Yue, Juwei
    Guo, Shu
    Sheng, Jiawei
    Mao, Qianren
    Chen, Zhenyu
    Zhong, Shenghai
    Li, Chen
    PROCEEDINGS OF THE WORLD WIDE WEB CONFERENCE 2021 (WWW 2021), 2021, : 3524 - 3533
  • [38] Chinese Named Entity Recognition Based on Multi-Level Representation Learning
    Li, Weijun
    Ding, Jianping
    Liu, Shixia
    Liu, Xueyang
    Su, Yilei
    Wang, Ziyi
    APPLIED SCIENCES-BASEL, 2024, 14 (19):
  • [39] Representation Learning With Multi-Level Attention for Activity Trajectory Similarity Computation
    Liu, An
    Zhang, Yifan
    Zhang, Xiangliang
    Liu, Guanfeng
    Zhang, Yanan
    Li, Zhixu
    Zhao, Lei
    Li, Qing
    Zhou, Xiaofang
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2022, 34 (05) : 2387 - 2400
  • [40] Efficient Fair Graph Representation Learning Using a Multi-level Framework
    He, Yuntian
    Gurukar, Saket
    Parthasarathy, Srinivasan
    COMPANION OF THE WORLD WIDE WEB CONFERENCE, WWW 2023, 2023, : 298 - 301