Dynamic Attention Loss for Small-Sample Image Classification

Cited: 0
Authors
Cao, Jie [1 ]
Qiu, Yinping [1 ]
Chang, Dongliang [2 ]
Li, Xiaoxu [1 ]
Ma, Zhanyu [2 ]
Affiliations
[1] Lanzhou Univ Technol, Lanzhou, Peoples R China
[2] Beijing Univ Posts & Telecommun, Beijing, Peoples R China
Funding
National Natural Science Foundation of China
DOI
N/A
CLC Classification
TP31 [Computer Software]
Discipline Codes
081202; 0835
Abstract
Convolutional Neural Networks (CNNs) have been successfully used in various image classification tasks and have gradually become one of the most powerful machine learning approaches. To improve model generalization and performance on small-sample image classification, a new trend is to learn discriminative features via CNNs. The idea of this paper is to decrease the confusion between categories in order to extract discriminative features and enlarge inter-class variance, especially for classes with indistinguishable features. In this paper, we propose a loss function termed Dynamic Attention Loss (DAL), which introduces a confusion rate-weighted soft label (target) as the controller of similarity measurement between categories, dynamically giving corresponding attention to samples, especially those classified wrongly, during the training process. Experimental results demonstrate that, compared with Cross-Entropy Loss and Focal Loss, the proposed DAL achieves better performance on the LabelMe and Caltech101 datasets.
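The abstract describes two ingredients: confusion rate-weighted soft targets and extra attention on misclassified samples. A minimal numpy sketch of a loss in that spirit is shown below; the exact redistribution rule, the `alpha` parameter, and the fixed 2x weight on misclassified samples are illustrative assumptions, not the paper's formulation.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the class axis.
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def dynamic_attention_loss(logits, labels, confusion, alpha=0.1):
    """Sketch of a DAL-style loss (weighting scheme is an assumption).

    logits:    (N, C) raw class scores
    labels:    (N,) integer ground-truth labels
    confusion: (C, C) estimated confusion rates between classes
    alpha:     label mass redistributed to confusable classes
    """
    n, c = logits.shape
    p = softmax(logits)

    # Confusion rate-weighted soft targets: keep (1 - alpha) on the true
    # class and spread alpha over the others in proportion to how often
    # the true class is confused with them.
    soft = np.zeros((n, c))
    for i, y in enumerate(labels):
        off = confusion[y].copy()
        off[y] = 0.0
        w = off / off.sum() if off.sum() > 0 else np.full(c, 1.0 / (c - 1))
        soft[i] = alpha * w
        soft[i, y] = 1.0 - alpha

    # Dynamic attention: upweight currently misclassified samples
    # (the 2x factor is an assumed choice for illustration).
    pred = p.argmax(axis=1)
    attn = np.where(pred == labels, 1.0, 2.0)

    ce = -(soft * np.log(p + 1e-12)).sum(axis=1)
    return (attn * ce).mean()
```

In practice the confusion matrix would be re-estimated from recent predictions during training, so the soft targets (and hence the attention given to easily confused class pairs) change dynamically from epoch to epoch.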
Pages: 75-79 (5 pages)
Related Papers
(50 total)
  • [31] Ranczar, Blaise; Hua, B. Jianping; Dougherty, Edward R. Is there correlation between the estimated and true classification errors in small-sample settings? 2007 IEEE/SP 14th Workshop on Statistical Signal Processing, Vols 1 and 2, 2007: 16+.
  • [32] Tingley, M.A. Small-sample intervals for regression. Canadian Journal of Statistics / Revue Canadienne de Statistique, 1992, 20(3): 271-280.
  • [33] Lehmann, E.L. Student and small-sample theory. Statistical Science, 1999, 14(4): 418-426.
  • [34] Wang, Ruonan; Fei, Jinlong; Zhang, Rongkai; Guo, Maohua; Qi, Zan; Li, Xue. DRnet: Dynamic Retraining for Malicious Traffic Small-Sample Incremental Learning. Electronics, 2023, 12(12).
  • [35] Liu, Feixiang; Dai, Yiru. Product Processing Quality Classification Model for Small-Sample and Imbalanced Data Environment. Computational Intelligence and Neuroscience, 2022.
  • [36] Mi, Jiaqi; Ma, Congcong; Zheng, Lihua; Zhang, Man; Li, Minzan; Wang, Minjuan. WGAN-CL: A Wasserstein GAN with confidence loss for small-sample augmentation. Expert Systems with Applications, 2023, 233.
  • [37] Lu, Haiping; Eng, How-Lung; Guan, Cuntai; Plataniotis, Konstantinos N.; Venetsanopoulos, Anastasios N. Regularized Common Spatial Pattern With Aggregation for EEG Classification in Small-Sample Setting. IEEE Transactions on Biomedical Engineering, 2010, 57(12): 2936-2946.
  • [38] Ye, Zhijing; Zhang, Liming; Zheng, Chengyong; Peng, Jiangtao; Benediktsson, Jon Atli. Small-Sample Classification for Hyperspectral Images With EPF-Based Smooth Ordering. IEEE Transactions on Geoscience and Remote Sensing, 2024, 62.
  • [39] Ma, Congcong; Mi, Jiaqi; Gao, Wanlin; Tao, Sha. SSGAN: A Semantic Similarity-Based GAN for Small-Sample Image Augmentation. Neural Processing Letters, 2024, 56(3).
  • [40] Zhu, Jiang; Wen, Cai; Duan, Chongdi; Wang, Weiwei; Yang, Xiaochao. Radar Moving Target Detection Based on Small-Sample Transfer Learning and Attention Mechanism. Remote Sensing, 2024, 16(22).