Discriminative feature learning through feature distance loss

Cited by: 0
Authors
Schlagenhauf, Tobias [1 ]
Lin, Yiwen [1 ]
Noack, Benjamin [2 ]
Affiliations
[1] Wbk Inst Prod Sci, Karlsruhe Inst Technol, Karlsruhe, Germany
[2] Otto Von Guericke Univ, Inst Intelligent Cooperating Syst, Magdeburg, Germany
Keywords
Deep learning; Convolutional neural network; Feature fusion model; Distance function; Semantic feature concept;
DOI
10.1007/s00138-023-01379-1
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
Ensembles of convolutional neural networks have shown remarkable results in learning discriminative semantic features for image classification tasks. However, the models in the ensemble often concentrate on similar regions in images. This work proposes a novel method that forces a set of base models to learn different features for a classification task. These models are combined in an ensemble to make a collective classification. The key finding is that by forcing the models to concentrate on different features, the classification accuracy is increased. To learn different feature concepts, a so-called feature distance loss is implemented on the feature maps. The experiments on benchmark convolutional neural networks (VGG16, ResNet, AlexNet), popular datasets (Cifar10, Cifar100, miniImageNet, NEU, BSD, TEX), and different training samples (3, 5, 10, 20, 50, 100 per class) show the effectiveness of the proposed feature loss. The proposed method outperforms classical ensemble versions of the base models. The Class Activation Maps explicitly prove the ability to learn different feature concepts. The code is available at: https://github.com/2Obe/Feature-Distance-Loss.git.
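The abstract describes a loss applied to the base models' feature maps that penalizes overlap between the feature concepts the models attend to. The paper's exact formulation is in the linked repository; the following is only a minimal NumPy sketch of one plausible variant, assuming the "distance" is measured as pairwise cosine similarity between flattened feature maps (the function name and the similarity choice are illustrative, not the authors' definition):

```python
import numpy as np

def feature_distance_loss(feature_maps):
    """Hypothetical sketch of a diversity-encouraging feature loss.

    feature_maps: list of same-shaped arrays, one per base model.
    Returns the mean pairwise cosine similarity between flattened,
    L2-normalized feature maps. Minimizing this value pushes the
    base models toward mutually dissimilar feature concepts.
    """
    # Flatten each map and normalize to unit length.
    flats = [f.ravel().astype(float) for f in feature_maps]
    flats = [f / (np.linalg.norm(f) + 1e-8) for f in flats]
    # Cosine similarity for every unordered model pair.
    sims = [float(np.dot(flats[i], flats[j]))
            for i in range(len(flats))
            for j in range(i + 1, len(flats))]
    return float(np.mean(sims))
```

In a training loop, a term like this would be added to the classification loss of the ensemble, so that gradient descent simultaneously improves accuracy and decorrelates what the base models look at.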
Pages: 13
Related Papers
50 records total
  • [1] Discriminative feature learning through feature distance loss
    Tobias Schlagenhauf
    Yiwen Lin
    Benjamin Noack
    [J]. Machine Vision and Applications, 2023, 34
  • [2] Unsupervised Feature Learning Through Divergent Discriminative Feature Accumulation
    Szerlip, Paul A.
    Morse, Gregory
    Pugh, Justin K.
    Stanley, Kenneth O.
    [J]. PROCEEDINGS OF THE TWENTY-NINTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2015, : 2979 - 2985
  • [3] Recognizing Trees at a Distance with Discriminative Deep Feature Learning
    Zuo, Zhen
    Wang, Gang
    [J]. 2013 9TH INTERNATIONAL CONFERENCE ON INFORMATION, COMMUNICATIONS AND SIGNAL PROCESSING (ICICS), 2013,
  • [4] Fisher Loss: A More Discriminative Feature Learning Method in Classification
    Ye, Yuhang
    Zhang, Tong
    Yang, Chenguang
    [J]. 2019 IEEE/ASME INTERNATIONAL CONFERENCE ON ADVANCED INTELLIGENT MECHATRONICS (AIM), 2019, : 746 - 751
  • [5] Feature mining through distance minimization learning
    Thomson, J
    Gantenbein, R
    Nielson, T
    [J]. PROCEEDINGS OF THE ISCA 12TH INTERNATIONAL CONFERENCE INTELLIGENT AND ADAPTIVE SYSTEMS AND SOFTWARE ENGINEERING, 2003, : 110 - 113
  • [6] FACE ALIGNMENT BY DISCRIMINATIVE FEATURE LEARNING
    Chen, Weiliang
    Zhou, Qiang
    Hu, Haoji
    [J]. 2019 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP), 2019, : 2204 - 2208
  • [7] Unsupervised feature learning with discriminative encoder
    Pandey, Gaurav
    Dukkipati, Ambedkar
    [J]. 2017 17TH IEEE INTERNATIONAL CONFERENCE ON DATA MINING (ICDM), 2017, : 367 - 376
  • [8] Learning a discriminative feature for object detection based on feature fusing and context learning
    You Lei
    Wang Hongpeng
    Wang Yuan
    [J]. 2017 INTERNATIONAL CONFERENCE ON SECURITY, PATTERN ANALYSIS, AND CYBERNETICS (SPAC), 2017, : 543 - 547
  • [9] Learning from discriminative feature feedback
    Dasgupta, Sanjoy
    Dey, Akansha
    Roberts, Nicholas
    Sabato, Sivan
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31
  • [10] Sparse discriminative feature weights learning
    Yan, Hui
    Yang, Jian
    [J]. NEUROCOMPUTING, 2016, 173 : 1936 - 1942