Large-Margin Softmax Loss for Convolutional Neural Networks

Cited by: 0
Authors
Liu, Weiyang [1 ]
Wen, Yandong [2 ]
Yu, Zhiding [3 ]
Yang, Meng [4 ]
Affiliations
[1] Peking Univ, Sch ECE, Beijing, Peoples R China
[2] South China Univ Technol, Sch EIE, Guangzhou, Peoples R China
[3] Carnegie Mellon Univ, Dept ECE, Pittsburgh, PA 15213 USA
[4] Shenzhen Univ, Coll CS & SE, Shenzhen, Peoples R China
Funding
U.S. National Science Foundation;
Keywords
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Theory of artificial intelligence];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Cross-entropy loss together with softmax is arguably one of the most commonly used supervision components in convolutional neural networks (CNNs). Despite its simplicity, popularity, and excellent performance, this component does not explicitly encourage discriminative learning of features. In this paper, we propose a generalized large-margin softmax (L-Softmax) loss which explicitly encourages intra-class compactness and inter-class separability between learned features. Moreover, the L-Softmax loss can not only adjust the desired margin but also help avoid overfitting. We also show that the L-Softmax loss can be optimized by typical stochastic gradient descent. Extensive experiments on four benchmark datasets demonstrate that the deeply learned features with the L-Softmax loss become more discriminative, hence significantly boosting performance on a variety of visual classification and verification tasks.
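The record does not reproduce the loss itself. For reference, a minimal LaTeX sketch of the per-sample L-Softmax objective, in the form commonly given for this paper, is shown below; the notation (margin parameter m, angle \theta_j between feature x_i and classifier weight W_j, label y_i) is assumed here, and the paper itself should be consulted for the exact definition.

% Per-sample L-Softmax loss (sketch; notation assumed as described above)
L_i = -\log \frac{ e^{\, \lVert W_{y_i} \rVert \, \lVert x_i \rVert \, \psi(\theta_{y_i}) } }
              { e^{\, \lVert W_{y_i} \rVert \, \lVert x_i \rVert \, \psi(\theta_{y_i}) } + \sum_{j \neq y_i} e^{\, \lVert W_j \rVert \, \lVert x_i \rVert \, \cos(\theta_j) } }

% Piecewise margin function with integer margin m
\psi(\theta) = (-1)^k \cos(m\theta) - 2k, \qquad \theta \in \left[ \tfrac{k\pi}{m}, \tfrac{(k+1)\pi}{m} \right], \quad k \in \{0, \dots, m-1\}

With m = 1, \psi(\theta) reduces to \cos(\theta) and the expression recovers the standard softmax cross-entropy loss; larger m enforces a larger angular margin between classes.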
Pages: 10