Output Layer Multiplication for Class Imbalance Problem in Convolutional Neural Networks

Cited: 0
Authors
Zhao Yang
Yuanxin Zhu
Tie Liu
Sai Zhao
Yunyan Wang
Dapeng Tao
Affiliations
[1] Guangzhou University,School of Mechanical and Electric Engineering
[2] Hubei University of Technology,School of Electrical and Electronic Engineering
[3] Yunnan University,School of Information Science and Engineering
Source
Neural Processing Letters | 2020, Vol. 52
Keywords
Convolutional neural networks; Imbalance learning; Output layer multiplication;
DOI: Not available
Abstract
Convolutional neural networks (CNNs) have demonstrated remarkable performance in the field of computer vision. However, they are prone to suffer from the class imbalance problem, in which the number of examples in some classes is significantly higher or lower than in others. Two main strategies are commonly used to handle the problem: dataset-level methods based on resampling and algorithm-level methods that modify existing learning frameworks. Most of these methods, however, require extra data resampling or elaborate algorithm design. In this work we present an effective yet extremely simple approach to the imbalance problem in CNNs trained with cross-entropy loss. Specifically, we multiply the output of the last layer of a CNN by a coefficient $\alpha > 1$. With this modification, the final loss function dynamically adjusts the contributions of examples from different classes during imbalanced training. Because of its simplicity, the proposed method can be applied to off-the-shelf models with little change. To demonstrate its effectiveness on the imbalance problem, we design three classification experiments of increasing complexity. The experimental results show that our approach improves the convergence rate during training and/or increases test accuracy.
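As a rough sketch of the idea described in the abstract (scaling the last-layer output by a fixed coefficient α > 1 before the standard cross-entropy loss), the following minimal PyTorch example shows one plausible way to wire this in. The class name ScaledOutputNet, the toy architecture, and the value alpha = 2.0 are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch, assuming a PyTorch-style setup: the final-layer logits are
# multiplied by a fixed coefficient alpha > 1 before cross-entropy is applied.
import torch
import torch.nn as nn

class ScaledOutputNet(nn.Module):
    """Toy CNN whose last-layer output is scaled by alpha (hypothetical example)."""
    def __init__(self, num_classes: int, alpha: float = 2.0):
        super().__init__()
        self.alpha = alpha  # alpha > 1, per the abstract
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.fc = nn.Linear(16, num_classes)

    def forward(self, x):
        z = self.features(x).flatten(1)      # (N, 16)
        logits = self.fc(z)                  # (N, num_classes)
        return self.alpha * logits           # output-layer multiplication

# Usage: the scaled logits feed the usual cross-entropy loss unchanged.
model = ScaledOutputNet(num_classes=10, alpha=2.0)
criterion = nn.CrossEntropyLoss()
images = torch.randn(8, 3, 32, 32)
labels = torch.randint(0, 10, (8,))
loss = criterion(model(images), labels)
loss.backward()
```

One reading of why this helps, consistent with the abstract's claim: scaling the logits sharpens the softmax, which shrinks the gradient contribution of confidently classified examples relative to hard ones, so the effective per-example weighting adjusts dynamically during imbalanced training. This interpretation is inferred from the abstract, not stated in it.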
Pages: 2637–2653
Page count: 16
Related Papers
50 items in total
  • [31] Using Generative Adversarial Networks for Handling Class Imbalance Problem
    Aydin, M. Asli
    [J]. 29TH IEEE CONFERENCE ON SIGNAL PROCESSING AND COMMUNICATIONS APPLICATIONS (SIU 2021), 2021,
  • [32] Do Convolutional Neural Networks Learn Class Hierarchy?
    Alsallakh, Bilal
    Jourabloo, Amin
    Ye, Mao
    Liu, Xiaoming
    Ren, Liu
    [J]. IEEE TRANSACTIONS ON VISUALIZATION AND COMPUTER GRAPHICS, 2018, 24 (01) : 152 - 162
  • [33] Learn class hierarchy using convolutional neural networks
    La Grassa, Riccardo
    Gallo, Ignazio
    Landro, Nicola
    [J]. APPLIED INTELLIGENCE, 2021, 51 (10) : 6622 - 6632
  • [34] Learn class hierarchy using convolutional neural networks
    Riccardo La Grassa
    Ignazio Gallo
    Nicola Landro
    [J]. Applied Intelligence, 2021, 51 : 6622 - 6632
  • [35] Binary Output Layer of Feedforward Neural Networks for Solving Multi-Class Classification Problems
    Yang, Sibo
    Zhang, Chao
    Wu, Wei
    [J]. IEEE ACCESS, 2019, 7 : 5085 - 5094
  • [36] Accelerating Convolutional Neural Networks by Exploiting the Sparsity of Output Activation
    Fan, Zhihua
    Li, Wenming
    Wang, Zhen
    Liu, Tianyu
    Wu, Haibin
    Liu, Yanhuan
    Wu, Meng
    Wu, Xinxin
    Ye, Xiaochun
    Fan, Dongrui
    Sun, Ninghui
    An, Xuejun
    [J]. IEEE TRANSACTIONS ON PARALLEL AND DISTRIBUTED SYSTEMS, 2023, 34 (12) : 3253 - 3265
  • [37] Global Entropy Pooling layer for Convolutional Neural Networks
    Filus, Katarzyna
    Domanska, Joanna
    [J]. NEUROCOMPUTING, 2023, 555
  • [38] ConvFusion: A Model for Layer Fusion in Convolutional Neural Networks
    Waeijen, Luc
    Sioutas, Savvas
    Peemen, Maurice
    Lindwer, Menno
    Corporaal, Henk
    [J]. IEEE ACCESS, 2021, 9 : 168245 - 168267
  • [39] Layer factor analysis in convolutional neural networks for explainability
    Lopez-Gonzalez, Clara I.
    Gomez-Silva, Maria J.
    Besada-Portas, Eva
    Pajares, Gonzalo
    [J]. APPLIED SOFT COMPUTING, 2024, 150
  • [40] RRL: Regional Rotate Layer in Convolutional Neural Networks
    Hao, Zongbo
    Zhang, Tao
    Chen, Mingwang
    Kaixu, Zou
[J]. THIRTY-SIXTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FOURTH CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE / THE TWELFTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2022: 826 - 833