Neural Collapse in Deep Linear Networks: From Balanced to Imbalanced Data

Cited: 0

Authors
Dang, Hien [1 ]
Tran, Tho [1 ]
Osher, Stanley [2 ]
Tran-The, Hung [3 ]
Ho, Nhat [4 ]
Nguyen, Tan [5 ]
Affiliations
[1] FPT Software AI Ctr, Ho Chi Minh City, Vietnam
[2] Univ Calif Los Angeles, Dept Math, Los Angeles, CA USA
[3] Deakin Univ, Appl Artificial Intelligence Inst, Geelong, Vic, Australia
[4] Univ Texas Austin, Dept Stat & Data Sci, Austin, TX USA
[5] Natl Univ Singapore, Dept Math, Singapore, Singapore
Keywords
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Modern deep neural networks have achieved impressive performance on tasks from image classification to natural language processing. Surprisingly, these complex systems with massive numbers of parameters exhibit the same structural properties in their last-layer features and classifiers across canonical datasets when trained to convergence. In particular, it has been observed that the last-layer features collapse to their class-means, and that those class-means are the vertices of a simplex Equiangular Tight Frame (ETF). This phenomenon is known as Neural Collapse (NC). Recent papers have theoretically shown that NC emerges in the global minimizers of training problems with the simplified "unconstrained feature model". In this context, we take a step further and prove that NC occurs in deep linear networks for the popular mean squared error (MSE) and cross-entropy (CE) losses, showing that global solutions exhibit NC properties across the linear layers. Furthermore, we extend our study to imbalanced data for the MSE loss and present the first geometric analysis of NC under the bias-free setting. Our results demonstrate the convergence of the last-layer features and classifiers to a geometry consisting of orthogonal vectors, whose lengths depend on the amount of data in their corresponding classes. Finally, we empirically validate our theoretical analyses on synthetic and practical network architectures in both balanced and imbalanced scenarios.
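As an illustration of the simplex ETF geometry the abstract refers to (this is a minimal NumPy sketch for the reader, not code from the paper; the class count `K` and feature dimension `d` are arbitrary choices), the K class-mean directions can be constructed as scaled, centered columns of a partial orthogonal matrix, after which their Gram matrix has unit diagonal and constant off-diagonal entries equal to -1/(K-1):

```python
import numpy as np

K = 4   # number of classes (illustrative choice)
d = 10  # feature dimension, with d >= K

rng = np.random.default_rng(0)
# Random partial orthogonal matrix P (d x K) via reduced QR decomposition
P, _ = np.linalg.qr(rng.standard_normal((d, K)))

# Simplex ETF: M = sqrt(K/(K-1)) * P @ (I_K - (1/K) * 11^T)
M = np.sqrt(K / (K - 1)) * P @ (np.eye(K) - np.ones((K, K)) / K)

# Gram matrix of the class means: unit diagonal (equal norms),
# off-diagonal entries all -1/(K-1) (equiangular, maximally separated)
G = M.T @ M
print(np.round(G, 4))
```

The equal pairwise angle of arccos(-1/(K-1)) between all class means is what makes this frame "equiangular tight"; the paper's imbalanced-data result replaces this structure with orthogonal vectors whose lengths depend on the class sizes.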
Pages: 75