Deep neural networks with L1 and L2 regularization for high dimensional corporate credit risk prediction

Cited: 28
Authors
Yang, Mei [1 ]
Lim, Ming K. [4 ]
Qu, Yingchi [1 ]
Li, Xingzhi [3 ]
Ni, Du [2 ]
Affiliations
[1] Chongqing Univ, Sch Econ & Business Adm, Chongqing 400030, Peoples R China
[2] Nanjing Univ Posts & Telecommun, Sch Management, Jiangsu 210003, Peoples R China
[3] Chongqing Jiaotong Univ, Sch Econ & Management, Chongqing 400074, Peoples R China
[4] Univ Glasgow, Adam Smith Business Sch, Glasgow G14 8QQ, Scotland
Keywords
High dimensional data; Credit risk; Deep neural network; Prediction; L1 regularization; SUPPORT VECTOR MACHINES; FEATURE-SELECTION; DECISION-MAKING; MODELS; CLASSIFICATION; SVM;
DOI
10.1016/j.eswa.2022.118873
Chinese Library Classification
TP18 [Theory of Artificial Intelligence]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Accurate credit risk prediction can help companies avoid bankruptcy and make adjustments ahead of time. In corporate credit risk prediction, there is a tendency to consider more and more features in the prediction system. However, this often introduces redundant and irrelevant information, which greatly impairs the performance of prediction algorithms. Therefore, this study proposes HDNN, an improved deep neural network (DNN) algorithm that can be used for high dimensional prediction of corporate credit risk. We first prove theoretically that L1 regularization has no regularization effect when added to the batch normalization layer of a DNN, a rule of thumb in industrial practice that had never been proved. In addition, we prove that adding an L2 constraint alongside the L1 regularization solves this issue. Finally, this study analyzes a case study of credit data with supply chain and network data to show the superiority of the HDNN algorithm in the high dimensional dataset scenario.
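The abstract's central claim, that L1 regularization alone has no regularizing effect on weights feeding a batch normalization layer, can be illustrated numerically: batch normalization is invariant to positive rescaling of the preceding linear layer's weights, so an L1 penalty can shrink those weights without changing the function the network computes. Below is a minimal NumPy sketch of this mechanism (my own illustration, not the paper's HDNN implementation; the simplified `batch_norm` omits BN's learnable scale and shift):

```python
import numpy as np

def batch_norm(z, eps=1e-8):
    # Core BN transform: normalize each feature over the batch.
    # (Learnable gamma/beta parameters are omitted for clarity.)
    mu = z.mean(axis=0)
    sigma = z.std(axis=0)
    return (z - mu) / (sigma + eps)

rng = np.random.default_rng(0)
X = rng.normal(size=(64, 10))   # batch of 64 samples, 10 input features
W = rng.normal(size=(10, 5))    # linear-layer weights feeding the BN layer

out_original = batch_norm(X @ W)
out_shrunk = batch_norm(X @ (0.1 * W))  # weights uniformly shrunk, as an
                                        # L1 penalty would drive them

# BN output is (near-)invariant to positive rescaling of W, so the L1
# penalty reduces ||W||_1 without constraining the network's function.
print(np.allclose(out_original, out_shrunk, atol=1e-5))  # → True
```

This scale invariance is why a standalone L1 penalty before BN fails to regularize; the paper's remedy of adding an L2 constraint removes the degree of freedom that lets the weights rescale freely.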
Pages: 9