Deep neural networks with L1 and L2 regularization for high dimensional corporate credit risk prediction

Cited by: 28
Authors:
Yang, Mei [1 ]
Lim, Ming K. [4 ]
Qu, Yingchi [1 ]
Li, Xingzhi [3 ]
Ni, Du [2 ]
Affiliations:
[1] Chongqing Univ, Sch Econ & Business Adm, Chongqing 400030, Peoples R China
[2] Nanjing Univ Posts & Telecommun, Sch Management, Jiangsu 210003, Peoples R China
[3] Chongqing Jiaotong Univ, Sch Econ & Management, Chongqing 400074, Peoples R China
[4] Univ Glasgow, Adam Smith Business Sch, Glasgow G14 8QQ, Scotland
Keywords:
High dimensional data; Credit risk; Deep neural network; Prediction; L1 regularization; SUPPORT VECTOR MACHINES; FEATURE-SELECTION; DECISION-MAKING; MODELS; CLASSIFICATION; SVM;
DOI:
10.1016/j.eswa.2022.118873
CLC Classification:
TP18 [Artificial Intelligence Theory];
Discipline Codes:
081104 ; 0812 ; 0835 ; 1405 ;
Abstract:
Accurate credit risk prediction can help companies avoid bankruptcy and make adjustments ahead of time. There is a tendency in corporate credit risk prediction for more and more features to be considered in the prediction system. However, this often introduces redundant and irrelevant information that greatly impairs the performance of prediction algorithms. Therefore, this study proposes HDNN, an improved deep neural network (DNN) algorithm that can be used for high dimensional prediction of corporate credit risk. We first proved theoretically that L1 regularization has no regularization effect when it is applied to a layer followed by batch normalization in a DNN, a rule that was implicit in industrial practice but had never been proved. In addition, we proved that adding an L2 constraint on top of the L1 regularization solves this issue. Finally, this study analyzed a case study of credit data with supply chain and network data to show the superiority of the HDNN algorithm in the scenario of a high dimensional dataset.
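The core claim of the abstract, that L1 regularization alone is ineffective on weights feeding a batch normalization layer, can be illustrated with a minimal NumPy sketch. This is not the authors' HDNN implementation; it only demonstrates the underlying scale-invariance argument: batch normalization cancels any positive rescaling of the incoming weights, so the L1 penalty can be shrunk arbitrarily without changing the network's function.

```python
import numpy as np

def batch_norm(z, eps=1e-8):
    # Standardize each feature over the batch (learned scale/shift omitted
    # for clarity; they would not change the invariance argument).
    return (z - z.mean(axis=0)) / (z.std(axis=0) + eps)

rng = np.random.default_rng(0)
X = rng.normal(size=(64, 10))   # a batch of 64 samples, 10 input features
W = rng.normal(size=(10, 5))    # weights of one hypothetical dense layer

out_full = batch_norm(X @ W)
out_shrunk = batch_norm(X @ (0.01 * W))  # shrink the weights 100x

# The post-normalization activations are (numerically) identical, while the
# L1 penalty ||W||_1 has dropped by the same factor of 100: L1 alone exerts
# no real pressure on the function the network computes. An added L2 term
# pins the weight scale and restores the regularization effect.
assert np.allclose(out_full, out_shrunk, atol=1e-5)
print("max abs output diff:", np.abs(out_full - out_shrunk).max())
print("L1 penalty, original weights:", np.abs(W).sum())
print("L1 penalty, shrunken weights:", np.abs(0.01 * W).sum())
```

The same check fails for a layer without batch normalization, since there the output scales with the weights, which is why the paper's fix couples L2 with L1 only where normalization makes L1 degenerate.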
Pages: 9