A novel method for financial distress prediction based on sparse neural networks with L1/2 regularization

Cited: 0
Authors
Chen, Ying [1 ]
Guo, Jifeng [2 ]
Huang, Junqin [3 ]
Lin, Bin [3 ,4 ]
Affiliations
[1] South China Normal Univ, Int Business Coll, Guangzhou 510631, Peoples R China
[2] South China Univ Technol, Sch Comp Sci & Engn, Guangzhou 510641, Peoples R China
[3] Sun Yat Sen Univ, Sch Business, Guangzhou 510275, Peoples R China
[4] Guangdong Ind Polytech, Guangzhou 510300, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Financial distress prediction; Feature selection; Sparse neural networks; L-1/2 regularization; BANKRUPTCY PREDICTION; DISCRIMINANT-ANALYSIS; FIRMS; SELECTION; RATIOS; REGRESSION; ABILITY; RISK;
DOI
10.1007/s13042-022-01566-y
Chinese Library Classification
TP18 [Theory of Artificial Intelligence];
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Corporate financial distress affects the interests of both the enterprise and its stakeholders, so accurate prediction is of great significance for avoiding heavy losses. Despite considerable effort and progress in this field, existing prediction methods are either limited by the number of input variables or restricted to financial predictors. To alleviate these issues, both financial and non-financial variables are screened from existing accounting and finance theory and used as financial distress predictors. In addition, a novel method for financial distress prediction (FDP) based on sparse neural networks, named FDP-SNN, is proposed, in which the weights of the hidden layer are constrained with L-1/2 regularization to achieve sparsity, so that relevant and important predictors are selected and prediction accuracy is improved. The induced sparsity also supports the interpretability of the model. The results show that non-financial variables, such as investor protection and governance structure, play a more important role in financial distress prediction than financial ones, especially as the forecast period grows longer. Compared with classic models proposed by leading researchers in accounting and finance, the proposed model performs better in terms of accuracy, precision, and AUC.
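The abstract describes the core mechanism: the input-to-hidden weights of a feedforward network are penalized with the L-1/2 quasi-norm so that weights attached to uninformative predictors shrink toward zero, which simultaneously selects features and aids interpretability. The sketch below is only an illustrative reconstruction of that idea, not the authors' implementation or published code; the network shape, the epsilon smoothing of the penalty near zero, the penalty strength lam, and names such as SparseFDPNet and l_half_penalty are assumptions introduced here, and the data are synthetic.

import torch
import torch.nn as nn

class SparseFDPNet(nn.Module):
    # Hypothetical feedforward net: the first-layer weights are the ones
    # being sparsified, so near-zero columns mark de-selected predictors.
    def __init__(self, n_features, n_hidden=32):
        super().__init__()
        self.hidden = nn.Linear(n_features, n_hidden)
        self.out = nn.Linear(n_hidden, 1)

    def forward(self, x):
        return self.out(torch.relu(self.hidden(x)))

def l_half_penalty(weight, eps=1e-8):
    # L1/2 quasi-norm penalty: sum_i |w_i|^(1/2).
    # eps keeps the gradient finite near zero (a practical smoothing choice,
    # not taken from the paper).
    return torch.sqrt(weight.abs() + eps).sum()

def training_step(model, x, y, optimizer, lam=1e-3):
    # Binary cross-entropy for distressed (1) vs. healthy (0) firms,
    # plus the L1/2 penalty applied to the input-to-hidden weights only.
    optimizer.zero_grad()
    logits = model(x).squeeze(-1)
    loss = nn.functional.binary_cross_entropy_with_logits(logits, y)
    loss = loss + lam * l_half_penalty(model.hidden.weight)
    loss.backward()
    optimizer.step()
    return loss.item()

if __name__ == "__main__":
    torch.manual_seed(0)
    X = torch.randn(256, 40)                    # 40 candidate predictors (synthetic)
    y = (X[:, 0] + 0.5 * X[:, 1] > 0).float()   # toy distress label
    model = SparseFDPNet(n_features=40)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for epoch in range(200):
        training_step(model, X, y, opt)
    # Column-wise weight magnitudes indicate which predictors survive.
    importance = model.hidden.weight.abs().sum(dim=0)
    print(importance.topk(5).indices.tolist())

Under these assumptions, predictors whose columns of the hidden-layer weight matrix are driven to (near) zero can be read as de-selected, which is the interpretability benefit the abstract refers to.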
Pages: 2089-2103
Page count: 15
Related papers
50 records in total
  • [1] Sparse smooth group L0∘L1/2 regularization method for convolutional neural networks
    Quasdane, Mohamed
    Ramchoun, Hassan
    Masrour, Tawfik
    KNOWLEDGE-BASED SYSTEMS, 2024, 284
  • [2] Transformed l1 regularization for learning sparse deep neural networks
    Ma, Rongrong
    Miao, Jianyu
    Niu, Lingfeng
    Zhang, Peng
    NEURAL NETWORKS, 2019, 119 : 286 - 298
  • [3] Sparse SAR imaging based on L1/2 regularization
    Zeng, JinShan
    Science China Information Sciences, 2012, 55 (08): 1755 - 1775
  • [4] Sparse SAR imaging based on L1/2 regularization
    JinShan Zeng
    Jian Fang
    ZongBen Xu
    Science China Information Sciences, 2012, 55 : 1755 - 1775
  • [5] Compact Deep Neural Networks with l1,1 and l1,2 Regularization
    Ma, Rongrong
    Niu, Lingfeng
    2018 18TH IEEE INTERNATIONAL CONFERENCE ON DATA MINING WORKSHOPS (ICDMW), 2018, : 1248 - 1254
  • [6] A Simple Neural Network for Sparse Optimization With l1 Regularization
    Ma, Litao
    Bian, Wei
    IEEE TRANSACTIONS ON NETWORK SCIENCE AND ENGINEERING, 2021, 8 (04): : 3430 - 3442
  • [7] Sparse kernel logistic regression based on L1/2 regularization
    Xu, Chen
    Peng, ZhiMing
    Jing, WenFeng
    SCIENCE CHINA-INFORMATION SCIENCES, 2013, 56 (04) : 1 - 16
  • [8] Sparse kernel logistic regression based on L1/2 regularization
    Xu, Chen
    Peng, ZhiMing
    Jing, WenFeng
    Science China Information Sciences, 2013, 56 (04): 75 - 90
  • [9] Sparse Feature Grouping based on l1/2 Norm Regularization
    Mao, Wentao
    Xu, Wentao
    Li, Yuan
    2018 ANNUAL AMERICAN CONTROL CONFERENCE (ACC), 2018, : 1045 - 1051
  • [10] αl1 - βl2 regularization for sparse recovery
    Ding, Liang
    Han, Weimin
    INVERSE PROBLEMS, 2019, 35 (12)