A novel method for financial distress prediction based on sparse neural networks with L1/2 regularization

Cited: 0
Authors
Chen, Ying [1 ]
Guo, Jifeng [2 ]
Huang, Junqin [3 ]
Lin, Bin [3 ,4 ]
Affiliations
[1] South China Normal Univ, Int Business Coll, Guangzhou 510631, Peoples R China
[2] South China Univ Technol, Sch Comp Sci & Engn, Guangzhou 510641, Peoples R China
[3] Sun Yat Sen Univ, Sch Business, Guangzhou 510275, Peoples R China
[4] Guangdong Ind Polytech, Guangzhou 510300, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Financial distress prediction; Feature selection; Sparse neural networks; L-1/2 regularization; BANKRUPTCY PREDICTION; DISCRIMINANT-ANALYSIS; FIRMS; SELECTION; RATIOS; REGRESSION; ABILITY; RISK;
DOI
10.1007/s13042-022-01566-y
CLC Classification Number
TP18 [Theory of Artificial Intelligence];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
Corporate financial distress affects the interests of the enterprise and its stakeholders, so its accurate prediction is of great significance for avoiding heavy losses. Despite significant effort and progress in this field, existing prediction methods are either limited in the number of input variables they can handle or restricted to financial predictors. To alleviate these issues, both financial and non-financial variables are screened from existing accounting and finance theory for use as financial distress predictors. In addition, a novel method for financial distress prediction (FDP) based on sparse neural networks, named FDP-SNN, is proposed, in which the weights of the hidden layer are constrained with L-1/2 regularization to achieve sparsity, so that relevant and important predictors are selected and predictive accuracy improves; the induced sparsity also supports the interpretability of the model. The results show that non-financial variables, such as investor protection and governance structure, play a more important role in financial distress prediction than financial ones, especially as the forecast period grows longer. Compared with classic models proposed by prominent researchers in accounting and finance, the proposed model performs better in terms of accuracy, precision, and AUC.
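The record above only summarizes the method, so the following minimal sketch illustrates the core idea the abstract describes: a feedforward classifier whose input-to-hidden weights are penalized with an L1/2 term so that weights on irrelevant predictors shrink toward zero, which can then be read off as feature selection. This is not the authors' implementation; the network sizes, smoothing constant eps, regularization weight lam, and synthetic data are all illustrative assumptions.

```python
# Sketch of an L1/2-regularized sparse neural network for distress prediction.
import torch
import torch.nn as nn

class SparseFDPNet(nn.Module):
    def __init__(self, n_features: int, n_hidden: int = 32):
        super().__init__()
        self.hidden = nn.Linear(n_features, n_hidden)  # layer whose weights are sparsified
        self.out = nn.Linear(n_hidden, 1)               # distress logit

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.out(torch.relu(self.hidden(x)))

def l_half_penalty(w: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    # L1/2 regularization term: sum of |w|^(1/2); eps keeps the gradient finite at 0
    # (smoothed variants of L1/2 are common in the related work listed below).
    return (w.abs() + eps).sqrt().sum()

def training_step(model, x, y, optimizer, lam: float = 1e-3):
    optimizer.zero_grad()
    logits = model(x).squeeze(-1)
    loss = nn.functional.binary_cross_entropy_with_logits(logits, y)
    loss = loss + lam * l_half_penalty(model.hidden.weight)  # sparsity-inducing penalty
    loss.backward()
    optimizer.step()
    return loss.item()

if __name__ == "__main__":
    # Synthetic example: 30 candidate financial/non-financial predictors.
    torch.manual_seed(0)
    X, y = torch.randn(256, 30), torch.randint(0, 2, (256,)).float()
    model = SparseFDPNet(n_features=30)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for epoch in range(50):
        training_step(model, X, y, opt)
    # Predictor importance: column-wise magnitude of the sparsified weight matrix.
    importance = model.hidden.weight.abs().sum(dim=0)
    print(importance)
```

Predictors whose columns are driven (near) to zero by the penalty are treated as irrelevant, which is how a sparse first layer can double as a feature-selection mechanism.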
Pages: 2089-2103
Number of pages: 15
Related Papers
50 records in total
  • [21] Sparse possibilistic clustering with L1 regularization
    Inokuchi, Ryo
    Miyamoto, Sadaaki
    GRC: 2007 IEEE INTERNATIONAL CONFERENCE ON GRANULAR COMPUTING, PROCEEDINGS, 2007, : 442 - 445
  • [22] A New Conjugate Gradient Method with Smoothing L1/2 Regularization Based on a Modified Secant Equation for Training Neural Networks
    Li, Wenyu
    Liu, Yan
    Yang, Jie
    Wu, Wei
    NEURAL PROCESSING LETTERS, 2018, 48 (02) : 955 - 978
  • [23] A novel l1/2 sparse regression method for hyperspectral unmixing
    Sun, Le
    Wu, Zebin
    Xiao, Liang
    Liu, Jianjun
    Wei, Zhihui
    Dang, Fuxing
    INTERNATIONAL JOURNAL OF REMOTE SENSING, 2013, 34 (20) : 6983 - 7001
  • [24] Smooth group L1/2 regularization for input layer of feedforward neural networks
    Li, Feng
    Zurada, Jacek M.
    Wu, Wei
    NEUROCOMPUTING, 2018, 314 : 109 - 119
  • [25] Structured Pruning of Convolutional Neural Networks via L1 Regularization
    Yang, Chen
    Yang, Zhenghong
    Khattak, Abdul Mateen
    Yang, Liu
    Zhang, Wenxin
    Gao, Wanlin
    Wang, Minjuan
    IEEE ACCESS, 2019, 7 : 106385 - 106394
  • [26] SPARSE REPRESENTATION LEARNING OF DATA BY AUTOENCODERS WITH L1/2 REGULARIZATION
    Li, F.
    Zurada, J. M.
    Wu, W.
    NEURAL NETWORK WORLD, 2018, 28 (02) : 133 - 147
  • [27] A novel method for financial distress prediction based on sparse neural networks with L1/2 regularization
    Ying Chen
    Jifeng Guo
    Junqin Huang
    Bin Lin
    International Journal of Machine Learning and Cybernetics, 2022, 13 (7) : 2089 - 2103
  • [28] Sparse Gabor Time-Frequency Representation Based on l1/2-l2 Regularization
    Li, Rui
    Zhou, Jian
    CIRCUITS SYSTEMS AND SIGNAL PROCESSING, 2019, 38 (10) : 4700 - 4722
  • [29] l1/2,1 group sparse regularization for compressive sensing
    Liu, Shengcai
    Zhang, Jiangshe
    Liu, Junmin
    Yin, Qingyan
    SIGNAL IMAGE AND VIDEO PROCESSING, 2016, 10 (05) : 861 - 868
  • [30] Congestion Control of Wireless Sensor Networks based on L1/2 Regularization
    Jin, Xin
    Yang, Yang
    Ma, Jinrong
    Li, Zhenxing
    PROCEEDINGS OF THE 2019 31ST CHINESE CONTROL AND DECISION CONFERENCE (CCDC 2019), 2019, : 2436 - 2441