Cost-sensitive stacking ensemble learning for company financial distress prediction

Cited by: 0
Authors
Wang S. [1]
Chi G. [1]
Affiliations
[1] School of Economics and Management, Dalian University of Technology, No. 2 Linggong Road, Dalian, Liaoning Province, China
Funding
National Natural Science Foundation of China
Keywords
Cost-sensitive; Ensemble learning; Financial distress prediction; Stacking;
DOI
10.1016/j.eswa.2024.124525
Abstract
Financial distress prediction (FDP) is a topic that has received wide attention in the finance sector and the data mining field. Combining cost-sensitive learning with classification models to address the FDP problem has attracted intense interest. However, few studies have combined cost-sensitive learning with Stacking to predict financial distress. In this article, a cost-sensitive learning method for FDP, namely cost-sensitive stacking (CSStacking), is put forward. In this work, a two-phase feature selection method is used to select the optimal feature subset. A CSStacking ensemble model is developed with the selected features to make the final prediction. The paired t-test and the non-parametric Wilcoxon test are employed to check the significance of the differences between CSStacking and the benchmark models. An experiment on a Chinese listed-company dataset is designed to investigate the effectiveness of CSStacking. The experimental results show that CSStacking can forecast listed companies' financial distress five years ahead and improves the identification rate of financially distressed companies, highlighting its potential to reduce the economic losses caused by misclassifying financially distressed companies. Comparisons with four types of benchmark models show that CSStacking performs significantly better than all of them. Furthermore, the findings illustrate that "asset-liability ratio", "current ratio", "quick ratio", and "industry prosperity index" are critical variables for predicting the financial distress of Chinese listed companies. © 2024 Elsevier Ltd
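The core idea described in the abstract, a stacking ensemble whose base and meta learners penalize missing a distressed firm more heavily than a false alarm, can be sketched as follows. The paper's actual base learners, meta-learner, feature-selection procedure, and misclassification-cost ratio are not given in the abstract, so the choices below (random forest and decision tree base learners, a logistic-regression meta-learner, and a 5:1 cost on missed distressed firms via `class_weight`) are illustrative assumptions only.

```python
# Minimal sketch of a cost-sensitive stacking classifier using scikit-learn.
# Class 1 stands for "financially distressed" (the costly minority class).
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import recall_score
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Imbalanced toy data standing in for a listed-company dataset.
X, y = make_classification(n_samples=1000, n_features=20,
                           weights=[0.9, 0.1], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# Assumed cost structure: misclassifying a distressed firm (class 1)
# is treated as 5x as costly as misclassifying a healthy one.
cost = {0: 1, 1: 5}

base_learners = [
    ("rf", RandomForestClassifier(class_weight=cost, random_state=0)),
    ("dt", DecisionTreeClassifier(class_weight=cost, random_state=0)),
]
model = StackingClassifier(
    estimators=base_learners,
    final_estimator=LogisticRegression(class_weight=cost, max_iter=1000),
    cv=5,  # out-of-fold base-learner predictions train the meta-learner
)
model.fit(X_tr, y_tr)

# Recall on the distressed class is the quantity cost-sensitivity targets.
print(recall_score(y_te, model.predict(X_te)))
```

The `cv=5` argument makes `StackingClassifier` feed the meta-learner out-of-fold predictions, which is the standard guard against the meta-learner overfitting to base-learner training error; the cost weights simply reweight each learner's loss toward the minority class.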
Related papers
50 items
  • [41] Adversarial Learning With Cost-Sensitive Classes
    Shen, Haojing
    Chen, Sihong
    Wang, Ran
    Wang, Xizhao
    IEEE TRANSACTIONS ON CYBERNETICS, 2023, 53 (08) : 4855 - 4866
  • [42] Cost-Sensitive Decision Tree Learning
    Vadera, Sunil
    PROCEEDINGS 2019 AMITY INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE (AICAI), 2019, : 4 - 5
  • [43] Cost-sensitive positive and unlabeled learning
    Chen, Xiuhua
    Gong, Chen
    Yang, Jian
    INFORMATION SCIENCES, 2021, 558 : 229 - 245
  • [44] Robust SVM for Cost-Sensitive Learning
    Gan, Jiangzhang
    Li, Jiaye
    Xie, Yangcai
    NEURAL PROCESSING LETTERS, 2022, 54 (04) : 2737 - 2758
  • [45] Cost-Sensitive Action Model Learning
    Rao, Dongning
    Jiang, Zhihua
    INTERNATIONAL JOURNAL OF UNCERTAINTY FUZZINESS AND KNOWLEDGE-BASED SYSTEMS, 2016, 24 (02) : 167 - 193
  • [46] Cost-sensitive selection of variables by ensemble of model sequences
    Yan, Donghui
    Qin, Zhiwei
    Gu, Songxiang
    Xu, Haiping
    Shao, Ming
    KNOWLEDGE AND INFORMATION SYSTEMS, 2021, 63 (05) : 1069 - 1092
  • [47] Cost-sensitive ensemble classification algorithm for medical image
    Zhang, Minghui
    Pan, Haiwei
    Zhang, Niu
    Xie, Xiaoqin
    Zhang, Zhiqiang
    Feng, Xiaoning
    INTERNATIONAL JOURNAL OF COMPUTATIONAL SCIENCE AND ENGINEERING, 2018, 16 (03) : 282 - 288
  • [48] Roulette sampling for cost-sensitive learning
    Sheng, Victor S.
    Ling, Charles X.
    MACHINE LEARNING: ECML 2007, PROCEEDINGS, 2007, 4701 : 724 - +
  • [50] Cost-sensitive learning for defect escalation
    Sheng, Victor S.
    Gu, Bin
    Fang, Wei
    Wu, Jian
    KNOWLEDGE-BASED SYSTEMS, 2014, 66 : 146 - 155