Risk Prediction of Diabetic Foot Amputation Using Machine Learning and Explainable Artificial Intelligence

Cited by: 2
Authors
Oei, Chien Wei [1,2]
Chan, Yam Meng [3]
Zhang, Xiaojin [1]
Leo, Kee Hao [1]
Yong, Enming [3]
Chong, Rhan Chaen [3]
Hong, Qiantai [3]
Zhang, Li [3]
Pan, Ying [3]
Tan, Glenn Wei Leong [3]
Mak, Malcolm Han Wen [3]
Affiliations
[1] Tan Tock Seng Hosp, Management Informat Dept, Off Clin Epidemiol Analyt & Knowledge, Singapore, Singapore
[2] Nanyang Technol Univ, Dept Mech & Aerosp Engn, Singapore, Singapore
[3] Tan Tock Seng Hosp, Dept Gen Surg, Vasc Surg Serv, Singapore 308433, Singapore
Keywords
diabetes; diabetic foot ulcer; lower extremity amputation; machine learning; model explainability; SHapley Additive exPlanations; wounds; MANAGEMENT; VARIANCE; SURVIVAL; MODELS; RATES;
DOI
10.1177/19322968241228606
Chinese Library Classification (CLC)
R5 [Internal Medicine]
Subject Classification Codes
1002; 100201
Abstract

Background: Diabetic foot ulcers (DFUs) are serious complications of diabetes that can lead to lower extremity amputations (LEAs). Risk prediction models can identify high-risk patients who may benefit from early intervention. Machine learning (ML) methods have shown promising utility in medical applications, and explainable modeling can support their integration and clinical acceptance. This study aims to develop an explainable ML risk prediction model for LEA in patients with DFU.

Methods: This study is a retrospective review of 2559 inpatient DFU episodes at a tertiary institution from 2012 to 2017. Fifty-one features, including patient demographics, comorbidities, medication, wound characteristics, and laboratory results, were reviewed. Outcome measures were the risk of major LEA, minor LEA, and any LEA. A machine learning model was developed for each outcome, with performance evaluated using the area under the receiver operating characteristic (ROC) curve, balanced accuracy, and F1-score. SHapley Additive exPlanations (SHAP) was applied to interpret the models for explainability.

Results: The models for major, minor, and any LEA achieved areas under the ROC curve of 0.820, 0.637, and 0.756, respectively, with XGBoost, XGBoost, and Gradient Boosted Trees giving the best results for the respective outcomes. Using SHAP, the key features contributing to each prediction were identified. Total white cell (TWC) count, comorbidity score, and red blood cell count carried the highest weight for the major LEA outcome; total white cell count, eosinophils, and necrotic eschar in the wound contributed most to the any-LEA outcome.

Conclusions: Machine learning algorithms performed well in predicting the risk of LEA in patients with DFU. Explainability can provide clinical insights and help identify at-risk patients for early intervention.
Pages: 15
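The modeling workflow described in the abstract, training a gradient-boosted classifier for each amputation outcome, evaluating it with ROC AUC, balanced accuracy, and F1-score, and interpreting it with SHAP, can be sketched as below. This is a minimal illustrative sketch, not the authors' code: the synthetic data, feature column names, and hyperparameters are assumptions, with columns loosely named after features highlighted in the results (total white cell count, comorbidity score, red blood cell count, necrotic eschar).

```python
# Illustrative sketch only (not the study's code): an XGBoost classifier for one
# binary LEA outcome, evaluated with ROC AUC, balanced accuracy, and F1-score,
# then interpreted with SHAP. Data, feature names, and hyperparameters are
# placeholder assumptions.
import numpy as np
import pandas as pd
import shap
from sklearn.metrics import balanced_accuracy_score, f1_score, roc_auc_score
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

rng = np.random.default_rng(0)
n = 2559  # episode count reported in the abstract; the values below are random placeholders
X = pd.DataFrame({
    "total_white_cell_count": rng.normal(9.0, 3.0, n),
    "comorbidity_score": rng.integers(0, 10, n),
    "red_blood_cell_count": rng.normal(4.5, 0.6, n),
    "necrotic_eschar": rng.integers(0, 2, n),
})
y = rng.integers(0, 2, n)  # stand-in for the major-LEA outcome label

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0
)

model = XGBClassifier(
    n_estimators=300, max_depth=4, learning_rate=0.05, eval_metric="logloss"
)
model.fit(X_train, y_train)

# Metrics matching those reported in the study.
proba = model.predict_proba(X_test)[:, 1]
pred = (proba >= 0.5).astype(int)
print("ROC AUC:", roc_auc_score(y_test, proba))
print("Balanced accuracy:", balanced_accuracy_score(y_test, pred))
print("F1 score:", f1_score(y_test, pred))

# SHAP values give per-feature contributions for each prediction;
# the mean absolute SHAP value ranks overall feature importance.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X_test)
importance = np.abs(shap_values).mean(axis=0)
print(pd.Series(importance, index=X.columns).sort_values(ascending=False))
```

The study compared several algorithms per outcome, with XGBoost and Gradient Boosted Trees performing best; the same evaluation and SHAP interpretation steps apply to whichever model is selected. TreeExplainer is used in the sketch because it computes exact SHAP values efficiently for tree ensembles such as XGBoost.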
Related Articles (50 in total)
  • [21] Alhazmi, Anwar; Alhazmi, Yaser; Makrami, Ali; Masmali, Amal; Salawi, Nourah; Masmali, Khulud; Patil, Shankargouda. Application of artificial intelligence and machine learning for prediction of oral cancer risk. Journal of Oral Pathology & Medicine, 2021, 50(5): 444-450.
  • [22] Dwivedi, Gaurav; Khandelwal, Monika; Rout, Ranjeet Kumar; Umer, Saiyed; Mallik, Saurav; Qin, Hong. RMSxAI: arginine methylation sites prediction from protein sequences using machine learning algorithms and explainable artificial intelligence. Discover Applied Sciences, 2024, 6(7).
  • [23] Dindorf, Carlo; Ludwig, Oliver; Simon, Steven; Becker, Stephan; Froehlich, Michael. Machine Learning and Explainable Artificial Intelligence Using Counterfactual Explanations for Evaluating Posture Parameters. Bioengineering (Basel), 2023, 10(5).
  • [24] Begum, Momotaz; Shuvo, Mehedi Hasan; Ashraf, Imran; Al Mamun, Abdullah; Uddin, Jia; Samad, Md Abdus. Software Defects Identification: Results Using Machine Learning and Explainable Artificial Intelligence Techniques. IEEE Access, 2023, 11: 132750-132765.
  • [25] Alderden, Jenny; Johnny, Jace; Brooks, Katie R.; Wilson, Andrew; Yap, Tracey L.; Zhao, Yunchuan; van der Laan, Mark; Kennerly, Susan. Explainable Artificial Intelligence for Early Prediction of Pressure Injury Risk. American Journal of Critical Care, 2024, 33(5): 373-381.
  • [26] Wang, Xuchun; Qiao, Yuchao; Cui, Yu; Ren, Hao; Zhao, Ying; Linghu, Liqin; Ren, Jiahui; Zhao, Zhiyang; Chen, Limin; Qiu, Lixia. An explainable artificial intelligence framework for risk prediction of COPD in smokers. BMC Public Health, 2023, 23(1).
  • [28] Rao, Ashwin; Haydel, Jasmine; Ma, Samuel; Nguyen-Wenker, Theresa; El-Serag, Hashem. Artificial Intelligence for Risk Prediction: A Simple, Explainable Machine Learning Model Predicts Incident Dysplasia or Malignancy in Barrett's Esophagus. American Journal of Gastroenterology, 2024, 119(10S): S404-S405.
  • [29] Abbas, Sagheer; Ahmed, Fahad; Khan, Wasim Ahmad; Ahmad, Munir; Khan, Muhammad Adnan; Ghazal, Taher M. Intelligent skin disease prediction system using transfer learning and explainable artificial intelligence. Scientific Reports, 2025, 15(1).
  • [30] Merkin, Alexander; Krishnamurthi, Rita; Medvedev, Oleg N. Machine learning, artificial intelligence and the prediction of dementia. Current Opinion in Psychiatry, 2022, 35(2): 123-129.