Hybrid Explainable Artificial Intelligence Models for Targeted Metabolomics Analysis of Diabetic Retinopathy

Cited by: 3
Authors
Yagin, Fatma Hilal [1 ]
Colak, Cemil [1 ]
Algarni, Abdulmohsen [2 ]
Gormez, Yasin [3 ]
Guldogan, Emek [1 ]
Ardigo, Luca Paolo [4 ]
Affiliations
[1] Inonu Univ, Fac Med, Dept Biostat & Med Informat, TR-44280 Malatya, Turkiye
[2] King Khalid Univ, Dept Comp Sci, Abha 61421, Saudi Arabia
[3] Sivas Cumhuriyet Univ, Fac Econ & Adm Sci, Dept Management Informat Syst, TR-58140 Sivas, Turkiye
[4] NLA Univ Coll, Dept Teacher Educ, N-0166 Oslo, Norway
Keywords
diabetic retinopathy; targeted metabolomics; hybrid explainable artificial intelligence; explainable deep learning; biomarkers; INTERVENTIONS; COMPLICATIONS; DISEASE;
DOI
10.3390/diagnostics14131364
Chinese Library Classification
R5 [Internal Medicine];
Subject Classification Code
1002; 100201
Abstract
Background: Diabetic retinopathy (DR) is a prevalent microvascular complication of diabetes mellitus, and early detection is crucial for effective management. Metabolomics profiling has emerged as a promising approach for identifying potential biomarkers associated with DR progression. This study aimed to develop a hybrid explainable artificial intelligence (XAI) model for targeted metabolomics analysis of patients with DR, using a focused approach to identify specific metabolites whose concentrations differ among individuals with type 2 diabetes mellitus (T2DM) who have no DR (NDR), non-proliferative DR (NPDR), or proliferative DR (PDR). Methods: A total of 317 T2DM patients (143 NDR, 123 NPDR, and 51 PDR cases) were included in the study. Serum samples underwent targeted metabolomics analysis using liquid chromatography and mass spectrometry. Several machine learning models, including Support Vector Classifiers (SVC), Random Forest (RF), Decision Tree (DT), Logistic Regression (LR), and Multilayer Perceptrons (MLP), were implemented both as standalone models and in a two-stage hybrid ensemble approach. The models were trained and validated using 10-fold cross-validation. SHapley Additive exPlanations (SHAP) were used to interpret the contribution of each feature to the model predictions. Statistical analyses were conducted using the Shapiro-Wilk test for normality, the Kruskal-Wallis H test for group differences, and the Mann-Whitney U test with Bonferroni correction for post-hoc comparisons. Results: The hybrid SVC + MLP model achieved the highest performance, with an accuracy of 89.58%, a precision of 87.18%, an F1-score of 88.20%, and an F-beta score of 87.55%.
SHAP analysis revealed that glucose, glycine, and age were consistently important features across all DR classes, while creatinine and several phosphatidylcholines showed higher importance in the PDR class, suggesting their potential as biomarkers for severe DR. Conclusion: The hybrid XAI models, particularly the SVC + MLP ensemble, outperformed the standalone models in predicting DR progression. SHAP facilitated the interpretation of feature importance, providing valuable insights into the metabolic and physiological markers associated with different stages of DR. These findings highlight the potential of hybrid models combined with explainable AI techniques for early detection, targeted interventions, and personalized treatment strategies in DR management.
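The abstract does not give implementation details, so the following is only a minimal sketch of what a two-stage SVC + MLP hybrid with 10-fold cross-validation could look like in scikit-learn, using a synthetic stand-in for the targeted-metabolomics matrix (the authors' exact pipeline, features, and hyperparameters are not specified in the record):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import StackingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic stand-in for the metabolomics data:
# 317 patients, hypothetical metabolite features, 3 DR classes
# (NDR / NPDR / PDR), mirroring the cohort size in the study.
X, y = make_classification(n_samples=317, n_features=20,
                           n_informative=10, n_classes=3,
                           random_state=42)

# Two-stage hybrid: an SVC base learner whose out-of-fold class
# probabilities are fed to an MLP meta-learner (stacking).
hybrid = StackingClassifier(
    estimators=[("svc", make_pipeline(StandardScaler(),
                                      SVC(probability=True)))],
    final_estimator=MLPClassifier(max_iter=1000, random_state=42),
    cv=5,  # internal folds for generating the stage-one predictions
)

# 10-fold cross-validation, as described in the Methods section.
scores = cross_val_score(hybrid, X, y, cv=10)
print(f"mean accuracy over 10 folds: {scores.mean():.3f}")
```

For model interpretation, the paper applies SHAP to the fitted models; a model-agnostic explainer (e.g. one wrapping `hybrid.predict_proba`) would be the analogous step, but it is omitted here since the record gives no details of that setup.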
Pages: 15
Related Papers
50 records in total
  • [22] EyeArt artificial intelligence analysis of diabetic retinopathy in retinal screening events
    Vought, Rita
    Vought, Victoria
    Shah, Megh
    Szirth, Bernard
    Bhagat, Neelakshi
    INTERNATIONAL OPHTHALMOLOGY, 2023, 43 (12) : 4851 - 4859
  • [23] Non-targeted metabolomics and explainable artificial intelligence: Effects of processing and color on coniferyl aldehyde levels in Eucommiae cortex
    Pan, Yijing
    Ming, Kehong
    Guo, Dongmei
    Liu, Xinyue
    Deng, Chenxi
    Chi, Qingjia
    Liu, Xianqiong
    Wang, Chunli
    Xu, Kang
    FOOD CHEMISTRY, 2024, 460
  • [25] Explainable artificial intelligence models for mineral prospectivity mapping
    Zuo, Renguang
    Cheng, Qiuming
    Xu, Ying
    Yang, Fanfan
    Xiong, Yihui
    Wang, Ziye
    Kreuzer, Oliver P.
    SCIENCE CHINA-EARTH SCIENCES, 2024, 67 (09) : 2864 - 2875
  • [26] Explainable Artificial Intelligence (XAI) Surrogate Models for Chemical Process Design and Analysis
    Ko, Yuna
    Na, Jonggeol
    KOREAN CHEMICAL ENGINEERING RESEARCH, 2023, 61 (04): : 542 - 549
  • [27] eXplainable Artificial Intelligence (XAI) in aging clock models
    Kalyakulina, Alena
    Yusipov, Igor
    Moskalev, Alexey
    Franceschi, Claudio
    Ivanchenko, Mikhail
    AGEING RESEARCH REVIEWS, 2024, 93
  • [28] Deep learning models and the limits of explainable artificial intelligence
    Jens Christian Bjerring
    Jakob Mainz
    Lauritz Munch
    Asian Journal of Philosophy, 4 (1):
  • [30] A mental models approach for defining explainable artificial intelligence
    Merry, Michael
    Riddle, Pat
    Warren, Jim
    BMC MEDICAL INFORMATICS AND DECISION MAKING, 2021, 21 (01)