An explainable Artificial Intelligence software system for predicting diabetes

Cited by: 4
Authors
Srinivasu, Parvathaneni Naga [1 ,2 ]
Ahmed, Shakeel [3 ]
Hassaballah, Mahmoud [4 ,5 ]
Almusallam, Naif [6 ]
Affiliations
[1] Univ Fed Ceara, Dept Teleinformat Engn, BR-60455970 Fortaleza, Brazil
[2] Amrita Vishwa Vidyapeetham, Amrita Sch Comp, Amaravati 522503, Andhra Pradesh, India
[3] King Faisal Univ, Coll Comp Sci & Informat Technol, Dept Comp Sci, Al Hasa 31982, Saudi Arabia
[4] Prince Sattam Bin Abdulaziz Univ, Coll Comp Engn & Sci, Dept Comp Sci, Al Kharj 16278, Saudi Arabia
[5] South Valley Univ, Fac Comp & Informat, Dept Comp Sci, Qena, Egypt
[6] King Faisal Univ, Coll Business Adm, Dept Management Informat Syst, Al Hasa 31982, Saudi Arabia
Keywords
Convolutional neural networks; Bi-LSTM; Blood glucose levels; Spectrogram images; Hyperparameters; ROC curves; Diagnosis; Classifier; AI
DOI
10.1016/j.heliyon.2024.e36112
CLC Classification
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences]
Subject Classification Codes
07; 0710; 09
Abstract
Implementing diabetes surveillance systems is paramount to mitigating the risk of incurring substantial medical expenses. Currently, blood glucose is measured by minimally invasive methods that involve extracting a small blood sample and transmitting it to a blood glucose meter, a procedure that is uncomfortable for the individuals undergoing it. The present study introduces an Explainable Artificial Intelligence (XAI) system that aims to create an intelligible system capable of explaining its expected outcomes and decision models. To this end, abnormal glucose levels are analyzed using a Bi-directional Long Short-Term Memory (Bi-LSTM) network and a Convolutional Neural Network (CNN). Glucose levels are acquired through glucose oxidase (GOD) strips placed on the body, and the resulting signal data are converted into spectrogram images, which are classified into low, average, and abnormal glucose levels. The labeled spectrogram images are then used to train an individualized monitoring model, and the proposed system employs an XAI-driven architecture in its feature processing to track glucose levels in real time. The model's effectiveness is evaluated using standard metrics derived from the confusion matrix. The results demonstrate that the proposed model effectively identifies individuals with elevated glucose levels.
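Based on the pipeline outlined in the abstract (sensor signal, spectrogram conversion, CNN plus Bi-LSTM classification into three glucose classes), the following is a minimal sketch assuming a TensorFlow/Keras implementation. The function names, layer sizes, spectrogram resolution, and the way the CNN feature map is fed to the Bi-LSTM are illustrative assumptions, not the authors' exact architecture.

# Minimal sketch: glucose signal -> spectrogram image -> CNN + Bi-LSTM classifier.
# All sizes and hyperparameters below are assumptions for illustration only.
import numpy as np
from scipy import signal
import tensorflow as tf
from tensorflow.keras import layers, models

def to_spectrogram(glucose_signal, fs=1.0, size=(64, 64)):
    # Convert a 1-D sensor signal into a fixed-size log-spectrogram "image".
    _, _, sxx = signal.spectrogram(glucose_signal, fs=fs,
                                   nperseg=min(64, len(glucose_signal)))
    sxx = np.log1p(sxx)                                  # compress dynamic range
    sxx = tf.image.resize(sxx[..., np.newaxis], size)    # fixed height x width x 1
    return sxx.numpy()

def build_cnn_bilstm(input_shape=(64, 64, 1), n_classes=3):
    # CNN feature extractor followed by a Bi-LSTM over the feature-map rows.
    inputs = layers.Input(shape=input_shape)
    x = layers.Conv2D(32, 3, activation="relu", padding="same")(inputs)
    x = layers.MaxPooling2D(2)(x)
    x = layers.Conv2D(64, 3, activation="relu", padding="same")(x)
    x = layers.MaxPooling2D(2)(x)
    # Treat each row of the CNN feature map as one time step for the Bi-LSTM.
    x = layers.Reshape((x.shape[1], x.shape[2] * x.shape[3]))(x)
    x = layers.Bidirectional(layers.LSTM(64))(x)
    outputs = layers.Dense(n_classes, activation="softmax")(x)  # low / average / abnormal
    model = models.Model(inputs, outputs)
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_cnn_bilstm()
model.summary()

Feeding the CNN feature-map rows to the Bi-LSTM is one common way of coupling the two networks; the paper's exact coupling, hyperparameters, and XAI explanation step (for example, saliency- or attribution-based feature processing) are not specified in this record.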
Pages: 19