Predicting the diabetic foot in the population of type 2 diabetes mellitus from tongue images and clinical information using multi-modal deep learning

Cited: 1
Authors
Tian, Zhikui [1 ]
Wang, Dongjun [2 ]
Sun, Xuan [3 ]
Cui, Chuan [4 ]
Wang, Hongwu [5 ]
Affiliations
[1] Qilu Med Univ, Sch Rehabil Med, Zibo, Shandong, Peoples R China
[2] North China Univ Sci & Technol, Coll Tradit Chinese Med, Tangshan, Peoples R China
[3] Binzhou Med Univ, Coll Tradit Chinese Med, Yantai, Shandong, Peoples R China
[4] Qilu Med Univ, Sch Clin Med, Zibo, Shandong, Peoples R China
[5] Tianjin Univ Tradit Chinese Med, Sch Hlth Sci & Engn, Tianjin, Peoples R China
Keywords
diabetic foot; tongue features; objectified parameters; prediction model; machine learning; AMPUTATION; SKIN; PREVENTION; MANAGEMENT; HARDNESS; ULCER; LIFE;
DOI
10.3389/fphys.2024.1473659
Chinese Library Classification: Q4 [Physiology];
Discipline code: 071003;
Abstract
Aims: To establish a diabetic foot (DF) prediction model by combining objectified parameters of traditional Chinese medicine (TCM) and Western medicine, based on fused quantitative and qualitative data from both.
Methods: A ResNet-50 deep neural network (DNN) was used to extract deep features from tongue images, and a fully connected layer (FCL) was then used to fuse these with clinical features into an aggregate representation, yielding a non-invasive DF prediction model based on tongue features.
Results: Of the 391 patients included, 267 had DF; their BMI (25.2 vs. 24.2) and waist-to-hip ratio (0.953 vs. 0.941) were higher than those of the type 2 diabetes mellitus (T2DM) group. The durations of diabetes (15 vs. 8 years) and hypertension (10 vs. 7.5 years) in DF patients were significantly longer than in the T2DM group, and plantar hardness was higher in DF patients than in T2DM patients. The multi-modal DF prediction model reached an accuracy of 0.95 and a sensitivity of 0.9286.
Conclusion: We established a DF prediction model based on clinical features and objectified tongue color, which demonstrates the unique advantages and important role of objectified tongue images in DF risk prediction and further supports the scientific basis of TCM tongue diagnosis. By fusing qualitative and quantitative data, we combined tongue images with DF indicators into a multi-modal prediction model in which tongue images and objectified foot data can correct the subjectivity of prior knowledge. The successful feature-fusion diagnostic model demonstrates the practical clinical value of objectified tongue images. The model distinguished T2DM from DF well, and a comparison of the model with and without tongue images showed that the version incorporating tongue images performed better.
Pages: 14
Related papers (50 in total)
  • [41] Mining the interpretable prognostic features from pathological image of intrahepatic cholangiocarcinoma using multi-modal deep learning
    Ding, Guang-Yu
    Tan, Wei-Min
    Lin, You-Pei
    Ling, Yu
    Huang, Wen
    Zhang, Shu
    Shi, Jie-Yi
    Luo, Rong-Kui
    Ji, Yuan
    Wang, Xiao-Ying
    Zhou, Jian
    Fan, Jia
    Cai, Mu-Yan
    Yan, Bo
    Gao, Qiang
    BMC MEDICINE, 2024, 22 (01)
  • [42] Offshore Oil Slick Detection: From Photo-Interpreter to Explainable Multi-Modal Deep Learning Models Using SAR Images and Contextual Data
    Amri, Emna
    Dardouillet, Pierre
    Benoit, Alexandre
    Courteille, Hermann
    Bolon, Philippe
    Dubucq, Dominique
    Credoz, Anthony
    REMOTE SENSING, 2022, 14 (15)
  • [44] Predicting treatment response from longitudinal images using multi-task deep learning
    Jin, Cheng
    Yu, Heng
    Ke, Jia
    Ding, Peirong
    Yi, Yongju
    Jiang, Xiaofeng
    Duan, Xin
    Tang, Jinghua
    Chang, Daniel T.
    Wu, Xiaojian
    Gao, Feng
    Li, Ruijiang
    NATURE COMMUNICATIONS, 2021, 12 (01)
  • [45] Predicting progression to referable diabetic retinopathy from retinal images and screening data using deep learning
    Nderitu, Paul
    do Rio, Joan Nunez
    Webster, Laura
    Mann, Samantha
    Hopkins, David
    Cardoso, Jorge
    Modat, Marc
    Bergeles, Christos
    Jackson, Timothy L.
    INVESTIGATIVE OPHTHALMOLOGY & VISUAL SCIENCE, 2022, 63 (07)
  • [46] Multi-center study on predicting breast cancer lymph node status from core needle biopsy specimens using multi-modal and multi-instance deep learning
    Ding, Yan
    Yang, Fan
    Han, Mengxue
    Li, Chunhui
    Wang, Yanan
    Xu, Xin
    Zhao, Min
    Zhao, Meng
    Yue, Meng
    Deng, Huiyan
    Yang, Huichai
    Yao, Jianhua
    Liu, Yueping
    NPJ BREAST CANCER, 2023, 9 (01)
  • [48] Explainable Multi-Modal Deep Learning With Cross-Modal Attention for Diagnosis of Dyssynergic Defecation Using Abdominal X-Ray Images and Symptom Questionnaire
    Sangnark, Sirapob
    Rattanachaisit, Pakkapon
    Patcharatrakul, Tanisa
    Vateekul, Peerapon
    IEEE ACCESS, 2024, 12 : 78132 - 78147
  • [49] Automatic Quantification of Tumour Hypoxia From Multi-Modal Microscopy Images Using Weakly-Supervised Learning Methods
    Carneiro, Gustavo
    Peng, Tingying
    Bayer, Christine
    Navab, Nassir
    IEEE TRANSACTIONS ON MEDICAL IMAGING, 2017, 36 (07) : 1405 - 1417
  • [50] A diagnosis model for brain atrophy using deep learning and MRI of type 2 diabetes mellitus
    Syed, Saba Raoof
    Durai, M. A. Saleem
    FRONTIERS IN NEUROSCIENCE, 2023, 17