CoMB-Deep: Composite Deep Learning-Based Pipeline for Classifying Childhood Medulloblastoma and Its Classes

Cited by: 31
Author
Attallah, Omneya [1 ]
Affiliation
[1] Arab Acad Sci Technol & Maritime Transport, Coll Engn & Technol, Dept Elect & Commun Engn, Alexandria, Egypt
Keywords
childhood medulloblastoma; histopathology; computer-aided diagnosis; convolutional neural network; long short term memory; FEATURE-SELECTION; BRAIN-TUMORS; IMAGE CLASSIFICATION; CURRENT MANAGEMENT; FEATURE-EXTRACTION; FUTURE; EPIDEMIOLOGY; RECOGNITION; INSIGHTS; SYSTEM;
DOI
10.3389/fninf.2021.663592
Chinese Library Classification
Q [Biological Sciences];
Discipline Codes
07; 0710; 09;
Abstract
Childhood medulloblastoma (MB) is a life-threatening malignant tumor affecting children all over the globe. It is believed to be the most common pediatric brain tumor causing death. Early and accurate classification of childhood MB and its classes is of great importance, helping doctors choose a suitable treatment and observation plan, avoid tumor progression, and lower death rates. The current gold standard for diagnosing MB is histopathology of biopsy samples. However, manual analysis of such images is complicated, costly, time-consuming, and highly dependent on the expertise and skill of pathologists, which can lead to inaccurate results. This study introduces a reliable computer-assisted pipeline called CoMB-Deep to automatically classify MB and its classes with high accuracy from histopathological images. The key challenge of the study is the scarcity of childhood MB datasets, especially for its four WHO-defined categories, and the small number of related studies. All relevant works were based on either deep learning (DL) or textural-analysis feature extraction, employed distinct features to accomplish the classification procedure, and mostly extracted only spatial features. CoMB-Deep, in contrast, blends the advantages of textural-analysis feature extraction techniques and DL approaches. It consists of a composite of DL techniques. Initially, it extracts deep spatial features from 10 convolutional neural networks (CNNs). It then performs a feature fusion step using the discrete wavelet transform (DWT), a texture analysis method capable of reducing the dimension of the fused features. Next, CoMB-Deep explores the best combination of fused features using two search strategies, enhancing the performance of the classification process. Afterward, it employs two feature selection techniques on the fused feature sets selected in the previous step.
Finally, a bi-directional long short-term memory (Bi-LSTM) network, a DL-based approach, is utilized for the classification phase. CoMB-Deep supports two classification categories: a binary category for distinguishing between abnormal and normal cases, and a multi-class category for identifying the subclasses of MB. The results for both classification categories show that CoMB-Deep is reliable. They also indicate that the feature sets selected using both search strategies enhanced the performance of the Bi-LSTM compared to individual spatial deep features. CoMB-Deep is compared to related studies to verify its competitiveness, and this comparison confirmed its robustness and superior performance. Hence, CoMB-Deep can help pathologists perform accurate diagnoses, reduce the misdiagnosis risk of manual analysis, accelerate the classification procedure, and decrease diagnosis costs.
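The DWT-based fusion step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name, the single-level Haar wavelet, and the toy feature shapes are assumptions. It shows how concatenated per-CNN feature vectors are roughly halved in dimension by keeping only the approximation coefficients:

```python
import numpy as np

def haar_fuse(feature_sets):
    """Concatenate per-CNN deep feature vectors, then keep only the
    single-level Haar DWT approximation coefficients, roughly halving
    the fused feature dimension."""
    fused = np.concatenate(feature_sets, axis=1)   # (n_samples, sum of f_i)
    if fused.shape[1] % 2:                         # pad to an even length
        fused = np.pad(fused, ((0, 0), (0, 1)))
    # Haar approximation: scaled pairwise sums of neighboring features
    return (fused[:, 0::2] + fused[:, 1::2]) / np.sqrt(2.0)

# Toy example: three "CNN" feature sets for 4 samples
rng = np.random.default_rng(0)
feats = [rng.normal(size=(4, 8)) for _ in range(3)]
reduced = haar_fuse(feats)
print(reduced.shape)  # (4, 12): 24 fused features halved by the DWT
```

In practice a wavelet library such as PyWavelets would be used, but the halving effect on the fused dimension is the same.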
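Likewise, the Bi-LSTM classification stage can be sketched hypothetically in PyTorch. The class name, hidden size, and the choice to feed each selected feature vector as a length-1 sequence are illustrative assumptions, not details from the paper:

```python
import torch
import torch.nn as nn

class BiLSTMClassifier(nn.Module):
    """Sketch of a Bi-LSTM classification stage: each selected fused
    feature vector is treated as a length-1 sequence."""
    def __init__(self, n_features, n_classes, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True,
                            bidirectional=True)
        self.head = nn.Linear(2 * hidden, n_classes)  # fwd + bwd states

    def forward(self, x):                   # x: (batch, n_features)
        out, _ = self.lstm(x.unsqueeze(1))  # (batch, 1, 2 * hidden)
        return self.head(out[:, -1])        # (batch, n_classes)

model = BiLSTMClassifier(n_features=128, n_classes=4)  # 4 WHO MB subtypes
logits = model(torch.randn(8, 128))
print(logits.shape)  # torch.Size([8, 4])
```

The binary category described in the abstract would use the same structure with `n_classes=2` (or a single sigmoid output).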
Pages: 19