A Study on Automatic O-RADS Classification of Sonograms of Ovarian Adnexal Lesions Based on Deep Convolutional Neural Networks

Cited: 0
Authors
Liu, Tao [1 ]
Miao, Kuo [1 ]
Tan, Gaoqiang [1 ]
Bu, Hanqi [1 ]
Shao, Xiaohui [1 ]
Wang, Siming [1 ]
Dong, Xiaoqiu [1 ]
Affiliations
[1] Department of Ultrasound Medicine, Harbin Medical University Fourth Affiliated Hospital, Harbin, Heilongjiang, China
Source
Ultrasound in Medicine and Biology | 2025, Vol. 51, No. 2
Keywords
Diagnosis; Sonochemistry; Ultrasonic applications
DOI
10.1016/j.ultrasmedbio.2024.11.009
Abstract
Objective: This study explored a new method for automatic O-RADS classification of sonograms based on a deep convolutional neural network (DCNN).
Methods: A development dataset (DD) of 2,455 2D grayscale sonograms of 870 ovarian adnexal lesions and an intertemporal validation dataset (IVD) of 426 sonograms of 280 lesions were collected and classified according to O-RADS v2022 (categories 2–5) by three senior sonographers. Classification results whose malignancy rates a two-tailed z-test confirmed to be consistent with those defined in O-RADS v2022, indicating diagnostic performance comparable to that of a previous study, were used for training; otherwise, the classification was repeated by two other sonographers. The DD was used to develop three DCNN models (ResNet34, DenseNet121, and ConvNeXt-Tiny) using transfer learning. Model performance was assessed with accuracy, precision, F1 score, and other metrics. The optimal model was selected, validated over time on the IVD, and used to analyze whether its assistance improved the O-RADS classification efficiency of three sonographers with different years of experience.
Results: The proportion of malignant tumors in the DD and IVD in each O-RADS-defined risk category was verified using a two-tailed z-test. Malignant lesions (O-RADS categories 4 and 5) were diagnosed in the DD and IVD with sensitivities of 0.949 and 0.962 and specificities of 0.892 and 0.842, respectively. ResNet34, DenseNet121, and ConvNeXt-Tiny achieved overall accuracies of 0.737, 0.752, and 0.878, respectively, for sonogram prediction in the DD. The ConvNeXt-Tiny model's accuracy for sonogram prediction in the IVD was 0.859, with no significant difference between test sets. Model assistance significantly reduced the O-RADS classification time of all three sonographers (Cohen's d = 5.75).
Conclusion: ConvNeXt-Tiny showed robust and stable performance in classifying O-RADS categories 2–5, improving sonographers' classification efficiency. © 2024 World Federation for Ultrasound in Medicine & Biology
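The abstract describes transfer learning with an ImageNet-pretrained ConvNeXt-Tiny for a four-class (O-RADS 2–5) classification task evaluated with accuracy, precision, and F1 score. The snippet below is a minimal sketch of that kind of setup, assuming PyTorch/torchvision and scikit-learn; the head replacement, optimizer settings, and the evaluate helper are illustrative assumptions, not the authors' implementation.

```python
# Sketch (not the authors' code): transfer learning with a pretrained
# ConvNeXt-Tiny for four-class O-RADS (2-5) sonogram classification,
# plus the evaluation metrics named in the abstract.
import torch
import torch.nn as nn
from torchvision.models import convnext_tiny, ConvNeXt_Tiny_Weights
from sklearn.metrics import accuracy_score, precision_score, f1_score

NUM_CLASSES = 4  # O-RADS categories 2, 3, 4, 5

# Load ImageNet-pretrained weights and replace the classification head.
model = convnext_tiny(weights=ConvNeXt_Tiny_Weights.IMAGENET1K_V1)
model.classifier[2] = nn.Linear(model.classifier[2].in_features, NUM_CLASSES)

# Fine-tune the whole network with a small learning rate (a common
# transfer-learning choice; the paper does not specify frozen layers).
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

def evaluate(model, loader, device="cpu"):
    """Compute overall accuracy, macro precision, and macro F1 on a loader
    of (image batch, label batch) pairs."""
    model.eval()
    y_true, y_pred = [], []
    with torch.no_grad():
        for images, labels in loader:
            logits = model(images.to(device))
            y_pred.extend(logits.argmax(dim=1).cpu().tolist())
            y_true.extend(labels.tolist())
    return {
        "accuracy": accuracy_score(y_true, y_pred),
        "precision": precision_score(y_true, y_pred, average="macro"),
        "f1": f1_score(y_true, y_pred, average="macro"),
    }
```

The two comparison backbones mentioned in the abstract could be set up analogously: in torchvision, the final layers to swap would be `resnet34(weights=...).fc` and `densenet121(weights=...).classifier`.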
Pages: 387–395