A Study on Automatic O-RADS Classification of Sonograms of Ovarian Adnexal Lesions Based on Deep Convolutional Neural Networks

Cited: 0
Authors
Liu, Tao [1 ]
Miao, Kuo [1 ]
Tan, Gaoqiang [1 ]
Bu, Hanqi [1 ]
Shao, Xiaohui [1 ]
Wang, Siming [1 ]
Dong, Xiaoqiu [1 ]
Affiliations
[1] Department of Ultrasound Medicine, Harbin Medical University Fourth Affiliated Hospital, Harbin, Heilongjiang, China
Source
Ultrasound in Medicine and Biology | 2025, Vol. 51, No. 2
Keywords
Diagnosis; Sonochemistry; Ultrasonic applications
DOI
10.1016/j.ultrasmedbio.2024.11.009
Abstract
Objective: This study explored a new method for automatic O-RADS classification of sonograms based on a deep convolutional neural network (DCNN).

Methods: A development dataset (DD) of 2,455 two-dimensional grayscale sonograms of 870 ovarian adnexal lesions and an intertemporal validation dataset (IVD) of 426 sonograms of 280 lesions were collected and classified according to O-RADS v2022 (categories 2–5) by three senior sonographers. Classification results that a two-tailed z-test verified as consistent with the O-RADS v2022 malignancy rates (indicating diagnostic performance comparable to that of a previous study) were used for training; otherwise, the classification was repeated by two different sonographers. The DD was used to develop three DCNN models (ResNet34, DenseNet121, and ConvNeXt-Tiny) that employed transfer learning. Model performance was assessed with accuracy, precision, F1 score, and other metrics. The optimal model was selected, validated over time on the IVD, and used to analyze whether O-RADS classification efficiency improved for three sonographers with different years of experience when assisted by the model.

Results: The proportion of malignant tumors in each O-RADS-defined risk category of the DD and IVD was verified with a two-tailed z-test. Malignant lesions (O-RADS categories 4 and 5) were diagnosed in the DD and IVD with sensitivities of 0.949 and 0.962 and specificities of 0.892 and 0.842, respectively. ResNet34, DenseNet121, and ConvNeXt-Tiny achieved overall accuracies of 0.737, 0.752, and 0.878, respectively, for sonogram prediction in the DD. The ConvNeXt-Tiny model's accuracy for sonogram prediction in the IVD was 0.859, with no significant difference between test sets. Model assistance significantly reduced the O-RADS classification time of the three sonographers (Cohen's d = 5.75).

Conclusion: ConvNeXt-Tiny showed robust and stable performance in classifying O-RADS categories 2–5, improving sonographers' classification efficiency.
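The two statistical checks named in the abstract, the two-tailed z-test on per-category malignancy rates and Cohen's d for the reduction in classification time, follow standard formulas and can be sketched as below. The counts and timings are illustrative assumptions, not the study's data.

```python
import math
from statistics import mean, variance

def two_tailed_z_test(malignant: int, n: int, p0: float) -> tuple[float, float]:
    """One-sample two-tailed z-test: does the observed malignancy
    proportion differ from the reference rate p0?"""
    p_hat = malignant / n
    se = math.sqrt(p0 * (1 - p0) / n)  # standard error under H0
    z = (p_hat - p0) / se
    # Two-tailed p-value from the standard normal CDF (via erf).
    p_value = 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))
    return z, p_value

def cohens_d(a: list[float], b: list[float]) -> float:
    """Cohen's d using a pooled sample standard deviation."""
    na, nb = len(a), len(b)
    pooled_sd = math.sqrt(((na - 1) * variance(a) + (nb - 1) * variance(b))
                          / (na + nb - 2))
    return (mean(a) - mean(b)) / pooled_sd

# Illustrative numbers only: 120 of 200 lesions malignant in a category
# whose reference malignancy rate is 50%.
z, p = two_tailed_z_test(120, 200, 0.50)
print(round(z, 3), round(p, 4))  # 2.828 0.0047

# Illustrative per-case classification times (seconds), without vs. with model aid.
unaided = [100.0, 110.0, 105.0, 95.0, 100.0]
aided = [60.0, 65.0, 62.0, 58.0, 60.0]
print(round(cohens_d(unaided, aided), 2))  # 9.23
```

A non-significant p-value here means the observed rate is compatible with the O-RADS reference rate, which is the criterion the authors used before accepting a set of classifications for training.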
© 2024 World Federation for Ultrasound in Medicine & Biology
Pages: 387–395