Classification of multi-feature fusion ultrasound images of breast tumor within category 4 using convolutional neural networks

Cited: 1
Authors
Xu, Pengfei [1 ]
Zhao, Jing [2 ]
Wan, Mingxi [1 ]
Song, Qing [3 ]
Su, Qiang [4 ]
Wang, Diya [1 ]
Affiliations
[1] Xi An Jiao Tong Univ, Dept Biomed Engn, Sch Life Sci & Technol, Key Lab Biomed Informat Engn,Minist Educ, Xian 710049, Peoples R China
[2] Second Hosp Jilin Univ, Changchun, Peoples R China
[3] Xi An Jiao Tong Univ, Affiliated Hosp 1, Xian, Peoples R China
[4] Capital Med Univ, Beijing Friendship Hosp, Dept Oncol, Beijing 100050, Peoples R China
Funding
Beijing Natural Science Foundation; National Natural Science Foundation of China;
Keywords
BI-RADS 4; breast tumor; classification; CNN; false-positive; ultrasound; CANCER CLASSIFICATION; DATA SYSTEM; LESIONS; VARIABILITY; NAKAGAMI; MASSES;
DOI
10.1002/mp.16946
Chinese Library Classification
R8 [Special Medicine]; R445 [Diagnostic Imaging];
Discipline codes
1002; 100207; 1009;
Abstract
Background: Breast tumors are a serious threat to women's health. Ultrasound (US) is a common and economical method for diagnosing breast cancer. Breast Imaging Reporting and Data System (BI-RADS) category 4 has the highest false-positive rate, about 30%, among the five categories. The classification task within BI-RADS category 4 is challenging and has not been fully studied. Purpose: This work aimed to use convolutional neural networks (CNNs) to classify breast tumors from B-mode images within category 4, overcoming dependence on the operator and on artifacts. Additionally, it aimed to take full advantage of the morphological and textural features in breast tumor US images to improve classification accuracy. Methods: First, original US images obtained directly from the hospital were cropped and resized. Of 1385 B-mode US BI-RADS category 4 images, biopsy confirmed 503 samples as benign tumors and 882 as malignant. Then, a K-means clustering algorithm and sliding-window entropy were applied to the US images. Because the original B-mode images, K-means clustering images, and entropy images represent different characteristics of malignant and benign tumors, they were fused in three-channel form to build a multi-feature fusion image dataset. The training, validation, and test sets contained 969, 277, and 139 images, respectively. With transfer learning, 11 CNN models, including DenseNet and ResNet, were investigated. Finally, by comparing the accuracy, precision, recall, F1-score, and area under the curve (AUC) of the results, the models with better performance were selected. The normality of the data was assessed with the Shapiro-Wilk test. The DeLong test and independent t-test were used to evaluate significant differences in AUC and the other metrics. The false discovery rate was used to ultimately evaluate the advantages of the CNN with the highest evaluation metrics. In addition, anti-log compression was studied but showed no improvement in the CNNs' classification results.
Results: With multi-feature fusion images, DenseNet121 achieved the highest accuracy among the CNNs, 80.22 +/- 1.45%, with a precision of 77.97 +/- 2.89% and an AUC of 0.82 +/- 0.01. Multi-feature fusion improved the accuracy of DenseNet121 by 1.87% over classification of the original B-mode images (p < 0.05). Conclusion: CNNs with multi-feature fusion show good potential for reducing the false-positive rate within category 4. This work illustrates that CNNs and fusion images can help reduce the false-positive rate for breast tumors within US BI-RADS category 4 and make the diagnosis of category 4 breast tumors more accurate and precise.
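The three-channel fusion described in the Methods (stacking the original B-mode image, its K-means-clustered map, and a sliding-window entropy map) can be sketched roughly as below. This is an illustrative NumPy sketch, not the authors' implementation; the window size, cluster count, bin count, and min-max normalization are assumptions.

```python
import numpy as np

def kmeans_image(img, k=3, iters=10, seed=0):
    """Quantize pixel intensities into k levels with a plain 1-D k-means."""
    rng = np.random.default_rng(seed)
    pixels = img.reshape(-1, 1).astype(float)
    centers = rng.choice(pixels.ravel(), size=k, replace=False).astype(float)
    for _ in range(iters):
        labels = np.argmin(np.abs(pixels - centers), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = pixels[labels == j].mean()
    return centers[labels].reshape(img.shape)

def entropy_image(img, win=7, bins=16):
    """Shannon entropy of a win x win sliding window at each pixel (brute force)."""
    pad = win // 2
    padded = np.pad(img, pad, mode="reflect")
    out = np.zeros(img.shape, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            patch = padded[i:i + win, j:j + win]
            hist, _ = np.histogram(patch, bins=bins, range=(0, 255))
            p = hist / hist.sum()
            p = p[p > 0]
            out[i, j] = -(p * np.log2(p)).sum()
    return out

def fuse(img):
    """Stack B-mode, K-means, and entropy maps as the three channels."""
    norm = lambda x: (x - x.min()) / (x.max() - x.min() + 1e-8)
    return np.stack(
        [norm(img.astype(float)), norm(kmeans_image(img)), norm(entropy_image(img))],
        axis=-1,
    )
```

The fused array has the same height and width as the input with three channels in [0, 1], so it can be fed to an ImageNet-pretrained CNN in place of an RGB image.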
Pages: 4243-4257
Page count: 15
Related papers (50 in total)
  • [1] Hyperspectral Images Classification Based on Multi-Feature Fusion and Hybrid Convolutional Neural Networks
    Feng Fan
    Wang Shuangting
    Zhang Jin
    Wang Chunyang
    [J]. LASER & OPTOELECTRONICS PROGRESS, 2021, 58 (08)
  • [2] Multi-Feature Fusion with Convolutional Neural Network for Ship Classification in Optical Images
    Ren, Yongmei
    Yang, Jie
    Zhang, Qingnian
    Guo, Zhiqiang
    [J]. APPLIED SCIENCES-BASEL, 2019, 9 (20):
  • [3] Semantic Segmentation of Images Based on Multi-Feature Fusion and Convolutional Neural Networks
    Wang, Zhenyu
    Xiao, Juan
    Zhang, Shuai
    Qi, Baoqiang
    [J]. JOURNAL OF CIRCUITS SYSTEMS AND COMPUTERS, 2024, 33 (06)
  • [4] Breast Tumor Classification in Ultrasound Images by Fusion of Deep Convolutional Neural Network and Shallow LBP Feature
    Chen, Hua
    Ma, Minglun
    Liu, Gang
    Wang, Ying
    Jin, Zhihao
    Liu, Chong
    [J]. JOURNAL OF DIGITAL IMAGING, 2023, 36 (03) : 932 - 946
  • [6] Multi-feature fusion of convolutional neural networks for Fine-Grained ship classification
    Huang, Sizhe
    Xu, Huosheng
    Xia, Xuezhi
    Yang, Fan
    Zou, Fuhao
    [J]. JOURNAL OF INTELLIGENT & FUZZY SYSTEMS, 2019, 37 (01) : 125 - 135
  • [7] Multi-Instance Classification of Breast Tumor Ultrasound Images Using Convolutional Neural Networks and Transfer Learning
    Ciobotaru, Alexandru
    Bota, Maria Aurora
    Gota, Dan Ioan
    Miclea, Liviu Cristian
    [J]. BIOENGINEERING-BASEL, 2023, 10 (12):
  • [8] Convolutional neural network and multi-feature fusion for automatic modulation classification
    Wu, Hao
    Li, Yaxing
    Zhou, Liang
    Meng, Jin
    [J]. ELECTRONICS LETTERS, 2019, 55 (16) : 895 - +
  • [9] Automated detection of kidney abnormalities using multi-feature fusion convolutional neural networks
    Wu, Yu
    Yi, Zhang
    [J]. KNOWLEDGE-BASED SYSTEMS, 2020, 200
  • [10] Breast tumor detection using multi-feature block based neural network by fusion of CT and MRI images
    Kumari, Bersha
    Nandal, Amita
    Dhaka, Arvind
    [J]. COMPUTATIONAL INTELLIGENCE, 2024, 40 (03)