Deep neural networks-based denoising models for CT imaging and their efficacy

Cited by: 9
Authors
Kc, Prabhat [1]
Zeng, Rongping [1]
Farhangi, M. Mehdi [1]
Myers, Kyle J. [1]
Affiliation
[1] US FDA, Silver Spring, MD 20993 USA
Keywords
CT image denoising; deep learning; neural networks; loss functions; image quality; LOW-DOSE CT; NUMBERS
DOI
10.1117/12.2581418
CLC Classification
R318 [Biomedical Engineering]
Discipline Code
0831
Abstract
Most of the literature on deep neural network (DNN)-based CT image denoising shows that DNNs outperform traditional iterative methods in terms of metrics such as RMSE, PSNR, and SSIM. In many instances, by the same metrics, DNN results from low-dose inputs are also shown to be comparable to their high-dose counterparts. However, these metrics do not reveal whether the DNN results preserve the visibility of subtle lesions or whether they alter CT image properties such as the noise texture. Accordingly, in this work we examine the image quality of DNN results for low-dose CT denoising from a holistic viewpoint. First, we build a library of advanced DNN denoising architectures, comprising networks such as DnCNN, U-Net, Red-Net, and GAN-based models. Next, each network is designed and trained so that it yields its best performance in terms of PSNR and SSIM; data inputs (e.g., training patch size, reconstruction kernel) and numerical-optimizer inputs (e.g., minibatch size, learning rate, loss function) are tuned accordingly. Finally, the outputs of the trained networks are subjected to a series of CT bench-testing metrics, including the contrast-dependent MTF, the NPS, and HU accuracy. These metrics support a more nuanced study of the resolution of low-contrast features, the noise texture, and the CT number accuracy of the DNN outputs, to better understand the impact each DNN algorithm has on these underlying attributes of image quality.
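To make the distinction between the conventional metrics and the CT bench-testing metrics concrete, the sketch below shows one common way to compute RMSE/PSNR and an ensemble-averaged 2-D noise power spectrum (NPS) from noise-only regions of interest; the function names, ROI layout, and pixel size are illustrative assumptions and not the authors' implementation.

# Illustrative sketch (assumptions: uniform-phantom noise ROIs, HU-valued images, 0.7 mm pixels).
import numpy as np

def rmse(denoised, reference):
    # Root-mean-square error between a denoised image and its high-dose reference.
    diff = denoised.astype(np.float64) - reference.astype(np.float64)
    return np.sqrt(np.mean(diff ** 2))

def psnr(denoised, reference, data_range=None):
    # Peak signal-to-noise ratio in dB; data_range defaults to the reference's dynamic range.
    if data_range is None:
        data_range = reference.max() - reference.min()
    return 20.0 * np.log10(data_range / rmse(denoised, reference))

def nps_2d(noise_rois, pixel_size_mm=0.7):
    # Ensemble-averaged 2-D NPS from mean-subtracted, noise-only ROIs.
    # noise_rois: array of shape (n_rois, ny, nx); returns NPS in HU^2*mm^2 plus frequency axes (1/mm).
    rois = noise_rois.astype(np.float64)
    rois -= rois.mean(axis=(1, 2), keepdims=True)            # remove each ROI's mean (detrending)
    ny, nx = rois.shape[1:]
    dft = np.fft.fftshift(np.fft.fft2(rois), axes=(1, 2))    # 2-D DFT of each ROI
    nps = (np.abs(dft) ** 2).mean(axis=0)                     # ensemble average over ROIs
    nps *= pixel_size_mm ** 2 / (nx * ny)                     # normalize to physical units
    fx = np.fft.fftshift(np.fft.fftfreq(nx, d=pixel_size_mm))
    fy = np.fft.fftshift(np.fft.fftfreq(ny, d=pixel_size_mm))
    return nps, fx, fy

A flat (white) NPS and a low-dose-like NPS can yield similar RMSE/PSNR against the same reference, which is why a texture-sensitive measure of this kind complements the conventional metrics.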
Pages: 13