Boosting Noise Reduction Effect via Unsupervised Fine-Tuning Strategy

Times Cited: 1
Authors
Jiang, Xinyi [1]; Xu, Shaoping [1]; Wu, Junyun [1]; Zhou, Changfei [1]; Ji, Shuichen [1]
Affiliations
[1] Nanchang Univ, Sch Math & Comp Sci, Nanchang 330031, Peoples R China
Source
APPLIED SCIENCES-BASEL | 2024, Vol. 14, Issue 05
Keywords
boosting denoising effect; supervised denoising models; data bias; unsupervised denoising models; flexibility; fine-tuning; IMAGE; SPARSE;
DOI
10.3390/app14051742
CLC Number
O6 [Chemistry];
Discipline Code
0703;
Abstract
Over the last decade, supervised denoising models trained on extensive datasets have achieved remarkable image-denoising performance. However, these models offer limited flexibility and suffer varying degrees of degradation in noise reduction capability in practical scenarios, particularly when the noise distribution of a given noisy image deviates from that of the training images. To tackle this problem, we propose a two-stage denoising model that attaches an unsupervised fine-tuning phase after a supervised denoising model has processed the input noisy image and produced a denoised image (regarded as a preprocessed image). More specifically, in the first stage we replace the convolution blocks of the U-shaped network framework (used in the deep image prior method) with Transformer modules; the resulting model is referred to as a U-Transformer. The U-Transformer is trained on noisy images and their clean labels to preprocess the input noisy images. In the second stage, we condense the supervised U-Transformer into a simplified version containing only one Transformer module with fewer parameters, and we switch its training mode to unsupervised training, following an approach similar to that of the deep image prior method. This stage further removes the minor residual noise and artifacts present in the preprocessed image, yielding clearer and more realistic output images. Experimental results show that the proposed method achieves significant noise reduction on both synthetic and real images, surpassing state-of-the-art methods. This superiority stems from the supervised model's ability to rapidly process a given noisy image, while the unsupervised model leverages its flexibility to produce a fine-tuned network with enhanced noise reduction capability. Moreover, because the supervised model supplies a higher-quality preprocessed image, the proposed unsupervised fine-tuning model requires fewer parameters, trains and converges rapidly, and therefore achieves high overall execution efficiency.
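To make the second-stage idea concrete, the following is a minimal PyTorch sketch of deep-image-prior-style unsupervised fine-tuning on a single image: a small refinement network is optimized so that its output on the preprocessed image stays close to the noisy observation, and an early-stopped result is taken as the final output. The network here is a plain convolutional stand-in for the paper's single-Transformer module; the architecture, loss, learning rate, and step count are illustrative assumptions, not the authors' exact configuration.

```python
# Sketch of DIP-style unsupervised fine-tuning (stage two).
# RefineNet is a stand-in for the paper's single-Transformer module;
# all hyperparameters below are illustrative assumptions.
import torch
import torch.nn as nn

class RefineNet(nn.Module):
    """Tiny residual refiner: predicts a correction to its input."""
    def __init__(self, channels=3, width=48):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, width, 3, padding=1), nn.ReLU(),
            nn.Conv2d(width, width, 3, padding=1), nn.ReLU(),
            nn.Conv2d(width, channels, 3, padding=1),
        )

    def forward(self, x):
        # Residual connection keeps the output close to the input image.
        return x + self.body(x)

def unsupervised_finetune(preprocessed, noisy, steps=300, lr=1e-3):
    """Fit the refiner to one noisy image, deep-image-prior style.

    preprocessed: output of the supervised first-stage denoiser, shape (1, C, H, W)
    noisy:        the original noisy observation, same shape
    Early stopping (a small step budget) keeps the network from re-fitting the noise.
    """
    net = RefineNet(channels=noisy.shape[1])
    opt = torch.optim.Adam(net.parameters(), lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        out = net(preprocessed)
        # Data-fidelity loss against the noisy observation, as in DIP.
        loss = nn.functional.mse_loss(out, noisy)
        loss.backward()
        opt.step()
    with torch.no_grad():
        return net(preprocessed)  # refined, final denoised image

# Usage sketch: `noisy` is a (1, 3, H, W) tensor and `denoiser` is the
# pretrained supervised model (first stage).
# preprocessed = denoiser(noisy).detach()
# final = unsupervised_finetune(preprocessed, noisy)
```

Because the first stage already removes most of the noise, the refiner can be very small and needs only a short optimization run per image, which is what gives the two-stage scheme its claimed efficiency.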
Pages: 19
Related Papers
50 records in total
  • [21] APPT: Boosting Automated Patch Correctness Prediction via Fine-Tuning Pre-Trained Models. Zhang, Quanjun; Fang, Chunrong; Sun, Weisong; Liu, Yan; He, Tieke; Hao, Xiaodong; Chen, Zhenyu. IEEE TRANSACTIONS ON SOFTWARE ENGINEERING, 2024, 50 (03): 474-494.
  • [22] Unsupervised Fine-tuning of Optical Flow for Better Motion Boundary Estimation. Alhersh, Taha; Stuckenschmidt, Heiner. PROCEEDINGS OF THE 14TH INTERNATIONAL JOINT CONFERENCE ON COMPUTER VISION, IMAGING AND COMPUTER GRAPHICS THEORY AND APPLICATIONS (VISAPP), VOL 5, 2019: 776-783.
  • [23] Empirical analysis of the fine-tuning for Unsupervised Anomaly Detection in the ICT system. Matsuo, Yoichi. 2023 19TH INTERNATIONAL CONFERENCE ON NETWORK AND SERVICE MANAGEMENT, CNSM, 2023.
  • [24] Efficient Unsupervised Sentence Compression by Fine-tuning Transformers with Reinforcement Learning. Ghalandari, Demian Gholipour; Hokamp, Chris; Ifrim, Georgiana. PROCEEDINGS OF THE 60TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2022), VOL 1: (LONG PAPERS), 2022: 1267-1280.
  • [25] Fine-Tuning the Wilson-Bappu Effect. Gomez, T.; Wallerstein, G. 16TH CAMBRIDGE WORKSHOP ON COOL STARS, STELLAR SYSTEMS AND THE SUN, 2012, 448: E1123-E1129.
  • [26] Adaptive fine-tuning strategy for few-shot learning. Zhuang, Xinkai; Shao, Mingwen; Gao, Wei; Yang, Jianxin. JOURNAL OF ELECTRONIC IMAGING, 2022, 31 (06).
  • [27] A Fine-Tuning Strategy Based on Real Scenes in Gait Identification. Zhang, Xianggang; Zeng, Jing; Wang, Guoyu. UBIQUITOUS SECURITY, 2022, 1557: 336-350.
  • [28] Boosting Diagnostic Accuracy of Osteoporosis in Knee Radiograph Through Fine-Tuning CNN. Kumar, Saumya; Goswami, Puneet; Batra, Shivani. BIG DATA ANALYTICS IN ASTRONOMY, SCIENCE, AND ENGINEERING, BDA 2023, 2024, 14516: 97-109.
  • [29] Hierarchical BERT with an adaptive fine-tuning strategy for document classification. Kong, Jun; Wang, Jin; Zhang, Xuejie. KNOWLEDGE-BASED SYSTEMS, 2022, 238.
  • [30] Hybrid Fine-Tuning Strategy for Few-Shot Classification. Zhao, Lei; Ou, Zhonghua; Zhang, Lixun; Li, Shuxiao. COMPUTATIONAL INTELLIGENCE AND NEUROSCIENCE, 2022, 2022.