Active Fine-Tuning From gMAD Examples Improves Blind Image Quality Assessment

Cited by: 19
Authors: Wang, Zhihua [1]; Ma, Kede [1]
Affiliation: [1] City University of Hong Kong, Department of Computer Science, Kowloon, Hong Kong, China
Funding: National Natural Science Foundation of China
Keywords: Computational modeling; Databases; Adaptation models; Training; Predictive models; Task analysis; Image quality; Blind image quality assessment; deep neural networks; gMAD competition; active learning; STATISTICS; INDEX
DOI: 10.1109/TPAMI.2021.3071759
CLC number: TP18 [Theory of Artificial Intelligence]
Subject classification codes: 081104; 0812; 0835; 1405
Abstract
The research in image quality assessment (IQA) has a long history, and significant progress has been made by leveraging recent advances in deep neural networks (DNNs). Despite high correlation numbers on existing IQA datasets, DNN-based models may be easily falsified in the group maximum differentiation (gMAD) competition. Here we show that gMAD examples can be used to improve blind IQA (BIQA) methods. Specifically, we first pre-train a DNN-based BIQA model using multiple noisy annotators, and fine-tune it on multiple synthetically distorted images, resulting in a "top-performing" baseline model. We then seek pairs of images by comparing the baseline model with a set of full-reference IQA methods in gMAD. The spotted gMAD examples are most likely to reveal the weaknesses of the baseline, and suggest potential ways for refinement. We query human quality annotations for the selected images in a well-controlled laboratory environment, and further fine-tune the baseline on the combination of human-rated images from gMAD and existing databases. This process may be iterated, enabling active fine-tuning from gMAD examples for BIQA. We demonstrate the feasibility of our active learning scheme on a large-scale unlabeled image set, and show that the fine-tuned quality model achieves improved generalizability in gMAD, without destroying performance on previously seen databases.
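The core of the active learning scheme is the gMAD pair-selection step: fix the baseline (defender) model to a quality level, then pick the two images it rates nearly equally but a competing full-reference IQA method (attacker) rates most differently. Below is a minimal, self-contained sketch of that step under the assumption that quality scores for an unlabeled image set have already been computed; the function name gmad_pairs and the random placeholder scores are illustrative, not the authors' implementation.

```python
# A minimal sketch of gMAD pair selection for one round of active fine-tuning.
# Assumes precomputed quality scores for an unlabeled image set; names and
# data here are illustrative placeholders, not the authors' code.
import numpy as np

def gmad_pairs(defender_scores, attacker_scores, n_levels=5):
    """For each quality level of the defender, return the pair of images the
    defender rates (nearly) equally but the attacker rates most differently."""
    pairs = []
    # Split the defender's score range into quality levels by quantiles.
    edges = np.quantile(defender_scores, np.linspace(0, 1, n_levels + 1))
    for lo, hi in zip(edges[:-1], edges[1:]):
        idx = np.where((defender_scores >= lo) & (defender_scores <= hi))[0]
        if len(idx) < 2:
            continue
        # Within the level, the attacker's best- and worst-rated images form
        # the maximum-differentiation pair.
        best = idx[np.argmax(attacker_scores[idx])]
        worst = idx[np.argmin(attacker_scores[idx])]
        pairs.append((best, worst))
    return pairs

# Toy usage: 10,000 unlabeled images scored by a baseline BIQA model
# (defender) and one full-reference IQA method (attacker); scores are random
# stand-ins for real model outputs.
rng = np.random.default_rng(0)
baseline_scores = rng.uniform(0, 100, size=10_000)
fr_scores = rng.uniform(0, 100, size=10_000)
selected = gmad_pairs(baseline_scores, fr_scores)
print(selected)
```

In the full procedure described in the abstract, the roles of defender and attacker are also swapped, the selected pairs are rated by human subjects in a controlled laboratory setting, and the baseline is fine-tuned on the union of the newly labeled gMAD images and the existing databases; the whole loop may then be iterated.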
Pages: 4577-4590
Page count: 14