No-Reference Quality Metric of Contrast-Distorted Images Based on Information Maximization

Citations: 287
Authors
Gu, Ke [1 ]
Lin, Weisi [1 ]
Zhai, Guangtao [2 ]
Yang, Xiaokang [2 ]
Zhang, Wenjun [2 ]
Chen, Chang Wen [3 ]
Affiliations
[1] Nanyang Technol Univ, Sch Comp Sci & Engn, Singapore 639798, Singapore
[2] Shanghai Jiao Tong Univ, Inst Image Commun & Informat Proc, Shanghai 200240, Peoples R China
[3] SUNY Buffalo, Buffalo, NY 14260 USA
Funding
US National Science Foundation;
Keywords
Contrast distortion; image quality assessment (IQA); information maximization; no-reference/blind; saliency; FREE-ENERGY PRINCIPLE;
DOI
10.1109/TCYB.2016.2575544
Chinese Library Classification
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812;
Abstract
The general purpose of viewing an image is to acquire as much information as possible. Guided by this principle, this paper devises a new no-reference/blind metric for image quality assessment (IQA) of contrast distortion. For local details, we first roughly remove the predictable regions of an image, since the unpredictable remainder carries most of the information. We then compute the entropy of particular unpredictable areas of maximum information, selected via visual saliency. From a global perspective, we compare the image histogram with the uniformly distributed histogram of maximum information via the symmetric Kullback-Leibler divergence. The proposed blind IQA method generates an overall quality estimate of a contrast-distorted image by properly combining the local and global considerations. Thorough experiments on five databases/subsets demonstrate the superiority of our training-free blind technique over state-of-the-art full- and no-reference IQA methods. Furthermore, the proposed model can also be applied to improve the performance of general-purpose blind quality metrics by a sizable margin.
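The global measure described in the abstract compares an image's gray-level histogram against the uniform (maximum-entropy) histogram using the symmetric Kullback-Leibler divergence. A minimal Python sketch of that idea follows; the function name, binning choices, and smoothing constant are illustrative assumptions, and the paper's actual metric additionally combines this term with the saliency-weighted local entropy, which is not reproduced here.

```python
import numpy as np

def symmetric_kl_to_uniform(image, bins=256, eps=1e-12):
    """Symmetric KL divergence between an image's gray-level histogram
    and the uniform (maximum-information) histogram.

    Illustrative sketch only: normalization details and the combination
    with the local entropy term follow the paper, not this function.
    """
    hist, _ = np.histogram(image, bins=bins, range=(0, bins))
    p = hist / hist.sum() + eps        # empirical distribution (smoothed)
    p /= p.sum()
    q = np.full(bins, 1.0 / bins)      # uniform reference distribution
    kl_pq = np.sum(p * np.log(p / q))  # D(P || Q)
    kl_qp = np.sum(q * np.log(q / p))  # D(Q || P)
    return kl_pq + kl_qp

# A well-spread histogram is close to uniform (small divergence);
# a low-contrast image concentrated in a few bins diverges strongly.
flat = np.arange(256, dtype=np.uint8)        # one pixel per gray level
narrow = np.full(256, 128, dtype=np.uint8)   # all pixels at mid-gray
assert symmetric_kl_to_uniform(flat) < symmetric_kl_to_uniform(narrow)
```

Under this formulation, a smaller divergence indicates a histogram closer to the maximum-information uniform distribution, i.e., a better-contrasted image by the global criterion.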
Pages: 4559 - 4565
Page count: 7