Objective quality assessment of displayed images by using neural networks

Cited by: 39
Authors
Gastaldo, P
Zunino, R
Heynderickx, I
Vicario, E
Affiliations
[1] Univ Genoa, DIBE, I-16145 Genoa, Italy
[2] Philips Res Labs, NL-5656 AA Eindhoven, Netherlands
Keywords
perceptual quality; objective image quality; neural networks;
DOI
10.1016/j.image.2005.03.013
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronics and Communication Technology];
Subject Classification Codes
0808; 0809;
Abstract
Considerable research effort is being devoted to the development of image-enhancement algorithms, which improve the quality of displayed digital pictures. Reliable methods for measuring perceived image quality are needed to evaluate the performance of those algorithms, and such measurements require a univariant (i.e., no-reference) approach. The system presented in this paper applies concepts derived from computational intelligence, and supports an objective quality-assessment method based on a circular back-propagation (CBP) neural model. The network is trained to predict quality ratings, as scored by human assessors, from numerical features that characterize images. As such, the method aims at reproducing perceived image quality, rather than defining a comprehensive model of the human visual system. The connectionist approach allows one to decouple the task of feature selection from the consequent mapping of features into an objective quality score. Experimental results on the perceptual effects of a family of contrast-enhancement algorithms confirm the method's effectiveness, as the system quite accurately renders the image quality perceived by human assessors. (c) 2005 Elsevier B.V. All rights reserved.
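In the authors' earlier work, the CBP model referenced in the abstract is a standard multilayer perceptron whose input vector is augmented with one extra component equal to the squared norm of the inputs, which lets the same architecture realize both hyperplane-like and sphere-like decision surfaces. The sketch below is a minimal illustration of that idea applied to no-reference quality regression, not the authors' implementation: the feature set, network size, training schedule, and the synthetic "MOS" targets are all placeholder assumptions.

```python
# Minimal sketch (assumptions noted above): an MLP whose input is augmented with the
# squared feature norm (the "circular" input of CBP), trained by back-propagation to
# map per-image features to a perceived-quality score.
import numpy as np

def augment(X):
    """Append the 'circular' input: the squared Euclidean norm of each feature vector."""
    return np.hstack([X, np.sum(X ** 2, axis=1, keepdims=True)])

class CBPRegressor:
    def __init__(self, n_features, n_hidden=8, lr=0.01, seed=0):
        rng = np.random.default_rng(seed)
        d = n_features + 1                         # +1 for the circular input
        self.W1 = rng.normal(0, 0.1, (d, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0, 0.1, (n_hidden, 1))
        self.b2 = 0.0
        self.lr = lr

    def _forward(self, Xa):
        self.h = np.tanh(Xa @ self.W1 + self.b1)   # hidden-layer activations
        return self.h @ self.W2 + self.b2          # linear output: predicted quality

    def fit(self, X, y, epochs=2000):
        Xa, y = augment(X), y.reshape(-1, 1)
        for _ in range(epochs):
            out = self._forward(Xa)
            err = out - y                          # gradient of the squared error
            # Back-propagate through the output and hidden layers
            dW2 = self.h.T @ err / len(y)
            db2 = err.mean()
            dh = (err @ self.W2.T) * (1 - self.h ** 2)
            dW1 = Xa.T @ dh / len(y)
            db1 = dh.mean(axis=0)
            self.W2 -= self.lr * dW2
            self.b2 -= self.lr * db2
            self.W1 -= self.lr * dW1
            self.b1 -= self.lr * db1

    def predict(self, X):
        return self._forward(augment(X)).ravel()

if __name__ == "__main__":
    # Toy data standing in for extracted image features and subjective ratings (e.g., MOS).
    rng = np.random.default_rng(1)
    X = rng.uniform(-1, 1, (200, 5))               # hypothetical per-image features
    y = 3.0 + np.sin(X[:, 0]) - 0.5 * np.sum(X ** 2, axis=1)  # synthetic quality scores
    model = CBPRegressor(n_features=5)
    model.fit(X, y)
    print("training RMSE:", np.sqrt(np.mean((model.predict(X) - y) ** 2)))
```

The only departure from a plain MLP is the `augment` step; everything downstream of feature extraction stays a generic regression problem, which is the decoupling of feature selection from score mapping that the abstract emphasizes.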
Pages: 643-661
Number of pages: 19
Related Papers
50 records in total
  • [1] Using images pattern recognition and neural networks for coating Quality Assessment
    Chang, LM
    Abdelrazig, YA
    [J]. DURABILITY OF BUILDING MATERIALS AND COMPONENTS 8, VOLS 1-4, PROCEEDINGS, 1999, : 2429 - 2440
  • [2] Objective Video Quality Assessment Based on Neural Networks
    Menor, Diego P. A.
    Mello, Carlos A. B.
    Zanchettin, Cleber
    [J]. KNOWLEDGE-BASED AND INTELLIGENT INFORMATION & ENGINEERING SYSTEMS: PROCEEDINGS OF THE 20TH INTERNATIONAL CONFERENCE KES-2016, 2016, 96 : 1551 - 1559
  • [3] Blind Quality Assessment of Multiply Distorted Images Using Deep Neural Networks
    Wang, Zhongling
    Athar, Shahrukh
    Wang, Zhou
    [J]. IMAGE ANALYSIS AND RECOGNITION, ICIAR 2019, PT I, 2019, 11662 : 89 - 101
  • [4] No-reference quality assessment of JPEG images by using CBP neural networks
    Gastaldo, P
    Zunino, R
    [J]. 2004 IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS, VOL 5, PROCEEDINGS, 2004, : 772 - 775
  • [5] Automated Quality Assessment of Cardiac MR Images Using Convolutional Neural Networks
    Zhang, Le
    Gooya, Ali
    Dong, Bo
    Hua, Rui
    Petersen, Steffen E.
    Medrano-Gracia, Pau
    Frangi, Alejandro F.
    [J]. SIMULATION AND SYNTHESIS IN MEDICAL IMAGING, SASHIMI 2016, 2016, 9968 : 138 - 145
  • [6] No-reference quality assessment of JPEG images by using CBP neural networks
    Gastaldo, Paolo
    Parodi, Giovanni
    Redi, Judith
    Zunino, Rodolfo
    [J]. ARTIFICIAL NEURAL NETWORKS - ICANN 2007, PT 2, PROCEEDINGS, 2007, 4669 : 564 - +
  • [7] Speech quality objective assessment using neural network
    Fu, Q
    Yi, KC
    Sun, MG
    [J]. 2000 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING, PROCEEDINGS, VOLS I-VI, 2000, : 1511 - 1514
  • [8] Objective quality assessment of MPEG-2 video streams by using CBP neural networks
    Gastaldo, P
    Rovetta, S
    Zunino, R
    [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS, 2002, 13 (04): : 939 - 947
  • [9] Creativity Assessment by Analyzing Images Using Neural Networks
    Uglanova, I. L.
    Gel'ver, E. S.
    Tarasov, S. V.
    Gracheva, D. A.
    Vyrva, E. E.
    [J]. SCIENTIFIC AND TECHNICAL INFORMATION PROCESSING, 2022, 49 (05) : 371 - 378