HVS-Based Perception-Driven No-Reference Omnidirectional Image Quality Assessment

Cited by: 11
Authors
Liu, Yun [1]
Yin, Xiaohua [1]
Wang, Yan [1]
Yin, Zixuan [1]
Zheng, Zhi [2 ]
Affiliations
[1] Liaoning Univ, Coll Informat, Shenyang 110036, Liaoning, Peoples R China
[2] Beijing Jiaotong Univ, Dept Elect & Informat Engn, Beijing 100091, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Visualization; Image quality; Distortion; Predictive models; Feature extraction; Degradation; Taylor series; Human visual system (HVS); omnidirectional images; quality assessment; support vector regression (SVR); visual attention mechanism; CLASSIFICATION; INDEX;
DOI
10.1109/TIM.2022.3232792
CLC number
TM [Electrical Engineering]; TN [Electronics & Communication Technology]
Discipline codes
0808; 0809
Abstract
Evaluating the quality of panoramic images has become an increasingly active research topic with the development of virtual reality (VR) technology. This article proposes a novel method for assessing the quality of omnidirectional images without any reference information. Inspired by the characteristics of the human visual system (HVS) and the visual attention mechanism, the proposed model combines a structure feature, a statistical feature, and a saliency feature to measure panoramic image quality. Structure information is expressed by combining the local Taylor series with the local binary pattern (LBP) operator. Gradient-based statistical information of panoramic images is summarized at three levels: the gradient magnitude, the relative gradient magnitude, and the relative gradient orientation. Saliency information obtained with saliency detection by combining simple priors (SDSP) is extracted to enrich the perceptual features of the model and to improve the visibility of salient regions in the omnidirectional image. Finally, given the provided subjective scores and the above features, support vector regression (SVR) is used to predict the objective scores. Experiments on two reliable databases indicate that our model is more competitive and stable than other state-of-the-art methods.
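The abstract describes a hand-crafted feature pipeline (structure, gradient statistics, saliency) regressed to subjective scores with SVR. The following is a minimal sketch of that general shape, not the authors' implementation: it assumes grayscale panoramas and mean opinion scores (MOS) as inputs, uses a uniform-LBP histogram and simple gradient statistics as simplified stand-ins for the structure and statistical features, and omits the Taylor-series map and the SDSP saliency component. All helper names are illustrative.

```python
# Minimal sketch (not the authors' implementation): hand-crafted features + SVR.
import numpy as np
from skimage.feature import local_binary_pattern  # scikit-image
from sklearn.svm import SVR                        # scikit-learn

def lbp_histogram(gray, points=8, radius=1):
    """Uniform-LBP histogram as a simplified stand-in for the structure feature."""
    lbp = local_binary_pattern(gray, points, radius, method="uniform")
    hist, _ = np.histogram(lbp, bins=points + 2, range=(0, points + 2), density=True)
    return hist

def gradient_statistics(gray):
    """Mean/std summaries of the gradient magnitude and orientation maps."""
    gy, gx = np.gradient(gray.astype(np.float64))
    magnitude = np.hypot(gx, gy)
    orientation = np.arctan2(gy, gx)
    return np.array([magnitude.mean(), magnitude.std(),
                     orientation.mean(), orientation.std()])

def extract_features(gray):
    """Concatenate the per-image feature groups into a single vector."""
    return np.concatenate([lbp_histogram(gray), gradient_statistics(gray)])

def train_quality_model(grayscale_images, mos_scores):
    """Fit an SVR regressor mapping feature vectors to subjective scores."""
    X = np.vstack([extract_features(im) for im in grayscale_images])
    model = SVR(kernel="rbf", C=1.0, epsilon=0.1)
    model.fit(X, np.asarray(mos_scores))
    return model

def predict_quality(model, gray):
    """Predict an objective quality score for one grayscale panorama."""
    return float(model.predict(extract_features(gray).reshape(1, -1))[0])
```

In a typical evaluation protocol of this kind, the regressor would be trained on one content-disjoint split of a database's distorted panoramas and their subjective scores and tested on the remaining split.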
Pages: 11