Transfer learning for just noticeable difference estimation

Cited by: 2
Authors
Mao, Yongwei [1]
Wu, Jinjian [1,2]
Wang, Xing [1]
Li, Leida [1]
Dong, Weisheng [1]
Affiliations
[1] Xidian Univ, Sch Artificial Intelligence, Xian 710071, Peoples R China
[2] Pazhou Lab, Huangpu 510555, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Just noticeable difference; Deep transfer learning; Full reference image quality assessment; Local perceived discrepancy; Residual dense network; IMAGE QUALITY ASSESSMENT; MODEL; INFORMATION; VISIBILITY;
DOI
10.1016/j.ins.2023.119575
CLC Classification Number
TP [Automation Technology, Computer Technology]
Subject Classification Code
0812
Abstract
The just noticeable difference (JND) measures the visual redundancy of digital images and is widely used in signal processing. Conventional JND models attempt to simulate functional properties of the human visual system (HVS) and are therefore limited by the current state of cognitive psychology. In this paper, we propose a novel pixel-wise JND prediction model based on deep transfer learning. Since it is almost impossible to manually label the visibility threshold of every pixel, the lack of labeled training data is the crucial issue, and transfer learning addresses it. We identify an underlying correspondence between full reference image quality assessment (FR-IQA) and JND estimation, which implies that knowledge learned for FR-IQA can be applied to JND estimation. To quantify this intrinsic association, a local perceived discrepancy (LPD) index is derived. Guided by the LPD index, a JND predictor based on a residual dense network (RDN) is designed to learn good representations of visibility limits from annotated image quality databases. Subjective viewing tests show that our model outperforms state-of-the-art JND models. Furthermore, when the model is applied to image compression, around 14.42% of the bit rate can be saved by removing visual redundancy.
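For readers who want to experiment with the idea of predicting a pixel-wise JND map with a residual-dense-style network, the following is a minimal PyTorch sketch. It is an illustration only: the layer sizes, module names (DenseBlock, JNDPredictor), and the dummy input are assumptions made here for demonstration, not the paper's actual RDN architecture, LPD index, or training procedure.

# Minimal illustrative sketch (assumed architecture, not the paper's exact RDN):
# a small residual-dense-style network mapping an RGB image to a non-negative
# pixel-wise visibility-threshold (JND) map.
import torch
import torch.nn as nn

class DenseBlock(nn.Module):
    """Tiny dense block: each conv layer sees all previous feature maps."""
    def __init__(self, channels=32, growth=16, layers=3):
        super().__init__()
        self.convs = nn.ModuleList()
        in_ch = channels
        for _ in range(layers):
            self.convs.append(nn.Sequential(
                nn.Conv2d(in_ch, growth, 3, padding=1), nn.ReLU(inplace=True)))
            in_ch += growth
        self.fuse = nn.Conv2d(in_ch, channels, 1)  # local feature fusion

    def forward(self, x):
        feats = [x]
        for conv in self.convs:
            feats.append(conv(torch.cat(feats, dim=1)))
        return x + self.fuse(torch.cat(feats, dim=1))  # local residual learning

class JNDPredictor(nn.Module):
    """RGB image in, one-channel pixel-wise JND (threshold) map out."""
    def __init__(self, blocks=2, channels=32):
        super().__init__()
        self.head = nn.Conv2d(3, channels, 3, padding=1)
        self.body = nn.Sequential(*[DenseBlock(channels) for _ in range(blocks)])
        self.tail = nn.Conv2d(channels, 1, 3, padding=1)

    def forward(self, img):
        return torch.relu(self.tail(self.body(self.head(img))))  # thresholds >= 0

if __name__ == "__main__":
    model = JNDPredictor()
    img = torch.rand(1, 3, 64, 64)   # dummy RGB patch
    jnd_map = model(img)             # per-pixel threshold estimate
    print(jnd_map.shape)             # torch.Size([1, 1, 64, 64])

In the paper's setting, such a predictor would be trained under the guidance of an LPD-style target derived from annotated FR-IQA databases rather than from pixel-level JND labels, which are impractical to collect manually.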
Pages: 16