Histogram Equalization Detection Based on Statistical Features in Digital Image

Cited by: 0
Authors
Bi X.-L. [1 ]
Qiu Y.-M. [1 ]
Xiao B. [1 ]
Li W.-S. [1 ]
Ma J.-F. [2 ]
Affiliations
[1] Chongqing Key Laboratory of Image Cognition, Chongqing University of Posts and Telecommunications, Chongqing
[2] College of Network and Information Security, Xidian University, Xi'an
Funding
National Natural Science Foundation of China
Keywords
Cumulative distribution function; Histogram equalization detection; K-Nearest Neighbor classification; Statistical features;
DOI
10.11897/SP.J.1016.2021.00292
Abstract
Nowadays it is extremely easy to alter or fabricate an image, and many falsified images spread widely on the Internet. From an information-security perspective, the false information conveyed by such images can cause immeasurable harm. According to whether the expressed content of the image is changed, tampered images can be divided into two major categories. The first type maliciously modifies the expressed content of the image as required; it mainly includes copy-move and splicing. The second type mainly affects the visual appearance of the image; such operations, for example compression, blurring, and contrast enhancement, are often applied as post-processing steps after the first type of tampering. Their purpose is chiefly to hide the difference between the tampered content and the untampered regions by adjusting the image's visual appearance, thereby covering up the tampering traces. Previous work on detecting contrast-enhancement tampering has produced many excellent detection methods, but these methods also have drawbacks, such as a lack of support for post-compression processing and a narrow scope of application. Targeting histogram equalization, a form of image contrast enhancement, this paper proposes a histogram equalization detection method based on statistical features of the image. The image histogram and the cumulative distribution function (CDF) curve are used to extract the trace features left by histogram equalization, realizing digital forensics of this operation. First, because not every gray level in the histogram reflects histogram equalization traces, the proposed method uses the trend of the image's CDF curve to adaptively choose the gray-scale range that is most likely to carry such traces.
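As background for the traces the method exploits, standard CDF-based histogram equalization can be sketched as follows. This is a minimal NumPy illustration, not the authors' code; the function and variable names are my own:

```python
import numpy as np

def equalize_histogram(img):
    """Global histogram equalization of an 8-bit grayscale image.

    Each gray level g is remapped to round(255 * CDF(g)); this CDF-based
    remapping is what leaves the traces the detector looks for.
    """
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = np.cumsum(hist) / img.size            # cumulative distribution function
    mapping = np.round(255 * cdf).astype(np.uint8)
    return mapping[img]

# Low-contrast test image: intensities concentrated in [80, 120].
rng = np.random.default_rng(0)
img = rng.integers(80, 121, size=(64, 64)).astype(np.uint8)
eq = equalize_histogram(img)

# After equalization the CDF hugs the identity function f(g) = g / 255 and
# the histogram acquires zero-value gaps -- the two kinds of trace the
# abstract describes.
cdf_eq = np.cumsum(np.bincount(eq.ravel(), minlength=256)) / eq.size
```

Because only 41 distinct input levels exist here, the equalized histogram is forced to spread them over the full 0-255 range, creating many empty bins while its CDF stays close to the identity line.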
Histogram equalization is realized through the cumulative distribution function: by redistributing pixel intensities, it produces an image whose histogram is almost flat and uniform. Histogram equalization therefore leaves inherent, detectable traces in both the image's CDF and its histogram. In the second step of the method, two classification features are extracted within the chosen gray-scale range: the similarity between the CDF and the identity function, and the number of zero-value gaps in the histogram. Because the number of features is small, the K-Nearest Neighbor (KNN) classification algorithm is used to quickly judge whether the image has undergone histogram equalization, saving time. Four sets of experiments verify the performance of the method from four aspects: its feasibility, its applicability to small-resolution images, its robustness to post-compression processing, and its ability to distinguish histogram equalization from other contrast-enhancement operations. Experimental results show that the proposed method achieves stable detection on both uncompressed and JPEG-compressed images, is applicable to images of various resolutions, is robust to post-compression processing, and can distinguish histogram equalization from other contrast-enhancement operations. © 2021, Science Press. All rights reserved.
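The two-feature-plus-KNN decision described above might be sketched as follows. The abstract does not specify the similarity measure or the gap-counting rule, so the concrete definitions below (mean absolute CDF deviation from the identity function, and the count of empty bins between the first and last occupied gray level), as well as the toy training data, are illustrative assumptions rather than the paper's method:

```python
import numpy as np

def _equalize(img):
    """CDF-based global histogram equalization (used only to make demo data)."""
    hist = np.bincount(img.ravel(), minlength=256)
    mapping = np.round(255 * np.cumsum(hist) / img.size).astype(np.uint8)
    return mapping[img]

def he_features(img):
    """Two statistical features suggested by the abstract (definitions assumed):
    f1: mean absolute deviation between the image CDF and the identity
        function g/255 -- small after equalization;
    f2: number of zero-value gaps (empty bins) between the first and last
        occupied gray levels -- large after equalization.
    """
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = np.cumsum(hist) / img.size
    f1 = np.mean(np.abs(cdf - np.arange(256) / 255.0))
    nz = np.flatnonzero(hist)
    f2 = np.count_nonzero(hist[nz[0]:nz[-1] + 1] == 0)
    return np.array([f1, float(f2)])

def knn_predict(train_X, train_y, x, k=3):
    """Minimal K-Nearest Neighbor majority vote over Euclidean distances."""
    dist = np.linalg.norm(train_X - x, axis=1)
    votes = train_y[np.argsort(dist)[:k]]
    return int(np.bincount(votes).argmax())

# Toy training set: six untouched low-contrast images vs. their equalized versions.
rng = np.random.default_rng(1)
originals = [rng.integers(60, 180, size=(64, 64)).astype(np.uint8) for _ in range(6)]
X = np.array([he_features(i) for i in originals + [_equalize(i) for i in originals]])
y = np.array([0] * 6 + [1] * 6)   # 0 = untouched, 1 = histogram-equalized

probe = _equalize(rng.integers(60, 180, size=(64, 64)).astype(np.uint8))
print(knn_predict(X, y, he_features(probe)))   # -> 1 (detected as equalized)
```

On this toy data the gap-count feature alone separates the classes, since untouched images with a contiguous intensity range have no empty bins inside it; the paper's experiments additionally cover JPEG compression and small resolutions, which this sketch does not model.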
Pages: 292-303
Number of pages: 11