A framework of generative adversarial networks with novel loss for JPEG restoration and anti-forensics

Cited by: 1
Authors
Wu, Jianyuan [1 ]
Kang, Xiangui [1 ]
Yang, Jianhua [1 ]
Sun, Wei [2 ]
Affiliations
[1] Sun Yat Sen Univ, Guangdong Key Lab Informat Secur Technol, Guangzhou 510006, Peoples R China
[2] Sun Yat Sen Univ, Sch Elect & Informat Engn, Informat Technol Key Lab, Minist Educ, Guangzhou 510006, Peoples R China
Funding
China Postdoctoral Science Foundation;
Keywords
JPEG restoration; JPEG anti-forensics; AC-Component loss; Calibration loss; GAN framework; DCT;
DOI
10.1007/s00530-021-00778-6
Chinese Library Classification (CLC)
TP [automation technology, computer technology];
Discipline code
0812;
Abstract
Both JPEG restoration and JPEG anti-forensics aim to remove the artifacts left by JPEG compression and to recover the compressed image. However, restoring the high-frequency details of a JPEG-compressed image (for restoration) and deceiving existing JPEG compression detectors without sacrificing visual quality (for anti-forensics) remain challenging. To address these issues, a framework of generative adversarial networks (GAN) with novel loss functions for JPEG restoration and anti-forensics (JRA-GAN) is proposed, which translates a JPEG-compressed image into a reconstructed one. Since JPEG compression impairs high-frequency components, an alternating current (AC)-component loss function that measures the loss of AC components is introduced in JRA-GAN to recover these components. To evade forensic detection, a calibration loss function is also introduced to reduce the variance gap in the high-frequency subbands between a generated image and its calibrated version. Experimental results demonstrate that the proposed JPEG restoration method outperforms existing methods in image quality, while the JPEG anti-forensic scheme achieves better visual quality with anti-forensic performance comparable to state-of-the-art anti-forensic methods. Our code is available at: .
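To make the abstract's two loss terms more concrete, the following is a minimal NumPy/SciPy sketch of what an AC-component loss and a calibration loss might look like. The 8x8 block DCT, the L1 distance on AC coefficients, the 4-pixel crop used as the "calibrated" copy, and the choice of the lower-right DCT quadrant as the high-frequency subbands are all illustrative assumptions, not the paper's exact formulation; in JRA-GAN these terms would additionally be implemented as differentiable components of the GAN training objective.

import numpy as np
from scipy.fft import dctn

def blocks_dct(img, b=8):
    # Type-II DCT of non-overlapping b-by-b blocks (JPEG-style), returned as (n_blocks, b, b).
    h, w = img.shape
    h, w = h - h % b, w - w % b
    blocks = img[:h, :w].reshape(h // b, b, w // b, b).swapaxes(1, 2).reshape(-1, b, b)
    return dctn(blocks, axes=(1, 2), norm="ortho")

def ac_component_loss(generated, reference):
    # Mean absolute difference of AC coefficients; the DC term of each block is ignored.
    g, r = blocks_dct(generated), blocks_dct(reference)
    g[:, 0, 0] = 0.0
    r[:, 0, 0] = 0.0
    return float(np.abs(g - r).mean())

def calibration_loss(generated, crop=4):
    # Gap between per-subband variances of an image and a "calibrated" (cropped/shifted) copy,
    # restricted to assumed high-frequency subbands (lower-right DCT quadrant).
    calibrated = generated[crop:, crop:]
    g = blocks_dct(generated)
    c = blocks_dct(calibrated)
    hf = np.zeros((8, 8), dtype=bool)
    hf[4:, 4:] = True
    return float(np.abs(g[:, hf].var(axis=0) - c[:, hf].var(axis=0)).mean())

# Example usage on a random grayscale image:
# img = np.random.rand(128, 128); rec = img + 0.01 * np.random.randn(128, 128)
# print(ac_component_loss(rec, img), calibration_loss(rec))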
Pages: 1075-1089
Page count: 15
Related articles
50 in total
  • [21] Modify the Quantization Table in the JPEG Header File for Forensics and Anti-forensics
    Wang, Hao
    Wang, Jinwei
    Luo, Xiangyang
    Yin, QiLin
    Ma, Bin
    Sun, Jinsheng
    [J]. DIGITAL FORENSICS AND WATERMARKING, IWDW 2021, 2022, 13180 : 72 - 86
  • [22] An approach to expose dithering-based JPEG anti-forensics
    Bhardwaj, Dinesh
    Pankajakshan, Vinod
    [J]. FORENSIC SCIENCE INTERNATIONAL, 2021, 328
  • [23] UNDETECTABLE IMAGE TAMPERING THROUGH JPEG COMPRESSION ANTI-FORENSICS
    Stamm, Matthew C.
    Tjoa, Steven K.
    Lin, W. Sabrina
    Liu, K. J. Ray
    [J]. 2010 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING, 2010, : 2109 - 2112
  • [24] Countering JPEG anti-forensics based on noise level estimation
    Zeng, Hui
    Yu, Jingjing
    Kang, Xiangui
    Lyu, Siwei
    [J]. SCIENCE CHINA-INFORMATION SCIENCES, 2018, 61 (03) : 29 - 42
  • [27] Anti-forensics of double JPEG compression with the same quantization matrix
    Li, Haodong
    Luo, Weiqi
    Huang, Jiwu
    [J]. MULTIMEDIA TOOLS AND APPLICATIONS, 2015, 74 (17) : 6729 - 6744
  • [29] A Framework for Detecting Anti-forensics in Cloud Environment
    Rani, Deevi Radha
    Kumari, G. Geetha
    [J]. 2016 IEEE INTERNATIONAL CONFERENCE ON COMPUTING, COMMUNICATION AND AUTOMATION (ICCCA), 2016, : 1277 - 1280
  • [30] Anti-Forensics for Face Swapping Videos via Adversarial Training
    Ding, Feng
    Zhu, Guopu
    Li, Yingcan
    Zhang, Xinpeng
    Atrey, Pradeep K.
    Lyu, Siwei
    [J]. IEEE TRANSACTIONS ON MULTIMEDIA, 2022, 24 : 3429 - 3441