A hybrid-Sudoku based fragile watermarking scheme for image tampering detection

Cited by: 13
Authors
Su, Guo-Dong [1 ,2 ]
Chang, Chin-Chen [2 ]
Chen, Chih-Cheng [3 ]
Affiliations
[1] Fujian Prov Univ, Fujian Polytech Normal Univ, Engn Res Ctr ICH Digitalizat & Multisource Inform, Fuzhou 350300, Peoples R China
[2] Feng Chia Univ, Dept Informat Engn & Comp Sci, 100 Wenhwa Rd, Taichung 40724, Taiwan
[3] Jimei Univ, Sch Informat Engn, Xiamen 361021, Peoples R China
Keywords
Fragile watermarking; Hybrid-Sudoku; Double perspective; Tampering detection; Image quality;
DOI
10.1007/s11042-020-10451-1
CLC Number
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
Protection of intellectual property rights has become a focus of social concern. To reduce disputes, a watermark is embedded into the media by the content owner to assert original copyright. A high embedding payload enables accurate tampering localization but may degrade the visual quality of the watermarked image, so accurate tampering localization and high watermarked image quality appear contradictory. To address this, this paper proposes a hybrid-Sudoku based fragile watermarking scheme for digital image tampering detection. The hybrid-Sudoku based scheme provides a double-perspective mechanism for embedding the watermark. In the first perspective of the Sudoku, the watermark is virtually embedded into each pixel pair, yielding temporary coordinate information. This temporary coordinate information is then concealed in the cover image based on the second perspective of the Sudoku. Finally, several common attacks are used to evaluate the tampering detection performance of the proposed method. Experimental results show that the proposed scheme achieves considerable accuracy in tampering localization while providing satisfactory visual quality of the watermarked image, with a PSNR of 47.80 dB. Additionally, the design enhances the security and imperceptibility of the hidden watermark.
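The abstract does not spell out the full construction, but the classic Sudoku reference-matrix embedding that such schemes build on can be sketched as follows. This is a minimal illustration, not the paper's exact hybrid double-perspective method: the particular Sudoku solution, the 256x256 tiled matrix layout, and the 9x9 search window are assumptions for the sketch.

```python
# A valid 9x9 Sudoku solution: every row, column, and 3x3 box
# contains the digits 0..8 exactly once. (An illustrative choice;
# the actual solution acts as a secret key shared by both sides.)
SUDOKU = [
    [0, 1, 2, 3, 4, 5, 6, 7, 8],
    [3, 4, 5, 6, 7, 8, 0, 1, 2],
    [6, 7, 8, 0, 1, 2, 3, 4, 5],
    [1, 2, 0, 4, 5, 3, 7, 8, 6],
    [4, 5, 3, 7, 8, 6, 1, 2, 0],
    [7, 8, 6, 1, 2, 0, 4, 5, 3],
    [2, 0, 1, 5, 3, 4, 8, 6, 7],
    [5, 3, 4, 8, 6, 7, 2, 0, 1],
    [8, 6, 7, 2, 0, 1, 5, 3, 4],
]

# Tile the solution into a 256x256 reference matrix M, so that every
# 8-bit pixel pair (p1, p2) indexes a base-9 digit M[p1][p2].
M = [[SUDOKU[i % 9][j % 9] for j in range(256)] for i in range(256)]

def embed(p1, p2, digit):
    """Hide one base-9 watermark digit in the pixel pair (p1, p2):
    return the closest coordinates (x, y) with M[x][y] == digit."""
    # Clamp a 9x9 search window inside the matrix; any 9 consecutive
    # entries of a tiled row contain each digit exactly once, so a
    # match always exists inside the window.
    x0 = min(max(p1 - 4, 0), 256 - 9)
    y0 = min(max(p2 - 4, 0), 256 - 9)
    candidates = [(x, y) for x in range(x0, x0 + 9)
                  for y in range(y0, y0 + 9) if M[x][y] == digit]
    # Pick the candidate that distorts the original pair least.
    return min(candidates, key=lambda c: (c[0] - p1) ** 2 + (c[1] - p2) ** 2)

def extract(q1, q2):
    """Recover the hidden digit from a watermarked pixel pair."""
    return M[q1][q2]
```

Extraction needs only the shared Sudoku solution and a single table lookup, and each embedded digit moves a pixel pair by at most a few gray levels, which is why such schemes keep the watermarked-image PSNR high. The paper's "double perspective" variant chains two such lookups: the first produces the temporary coordinates, which the second conceals in the cover image.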
Pages: 12881 - 12903
Page count: 23
Related Papers
50 records in total
  • [21] Image authentication scheme research based on fragile watermarking
    School of Computer Science and Technology, Harbin Engineering University, Harbin 150001, China
[J]. Tien Tzu Hsueh Pao, 2007, (1) : 34 - 39
  • [22] A novel fragile watermarking scheme for image tamper detection and recovery
    Zhu, Shaomin
    Liu, Jianming
    [J]. CHINESE OPTICS LETTERS, 2010, 8 (07) : 661 - 665
  • [24] Self Embedding Fragile Watermarking for Image Tampering Detection and Image Recovery using Self Recovery Blocks
    Dhole, Vinayak S.
    Patil, Nitin N.
    [J]. 1ST INTERNATIONAL CONFERENCE ON COMPUTING COMMUNICATION CONTROL AND AUTOMATION ICCUBEA 2015, 2015, : 752 - 757
  • [25] Secure private fragile watermarking scheme with improved tampering localisation accuracy
    Bravo-Solorio, S.
    Gan, L.
    Nandi, A. K.
    Aburdene, M. F.
    [J]. IET INFORMATION SECURITY, 2010, 4 (03) : 137 - 148
  • [26] A fragile watermarking scheme for medical image
    Wang Gang
    Rao Ni-Ni
    [J]. 2005 27TH ANNUAL INTERNATIONAL CONFERENCE OF THE IEEE ENGINEERING IN MEDICINE AND BIOLOGY SOCIETY, VOLS 1-7, 2005, : 3406 - 3409
  • [27] Self-Embedding Fragile Watermarking Scheme to Detect Image Tampering Using AMBTC and OPAP Approaches
    Kim, Cheonshik
    Yang, Ching-Nung
    [J]. APPLIED SCIENCES-BASEL, 2021, 11 (03) : 1 - 21
  • [28] Fragile watermarking scheme for image authentication
    Lu, HT
    Shen, RM
    Chung, FL
    [J]. ELECTRONICS LETTERS, 2003, 39 (12) : 898 - 900
  • [29] Fragile Watermarking Scheme for Image Authentication
    Betancourth, Gerardo Pineda
    [J]. 2012 5TH INTERNATIONAL CONFERENCE ON HUMAN SYSTEM INTERACTIONS (HSI 2012), 2012, : 168 - 174
  • [30] Dual image-based reversible fragile watermarking scheme for tamper detection and localization
    Sahu, Aditya Kumar
    Sahu, Monalisa
    Patro, Pramoda
    Sahu, Gupteswar
    Nayak, Soumya Ranjan
    [J]. PATTERN ANALYSIS AND APPLICATIONS, 2023, 26 (02) : 571 - 590