A stable GAN for image steganography with multi-order feature fusion

Cited: 5
Authors
Zhao, Junfeng [1 ]
Wang, Shen [1 ]
Affiliations
[1] Harbin Inst Technol, Sch Cyberspace Sci, Harbin 150000, Heilongjiang, Peoples R China
Source
NEURAL COMPUTING & APPLICATIONS | 2022, Vol. 34, Issue 18
Keywords
Steganography; Steganalysis; GAN; Model pruning; STEGANALYSIS; FRAMEWORK;
DOI
10.1007/s00521-022-07270-w
CLC Number
TP18 [Artificial intelligence theory];
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Content-adaptive automatic cost learning frameworks for image steganography based on deep learning can generate a more exquisite embedding probability map within a short time; such methods have achieved remarkable security performance compared with conventional handcrafted methods. However, some issues in spatial-domain deep steganography remain unaddressed: (1) the key points in the design of the generator have not been clearly identified; and (2) existing methods suffer from unstable training due to the vanishing gradient problem. To investigate these issues, this paper proposes a stable GAN (generative adversarial network) for image steganography called UMC-GAN, which presents a redesigned and adjustable nested U-shape generator and utilizes deep supervision to fuse multiple embedding probability maps to improve security performance. A novel linear-clipped embedding simulator is designed to alleviate the vanishing gradient problem in the staircase regions. Extensive experiments and ablation studies show that the proposed method outperforms existing GAN-based automatic cost learning embedding frameworks, and it can be applied at high resolution through flexible adjustment of the generator. The design of the generator is further investigated through model pruning, which shows that in-depth features should be captured for deep steganography to ensure security performance.
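The abstract's "linear-clipped embedding simulator" refers to replacing the non-differentiable ternary embedding staircase with a surrogate that keeps gradients alive near the step regions. The sketch below is only an illustration of that idea, not the authors' formulation: the function name, the ramp width, and the specific linear-clipped form are assumptions, with the widely used double-tanh surrogate from earlier GAN steganography work included for comparison.

```python
import torch

def ternary_embedding_simulator(p, n, mode="linear_clip", scale=1000.0):
    """
    Differentiable approximation of the ternary embedding staircase.

    p : embedding change probability map in (0, 0.5), one value per pixel
        (illustrative; the generator would output this map).
    n : random noise uniform in (0, 1), same shape as p.
    Returns a modification map m with values approximately in {-1, 0, +1}.

    The hard staircase is
        m = -1 if n < p/2,  +1 if n > 1 - p/2,  0 otherwise,
    which has zero gradient almost everywhere.  The double-tanh surrogate
    smooths it but saturates quickly near the steps; the "linear_clip"
    branch is a hypothetical linearly clipped variant, not the paper's
    exact design.
    """
    if mode == "double_tanh":
        # Smooth surrogate: tanh saturates, so gradients vanish near the steps.
        return -0.5 * torch.tanh(scale * (p - 2.0 * n)) \
               + 0.5 * torch.tanh(scale * (p - 2.0 * (1.0 - n)))
    # Hypothetical linear-clipped surrogate: piecewise-linear ramps around the
    # two step locations keep a non-zero gradient inside the ramp width.
    ramp = 0.02  # width of the linear transition region (illustrative value)
    minus = torch.clamp((p / 2.0 - n) / ramp + 0.5, 0.0, 1.0)          # -> 1 when n << p/2
    plus = torch.clamp((n - (1.0 - p / 2.0)) / ramp + 0.5, 0.0, 1.0)   # -> 1 when n >> 1 - p/2
    return plus - minus


# Usage: simulate stego modifications for a batch of probability maps and
# verify that gradients flow back to the probability map.
p = torch.rand(4, 1, 256, 256) * 0.5   # fake probability maps in (0, 0.5)
p.requires_grad_(True)
n = torch.rand_like(p)
m = ternary_embedding_simulator(p, n)
m.sum().backward()
print(m.min().item(), m.max().item(), p.grad.abs().mean().item())
```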
Pages: 16073-16088
Number of pages: 16
Related Papers
50 records in total
  • [21] Multi-Order Neighborhood Fusion Based Multi-View Deep Subspace Clustering
    Zhou, Kai
    Bai, Yanan
    Hu, Yongli
    Wang, Boyue
    CMC-COMPUTERS MATERIALS & CONTINUA, 2025, 82 (03): : 3873 - 3890
  • [22] Incomplete Multi-view Clustering Algorithm Based on Multi-order Neighborhood Fusion
    Liu X.-L.
    Bai L.
    Zhao X.-W.
    Liang J.-Y.
    Ruan Jian Xue Bao/Journal of Software, 2022, 33 (04): : 1354 - 1372
  • [23] Feature optimization based on multi-order fusion and adaptive recursive elimination for motion classification in Doppler radar
    Tong Sun
    Yipeng Ding
    Yuxin Chen
    Lv Ping
    Applied Intelligence, 2025, 55 (7)
  • [24] Multi-Order Feature Statistical Model for Fine-Grained Visual Categorization
    Wang, Qingtao
    Zhang, Ke
    Fan, Jin
    Huang, Shaoli
    Zhang, Lianbo
    2020 25TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2021, : 7379 - 7386
  • [25] Intensive Multi-order Feature Extraction for Incipient Fault Detection of Inverter System
    Wang, Min
    Cheng, Feiyang
    Xie, Min
    Qiu, Gen
    Zhang, Jingxin
    IEEE Transactions on Power Electronics, 2024,
  • [26] Augmented Image Retrieval using Multi-Order Object Layout with Attributes
    Cao, Xiaochun
    Wei, Xingxing
    Guo, Xiaojie
    Han, Yahong
    Tang, Jinhui
    PROCEEDINGS OF THE 2014 ACM CONFERENCE ON MULTIMEDIA (MM'14), 2014, : 1093 - 1096
  • [27] MULTI-ORDER ADVERSARIAL REPRESENTATION LEARNING FOR COMPOSED QUERY IMAGE RETRIEVAL
    Fu, Zhixiao
    Chen, Xinyuan
    Dong, Jianfeng
    Ji, Shouling
    2021 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP 2021), 2021, : 1685 - 1689
  • [28] VidaGAN: Adaptive GAN for image steganography
    Ramandi, Vida Yousefi
    Fateh, Mansoor
    Rezvani, Mohsen
    IET IMAGE PROCESSING, 2024, 18 (12) : 3329 - 3342
  • [29] MULTI-ORDER MICRODIFFUSION ANALYSIS
    ISHIHARA, H
    ISHIZAKA, O
    CHEMICAL & PHARMACEUTICAL BULLETIN, 1968, 16 (12) : 2524 - +
  • [30] NATO in the multi-order world
    Flockhart, Trine
    INTERNATIONAL AFFAIRS, 2024, 100 (02) : 471 - 489