Structural Watermarking to Deep Neural Networks via Network Channel Pruning

Cited by: 5
Authors
Zhao, Xiangyu [1 ]
Yao, Yinzhe [1 ]
Wu, Hanzhou [1 ]
Zhang, Xinpeng [1 ]
Affiliations
[1] Shanghai Univ, Shanghai 200444, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Watermarking; deep neural networks; ownership protection; deep learning; security;
DOI
10.1109/WIFS53200.2021.9648376
Chinese Library Classification (CLC)
TP [Automation and Computer Technology];
Subject Classification Code
0812;
Abstract
To protect the intellectual property (IP) of deep neural networks (DNNs), many existing DNN watermarking techniques either embed watermarks directly into the DNN parameters or insert backdoor watermarks by fine-tuning the parameters; such techniques, however, cannot resist attacks that remove watermarks by altering the DNN parameters. In this paper, we bypass such attacks by introducing a structural watermarking scheme that uses channel pruning to embed the watermark into the architecture of the host DNN rather than into its parameters. Specifically, during watermark embedding, we prune the internal channels of the host DNN with channel pruning rates controlled by the watermark. During watermark extraction, the watermark is retrieved by identifying the channel pruning rates from the architecture of the target DNN model. Owing to the pruning mechanism, the performance of the DNN on its original task is preserved during watermark embedding. Experimental results show that the proposed scheme enables the embedded watermark to be reliably recovered and provides a sufficient payload without sacrificing the usability of the DNN model. It is also demonstrated that the proposed scheme is robust against common transforms and attacks designed for conventional watermarking approaches.
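The embed/extract mechanism the abstract describes can be sketched in plain Python. This is a hypothetical illustration, not the paper's exact design: the bit-to-rate codebook, the one-bit-per-layer mapping, and all function names below are assumptions introduced for clarity.

```python
# Hypothetical codebook: each watermark bit selects a pruning rate for one
# layer (an illustrative assumption; the paper's actual mapping may differ).
RATE_FOR_BIT = {0: 0.25, 1: 0.50}

def embed(channel_counts, watermark_bits):
    """Prune each layer's channels at a rate that encodes one watermark bit.

    channel_counts: original number of channels per layer.
    watermark_bits: the watermark, one bit per layer.
    Returns the pruned per-layer channel counts (the watermarked architecture).
    """
    pruned = []
    for orig, bit in zip(channel_counts, watermark_bits):
        rate = RATE_FOR_BIT[bit]
        pruned.append(round(orig * (1 - rate)))
    return pruned

def extract(orig_counts, pruned_counts):
    """Recover the watermark by identifying each layer's pruning rate.

    For each layer, compute the observed pruning rate and decode the bit
    whose codebook rate is nearest to it.
    """
    bits = []
    for orig, kept in zip(orig_counts, pruned_counts):
        rate = 1 - kept / orig
        bit = min(RATE_FOR_BIT, key=lambda b: abs(RATE_FOR_BIT[b] - rate))
        bits.append(bit)
    return bits
```

For example, embedding the bits `[1, 0, 1]` into a three-layer network with 64, 128, and 256 channels yields 32, 96, and 128 channels, and `extract` recovers the same bits from the architecture alone, with no access to the parameters.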
Pages: 14-19 (6 pages)
Related Papers
50 records in total
  • [31] Structured Pruning of Deep Convolutional Neural Networks
    Anwar, Sajid
    Hwang, Kyuyeon
    Sung, Wonyong
    ACM JOURNAL ON EMERGING TECHNOLOGIES IN COMPUTING SYSTEMS, 2017, 13 (03)
  • [32] ACP: Automatic Channel Pruning Method by Introducing Additional Loss for Deep Neural Networks
    Yu, Haoran
    Zhang, Weiwei
    Ji, Ming
    Zhen, Chenghui
    Neural Processing Letters, 2023, 55 : 1071 - 1085
  • [33] Fast Convex Pruning of Deep Neural Networks
    Aghasi, Alireza
    Abdi, Afshin
    Romberg, Justin
    SIAM JOURNAL ON MATHEMATICS OF DATA SCIENCE, 2020, 2 (01): : 158 - 188
  • [34] Activation Pruning of Deep Convolutional Neural Networks
    Ardakani, Arash
    Condo, Carlo
    Gross, Warren J.
    2017 IEEE GLOBAL CONFERENCE ON SIGNAL AND INFORMATION PROCESSING (GLOBALSIP 2017), 2017, : 1325 - 1329
  • [35] BNPrune: A Channel Level Pruning Method for Deep Neural Network used of Batch Normalization
    Zhang, Junqi
    Yu, Qinghua
    Peng, Hui
    Zhang, Hui
    2021 6TH IEEE INTERNATIONAL CONFERENCE ON ADVANCED ROBOTICS AND MECHATRONICS (ICARM 2021), 2021, : 273 - 277
  • [36] Deep Neural Network Acceleration Based on Low-Rank Approximated Channel Pruning
    Chen, Zhen
    Chen, Zhibo
    Lin, Jianxin
    Liu, Sen
    Li, Weiping
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS I-REGULAR PAPERS, 2020, 67 (04) : 1232 - 1244
  • [37] Pruning the deep neural network by similar function
    Liu, Hanqing
    Xin, Bo
    Mu, Senlin
    Zhu, Zhangqing
    2018 INTERNATIONAL SYMPOSIUM ON POWER ELECTRONICS AND CONTROL ENGINEERING (ISPECE 2018), 2019, 1187
  • [38] Automated Pruning for Deep Neural Network Compression
    Manessi, Franco
    Rozza, Alessandro
    Bianco, Simone
    Napoletano, Paolo
    Schettini, Raimondo
    2018 24TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2018, : 657 - 664
  • [39] Robust Watermarking for Deep Neural Networks via Bi-level Optimization
    Yang, Peng
    Lao, Yingjie
    Li, Ping
    2021 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2021), 2021, : 14821 - 14830
  • [40] Overview of Deep Convolutional Neural Network Pruning
    Li, Guang
    Liu, Fang
    Xia, Yuping
    2020 INTERNATIONAL CONFERENCE ON IMAGE, VIDEO PROCESSING AND ARTIFICIAL INTELLIGENCE, 2020, 11584