Attacking a Joint Protection Scheme for Deep Neural Network Hardware Accelerators and Models

Cited by: 0
Authors
Wilhelmstaetter, Simon [1 ]
Conrad, Joschua [1 ]
Upadhyaya, Devanshi [2 ]
Polian, Ilia [2 ]
Ortmanns, Maurits [1 ]
Affiliations
[1] Univ Ulm, Inst Microelect, Albert Einstein Allee 43, Ulm, Germany
[2] Univ Stuttgart, Inst Comp Architecture & Comp Engn, Pfaffenwaldring 47, Stuttgart, Germany
Funding
National Science Foundation (USA)
Keywords
Neural Network; Inference; Accelerator; Logic Locking; Security; Protection Scheme;
DOI
10.1109/AICAS59952.2024.10595935
CLC Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
The tremendous success of artificial neural networks (NNs) in recent years, paired with the proliferation of embedded, low-power devices (e.g., IoT nodes, wearables, and smart sensors), gave rise to specialized NN accelerators that enable NN inference in power-constrained environments. However, manufacturing or operating such accelerators in untrusted environments poses risks of model theft and hardware counterfeiting. One way to protect NN hardware against these threats is to lock both the model and the accelerator with secret keys that can only be supplied by entitled authorities (e.g., the chip designer or distributor). However, current locking mechanisms have severe drawbacks, such as requiring model retraining and being vulnerable to the powerful satisfiability-checking (SAT) attack. Recently, an approach for jointly protecting the model and the accelerator was proposed. Compared to previous locking mechanisms, it promises to avoid model retraining, not to leak useful model information, and to resist the SAT attack, thereby securing the NN accelerator against counterfeiting and the model against intellectual-property infringement. In this paper, those claims are thoroughly evaluated and severe issues in the technical evidence are identified. Furthermore, an attack is developed that does not require an expanded threat model but is still able to completely circumvent all of the proposed protection schemes. It allows the reconstruction of all NN model parameters (i.e., model theft) and enables hardware counterfeiting.
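The abstract's notion of locking a model with a secret key can be illustrated with a minimal sketch. All names and the XOR-keystream construction below are assumptions for illustration only, not the scheme evaluated in the paper: quantized int8 weights are masked with a key-derived pseudo-random byte stream, so only the correct key restores the original parameters.

```python
import numpy as np

def _keystream(key: int, shape) -> np.ndarray:
    """Derive a deterministic byte mask from the secret key (illustrative)."""
    rng = np.random.default_rng(key)
    return rng.integers(0, 256, size=shape, dtype=np.uint8)

def lock_weights(weights: np.ndarray, key: int) -> np.ndarray:
    """XOR-mask int8 weights; the locked values are useless without the key."""
    return (weights.view(np.uint8) ^ _keystream(key, weights.shape)).view(np.int8)

def unlock_weights(locked: np.ndarray, key: int) -> np.ndarray:
    """XOR is its own inverse: applying the same mask recovers the weights."""
    return lock_weights(locked, key)

w = np.array([3, -7, 42, 0], dtype=np.int8)
locked = lock_weights(w, key=0xC0FFEE)
restored = unlock_weights(locked, key=0xC0FFEE)
assert np.array_equal(restored, w)
```

Such a masking scheme only protects the weights as long as the key never reaches an attacker and the keystream cannot be inferred from input/output behavior; the paper's contribution is precisely an attack showing that the proposed joint protection fails under its own stated threat model.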
Pages: 144 - 148
Page count: 5