Effective Activation Functions for Homomorphic Evaluation of Deep Neural Networks

Cited by: 28
Authors
Obla, Srinath [1 ]
Gong, Xinghan [1 ]
Aloufi, Asma [1 ]
Hu, Peizhao [1 ]
Takabi, Daniel [2 ]
Affiliations
[1] Rochester Inst Technol, Dept Comp Sci, Rochester, NY 14623 USA
[2] Georgia State Univ, Dept Comp Sci, Atlanta, GA 30302 USA
Source
IEEE ACCESS | 2020, Vol. 8
Keywords
Training; Cryptography; Neurons; Biological neural networks; Task analysis; Approximation methods; Private AI; homomorphic encryption; activation function; deep neural networks;
DOI
10.1109/ACCESS.2020.3017436
Chinese Library Classification: TP [Automation Technology, Computer Technology]
Discipline Classification Code: 0812
Abstract
CryptoNets and subsequent work have demonstrated the capability of homomorphic encryption (HE) in applications of private artificial intelligence (AI). In convolutional neural networks (CNNs), many computations are linear functions, such as the convolution layer, which can be homomorphically evaluated. However, other layers, such as the activation layer, comprise non-linear functions and cannot be homomorphically evaluated. One of the most commonly used workarounds is to approximate these non-linear functions with low-degree polynomials. However, using the approximated polynomials as activation functions introduces errors that can significantly degrade accuracy in classification tasks. In this paper, we present a systematic method to construct HE-friendly activation functions for CNNs. We first determine which properties of a good activation function contribute to performance by analyzing commonly used functions such as Rectified Linear Units (ReLU) and Sigmoid. Then, we compare polynomial approximation methods and search for an optimal approximation range for the polynomial activation. We also propose a novel weighted polynomial approximation method tailored to the output distribution of a batch normalization layer. Finally, we demonstrate the effectiveness of our method on several datasets, including MNIST, FMNIST, and CIFAR-10.
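To illustrate the idea behind the abstract, the following is a minimal NumPy sketch of the general technique, not the paper's implementation: a low-degree least-squares polynomial fit of ReLU (low degree keeps the HE multiplicative depth small), plus a weighted fit that emphasizes inputs near zero to mimic a batch-normalized input distribution. The range [-4, 4], degree 2, and the standard-normal weighting are illustrative assumptions.

```python
import numpy as np

# Sample ReLU over an assumed approximation range [-4, 4].
xs = np.linspace(-4.0, 4.0, 2001)
relu = np.maximum(xs, 0.0)

# Plain least-squares fit of degree 2; the result contains only
# additions and multiplications, so it can be evaluated under HE.
plain = np.polynomial.polynomial.polyfit(xs, relu, deg=2)

# Weighted fit: down-weight the tails with a Gaussian weight, standing in
# for the (assumed standard-normal) output distribution of batch norm.
weights = np.exp(-xs**2 / 2.0)
weighted = np.polynomial.polynomial.polyfit(xs, relu, deg=2, w=weights)

def poly_act(coeffs, x):
    """Evaluate the polynomial activation (HE-friendly: adds/mults only)."""
    return np.polynomial.polynomial.polyval(x, coeffs)

# Compare approximation error: the weighted fit should be tighter near 0,
# where batch-normalized inputs concentrate, at the cost of the tails.
err_plain = np.abs(poly_act(plain, xs) - relu)
err_weighted = np.abs(poly_act(weighted, xs) - relu)
```

The trade-off this sketch exposes is the one the abstract describes: restricting or reweighting the approximation range buys accuracy where inputs actually fall, which is why the paper searches for an optimal range rather than fitting over all of R.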
Pages: 153098-153112 (15 pages)