LinSyn: Synthesizing Tight Linear Bounds for Arbitrary Neural Network Activation Functions

Cited by: 6
Authors
Paulsen, Brandon [1 ]
Wang, Chao [1 ]
Affiliations
[1] Univ Southern Calif, Los Angeles, CA 90089 USA
Funding
U.S. National Science Foundation (NSF);
Keywords
DOI
10.1007/978-3-030-99524-9_19
Chinese Library Classification (CLC)
TP31 [Computer Software];
Discipline Codes
081202; 0835;
Abstract
The most scalable approaches to certifying neural network robustness depend on computing sound linear lower and upper bounds for the network's activation functions. Current approaches are limited in that these linear bounds must be handcrafted by an expert and can be sub-optimal, especially when the network's architecture composes operations using multiplication, as in LSTMs and the recently popular Swish activation. The dependence on an expert prevents robustness certification from keeping pace with the state of the art in activation functions, and the lack of tightness guarantees may give a false sense of insecurity about a particular model. To the best of our knowledge, we are the first to consider the problem of automatically synthesizing tight linear bounds for arbitrary n-dimensional activation functions. We propose the first fully automated method that achieves tight linear bounds while requiring only the mathematical definition of the activation function itself. Our method uses an efficient heuristic to synthesize bounds that are tight and usually sound, and then verifies their soundness (adjusting the bounds if necessary) using the highly optimized branch-and-bound SMT solver dReal. Even though our method depends on an SMT solver, we show that its runtime is reasonable in practice and that, compared with the state of the art, it often achieves 2-5X tighter final output bounds and more than quadruples the certified robustness.
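To make the two-phase idea described in the abstract concrete, below is a minimal Python sketch: a heuristic first proposes linear coefficients for an upper bound of an activation function over an input interval, and a verification step then checks soundness and repairs the bound by shifting its offset. This is an illustration under stated assumptions, not the paper's implementation: the least-squares proposal and the dense-grid check are stand-ins for LinSyn's synthesis heuristic and its dReal-based soundness query, and the function names (propose_upper_bound, verify_and_repair) are hypothetical.

# A minimal sketch of the synthesize-then-verify idea from the abstract.
# The dense grid check below stands in for the sound, delta-complete
# SMT query that the paper discharges with dReal; names are illustrative.
import numpy as np

def swish(x):
    """Swish activation: x * sigmoid(x)."""
    return x / (1.0 + np.exp(-x))

def propose_upper_bound(f, lo, hi, n_samples=64):
    """Phase 1 (heuristic): fit a least-squares line through samples of f,
    then lift it so it sits above every sampled point."""
    xs = np.linspace(lo, hi, n_samples)
    ys = f(xs)
    a, b = np.polyfit(xs, ys, 1)      # slope and intercept of best-fit line
    b += np.max(ys - (a * xs + b))    # shift up to cover all samples
    return a, b

def verify_and_repair(f, a, b, lo, hi, n_grid=100_000, margin=1e-6):
    """Phase 2 (verification): check a*x + b >= f(x) on a dense grid and,
    if violated, raise the offset by the worst violation. LinSyn instead
    uses dReal's branch-and-bound search for a sound check."""
    xs = np.linspace(lo, hi, n_grid)
    violation = np.max(f(xs) - (a * xs + b))
    if violation > 0:
        b += violation + margin
    return a, b

if __name__ == "__main__":
    lo, hi = -3.0, 3.0
    a, b = propose_upper_bound(swish, lo, hi)
    a, b = verify_and_repair(swish, a, b, lo, hi)
    print(f"upper bound on [{lo}, {hi}]: y <= {a:.4f} * x + {b:.4f}")

A sound lower bound is obtained symmetrically by shifting the offset downward; in LinSyn itself, soundness over the entire interval, rather than over a finite grid, is established by the SMT solver's branch-and-bound search.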
Pages: 357 - 376 (20 pages)
Related Papers (50 in total)
  • [41] Neuroevolutionary based convolutional neural network with adaptive activation functions
    ZahediNasab, Roxana
    Mohseni, Hadis
    NEUROCOMPUTING, 2020, 381 : 306 - 313
  • [42] The impact of activation functions on training and performance of a deep neural network
    Marcu, David C.
    Grava, Cristian
    2021 16TH INTERNATIONAL CONFERENCE ON ENGINEERING OF MODERN ELECTRIC SYSTEMS (EMES), 2021, : 126 - 129
  • [43] All-optical neural network with nonlinear activation functions
    Zuo, Ying
    Li, Bohan
    Zhao, Yujun
    Jiang, Yue
    Chen, You-Chiuan
    Chen, Peng
    Jo, Gyu-Boong
    Liu, Junwei
    Du, Shengwang
    OPTICA, 2019, 6 (09) : 1132 - 1137
  • [44] On Neural Network Activation Functions and Optimizers in Relation to Polynomial Regression
    Pomerat, John
    Segev, Aviv
    Datta, Rituparna
    2019 IEEE INTERNATIONAL CONFERENCE ON BIG DATA (BIG DATA), 2019, : 6183 - 6185
  • [45] New activation functions for single layer feedforward neural network
    Koçak, Yılmaz
    Şiray, Gülesen Üstündağ
    EXPERT SYSTEMS WITH APPLICATIONS, 2021, 164
  • [46] Verification of LSTM Neural Networks with Non-linear Activation Functions
    Moradkhani, Farzaneh
    Fibich, Connor
    Fränzle, Martin
    NASA FORMAL METHODS, NFM 2023, 2023, 13903 : 1 - 15
  • [47] Complete Stability of Neural Networks With Nonmonotonic Piecewise Linear Activation Functions
    Nie, Xiaobing
    Zheng, Wei Xing
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS II-EXPRESS BRIEFS, 2015, 62 (10) : 1002 - 1006
  • [48] Stability of Stochastic Recurrent Neural Networks with Positive Linear Activation Functions
    Liao, Wudai
    Yang, Xuezhao
    Wang, Zhongsheng
    ADVANCES IN NEURAL NETWORKS - ISNN 2009, PT 1, PROCEEDINGS, 2009, 5551 : 279 - +
  • [49] Complex backpropagation neural network using elementary transcendental activation functions
    Kim, T
    Adali, T
    2001 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING (ICASSP), VOLS I-VI, PROCEEDINGS, 2001, : 1281 - 1284
  • [50] Rule Extraction from Artificial Neural Network with Optimized Activation Functions
    Wang, Jian-guo
    Yang, Jian-hong
    Zhang, Wen-xing
    Xu, Jin-wu
    2008 3RD INTERNATIONAL CONFERENCE ON INTELLIGENT SYSTEM AND KNOWLEDGE ENGINEERING, VOLS 1 AND 2, 2008, : 873 - +