LinSyn: Synthesizing Tight Linear Bounds for Arbitrary Neural Network Activation Functions

Cited by: 6
Authors
Paulsen, Brandon [1]
Wang, Chao [1]
Affiliations
[1] University of Southern California, Los Angeles, CA 90089, USA
Funding
U.S. National Science Foundation
DOI
10.1007/978-3-030-99524-9_19
Chinese Library Classification (CLC)
TP31 [Computer Software]
Discipline Classification Codes
081202; 0835
Abstract
The most scalable approaches to certifying neural network robustness depend on computing sound linear lower and upper bounds for the network's activation functions. Current approaches are limited in that the linear bounds must be handcrafted by an expert, and can be sub-optimal, especially when the network's architecture composes operations using, for example, multiplication, as in LSTMs and the recently popular Swish activation. The dependence on an expert prevents the application of robustness certification to new developments in state-of-the-art activation functions, and furthermore the lack of tightness guarantees may give a false sense of insecurity about a particular model. To the best of our knowledge, we are the first to consider the problem of automatically synthesizing tight linear bounds for arbitrary n-dimensional activation functions. We propose the first fully automated method that achieves tight linear bounds while leveraging only the mathematical definition of the activation function itself. Our method uses an efficient heuristic technique to synthesize bounds that are tight and usually sound, and then verifies the soundness (and adjusts the bounds if necessary) using the highly optimized branch-and-bound SMT solver dReal. Even though our method depends on an SMT solver, we show that the runtime is reasonable in practice and that, compared with the state of the art, our method often achieves 2-5X tighter final output bounds and more than quadruple the certified robustness.
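The two-phase approach described in the abstract can be illustrated with a deliberately simplified, one-dimensional sketch for the Swish activation. The least-squares fit and grid-based downward shift below are our own illustrative assumptions, as are all function names; they are a stand-in for the paper's synthesis phase, which handles arbitrary n-dimensional activation functions.

```python
# Phase 1 (heuristic synthesis) -- a minimal 1-D sketch for Swish,
# swish(x) = x * sigmoid(x). The fitting strategy here is illustrative,
# not the paper's actual synthesis algorithm.
import numpy as np

def swish(x):
    """Swish activation: x * sigmoid(x)."""
    return x / (1.0 + np.exp(-x))

def synthesize_lower_bound(l, u, n_samples=200):
    """Fit a candidate linear lower bound a*x + b to Swish on [l, u],
    then shift it down by the worst violation seen on a dense grid so
    that it is tight and *usually* sound (but not yet verified)."""
    xs = np.linspace(l, u, n_samples)
    a, b = np.polyfit(xs, swish(xs), deg=1)     # slope, intercept
    grid = np.linspace(l, u, 20 * n_samples)
    worst = np.max(a * grid + b - swish(grid))  # > 0: violation on grid
    return a, b - max(worst, 0.0)
```

The second phase checks soundness with dReal. The sketch below assumes the `dreal` Python bindings are installed and that they expose `exp` (dReal's term language supports it); `CheckSatisfiability` returns a counterexample box when the violation formula is delta-satisfiable, and None when it is unsatisfiable.

```python
# Phase 2 (soundness verification) -- assumes the `dreal` Python
# bindings; `is_sound_lower_bound` is a hypothetical helper name.
from dreal import Variable, And, CheckSatisfiability, exp

def is_sound_lower_bound(a, b, l, u, delta=1e-4):
    """Ask dReal whether any x in [l, u] violates a*x + b <= swish(x).
    None from CheckSatisfiability means no violation exists, i.e. the
    candidate lower bound is sound on [l, u]."""
    x = Variable("x")
    swish_x = x / (1 + exp(-x))                 # x * sigmoid(x)
    violation = And(l <= x, x <= u, a * x + b > swish_x)
    return CheckSatisfiability(violation, delta) is None
```

If dReal returns a counterexample box, one would lower the offset b and re-check, mirroring the "adjusts the bounds if necessary" step the abstract describes.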
Pages: 357-376
Number of pages: 20