Probabilistic Lipschitz Analysis of Neural Networks

Cited by: 6
Authors
Mangal, Ravi [1]
Sarangmath, Kartik [1]
Nori, Aditya V. [2]
Orso, Alessandro [1]
Affiliations
[1] Georgia Inst Technol, Atlanta, GA 30332 USA
[2] Microsoft Res, Cambridge CB1 2FB, England
Source
STATIC ANALYSIS (SAS 2020) | 2020, Vol. 12389
Keywords
VOLUME; CORRECTNESS; COMPLEXITY;
DOI
10.1007/978-3-030-65474-0_13
Chinese Library Classification (CLC)
TP31 [Computer Software];
Discipline Classification Code
081202; 0835;
Abstract
We are interested in algorithmically proving the robustness of neural networks. Notions of robustness have been discussed in the literature; we are interested in probabilistic notions of robustness that assume it is feasible to construct a statistical model of the process generating the inputs of a neural network. We find this a reasonable assumption given the rapid advances in algorithms for learning generative models of data. A neural network f is then defined to be probabilistically robust if, for a randomly generated pair of inputs, f is likely to demonstrate k-Lipschitzness, i.e., the distance between the outputs computed by f is upper-bounded by the k-th multiple of the distance between the pair of inputs. We name this property probabilistic Lipschitzness. We model generative models and neural networks, together, as programs in a simple, first-order, imperative, probabilistic programming language, pcat. Inspired by a large body of existing literature, we define a denotational semantics for this language. Then we develop a sound local Lipschitzness analysis for cat, a non-probabilistic sublanguage of pcat. This analysis can compute an upper bound of the "Lipschitzness" of a neural network in a bounded region of the input set. We next present a provably correct algorithm, PROLIP, that analyzes the behavior of a neural network in a user-specified box-shaped input region and computes (i) lower bounds on the probabilistic mass of such a region with respect to the generative model and (ii) upper bounds on the Lipschitz constant of the neural network in this region, with the help of the local Lipschitzness analysis. Finally, we present a sketch of a proof-search algorithm that uses PROLIP as a primitive for finding proofs of probabilistic Lipschitzness. We implement the PROLIP algorithm and empirically evaluate its computational complexity.
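For readability, the probabilistic Lipschitzness property described in the abstract can be sketched as the condition below, where G denotes the generative model producing pairs of inputs, k the Lipschitz bound, and 1 - \epsilon an assumed probability threshold standing in for "likely"; this is a reconstruction from the abstract's prose, and the paper's own notation may differ:

\[ \Pr_{(x_1, x_2) \sim G} \big[ \, \lVert f(x_1) - f(x_2) \rVert \le k \cdot \lVert x_1 - x_2 \rVert \, \big] \;\ge\; 1 - \epsilon \]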
Pages: 274 - 309
Number of pages: 36
Related Papers
50 records in total
  • [1] Robust Graph Neural Networks via Probabilistic Lipschitz Constraints
    Arghal, Raghu
    Lei, Eric
    Bidokhti, Shirin Saeedi
    [J]. LEARNING FOR DYNAMICS AND CONTROL CONFERENCE, VOL 168, 2022, 168
  • [2] On the Probabilistic Analysis of Neural Networks
    Pasareanu, Corina
    Converse, Hayes
    Filieri, Antonio
    Gopinath, Divya
    [J]. 2020 IEEE/ACM 15TH INTERNATIONAL SYMPOSIUM ON SOFTWARE ENGINEERING FOR ADAPTIVE AND SELF-MANAGING SYSTEMS, SEAMS, 2020, : 5 - 8
  • [3] Multivariate Lipschitz Analysis of the Stability of Neural Networks
    Gupta, Kavya
    Kaakai, Fateh
    Pesquet-Popescu, Beatrice
    Pesquet, Jean-Christophe
    Malliaros, Fragkiskos D.
    [J]. FRONTIERS IN SIGNAL PROCESSING, 2022, 2
  • [4] Probabilistic Symbolic Analysis of Neural Networks
    Converse, Hayes
    Filieri, Antonio
    Gopinath, Divya
    Pasareanu, Corina S.
    [J]. 2020 IEEE 31ST INTERNATIONAL SYMPOSIUM ON SOFTWARE RELIABILITY ENGINEERING (ISSRE 2020), 2020, : 148 - 159
  • [5] Sequential analysis in Fourier probabilistic neural networks
    Savchenko, Andrey V.
    Belova, Natalya S.
    [J]. EXPERT SYSTEMS WITH APPLICATIONS, 2022, 207
  • [6] Lipschitz regularity of deep neural networks: analysis and efficient estimation
    Scaman, Kevin
    Virmaux, Aladin
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31
  • [7] PROBABILISTIC NEURAL NETWORKS
    SPECHT, DF
    [J]. NEURAL NETWORKS, 1990, 3 (01) : 109 - 118
  • [9] Lipschitz continuous neural networks on Lp
    Fromion, V
    [J]. PROCEEDINGS OF THE 39TH IEEE CONFERENCE ON DECISION AND CONTROL, VOLS 1-5, 2000, : 3528 - 3533
  • [10] Estimation of the Lipschitz norm with neural networks
    Sio, KC
    [J]. NEURAL PROCESSING LETTERS, 1997, 6 (03) : 99 - 108