Unboundedness of Linear Regions of Deep ReLU Neural Networks

Cited by: 0
Authors
Ponomarchuk, Anton [1 ]
Koutschan, Christoph [1 ]
Moser, Bernhard [2 ]
Affiliations
[1] OAW, Johann Radon Inst Computat & Appl Math RICAM, Linz, Austria
[2] Software Competence Ctr Hagenberg SCCH, Hagenberg, Austria
Keywords
Neural network; Unbounded polytope; Linear programming; ReLU activation function
DOI
10.1007/978-3-031-14343-4_1
Chinese Library Classification
TP [Automation Technology; Computer Technology]
Subject Classification Code
0812
Abstract
Recent work on adversarial attacks against ReLU neural networks has shown that unbounded regions, as well as regions of sufficiently large volume, are prone to containing adversarial samples. Finding the representation of linear regions and identifying their properties are challenging tasks. In practice, one works with deep neural networks and high-dimensional input data, which leads to polytopes represented by an extensive number of inequalities and hence demands high computational resources. Any approach must therefore be scalable, feasible, and numerically stable. We discuss an algorithm that finds the H-representation of each linear region of a neural network and determines whether the region is bounded.
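The boundedness test mentioned in the abstract can be sketched with linear programming: a nonempty polytope given in H-representation {x : Ax ≤ b} is unbounded exactly when its recession cone {d : Ad ≤ 0} contains a nonzero direction. The snippet below is a minimal illustration of that general LP-based test using SciPy, not the authors' exact algorithm; the function name and tolerance are assumptions.

```python
import numpy as np
from scipy.optimize import linprog


def is_unbounded(A, b):
    """Check whether the polytope {x : A x <= b} is unbounded.

    A nonempty polyhedron is unbounded iff its recession cone
    {d : A d <= 0} contains a nonzero direction d. We search for
    such a d with 2n small LPs, boxing d into [-1, 1]^n so that
    each LP stays bounded; since the cone is invariant under
    positive scaling, the box loses no directions.

    Note: b is accepted for interface symmetry only; boundedness
    of a nonempty polytope depends solely on A.
    """
    m, n = A.shape
    for i in range(n):
        for sign in (1.0, -1.0):
            c = np.zeros(n)
            c[i] = -sign  # linprog minimizes, so negate to maximize sign * d_i
            res = linprog(c, A_ub=A, b_ub=np.zeros(m),
                          bounds=[(-1, 1)] * n, method="highs")
            if res.status == 0 and -res.fun > 1e-9:
                return True  # found d != 0 with A d <= 0
    return False
```

For example, the half-plane {x : -x_1 ≤ 0} is reported as unbounded, while the unit box {x : ±x_i ≤ 1} is reported as bounded. In practice the regions come from the active ReLU patterns of a trained network, so A can have many rows; each individual LP here is still small (n variables).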
Pages: 3-10 (8 pages)
Related Papers
50 in total
  • [1] RELU DEEP NEURAL NETWORKS AND LINEAR FINITE ELEMENTS
    He, Juncai
    Li, Lin
    Xu, Jinchao
    Zheng, Chunyue
    [J]. JOURNAL OF COMPUTATIONAL MATHEMATICS, 2020, 38 (03) : 502 - 527
  • [2] On the Number of Linear Regions of Deep Neural Networks
    Montufar, Guido
    Pascanu, Razvan
    Cho, Kyunghyun
    Bengio, Yoshua
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 27 (NIPS 2014), 2014, 27
  • [3] Locally linear attributes of ReLU neural networks
    Sattelberg, Ben
    Cavalieri, Renzo
    Kirby, Michael
    Peterson, Chris
    Beveridge, Ross
    [J]. FRONTIERS IN ARTIFICIAL INTELLIGENCE, 2023, 6
  • [4] Approximation in Lp(μ) with deep ReLU neural networks
    Voigtlaender, Felix
    Petersen, Philipp
    [J]. 2019 13TH INTERNATIONAL CONFERENCE ON SAMPLING THEORY AND APPLICATIONS (SAMPTA), 2019
  • [5] Bounding and Counting Linear Regions of Deep Neural Networks
    Serra, Thiago
    Tjandraatmadja, Christian
    Ramalingam, Srikumar
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 80, 2018, 80
  • [6] A generative model for fBm with deep ReLU neural networks
    Allouche, Michaël
    Girard, Stéphane
    Gobet, Emmanuel
    [J]. JOURNAL OF COMPLEXITY, 2022, 73
  • [7] On a Fitting of a Heaviside Function by Deep ReLU Neural Networks
    Hagiwara, Katsuyuki
    [J]. NEURAL INFORMATION PROCESSING (ICONIP 2018), PT I, 2018, 11301 : 59 - 69
  • [8] Provable Robustness of ReLU networks via Maximization of Linear Regions
    Croce, Francesco
    Andriushchenko, Maksym
    Hein, Matthias
    [J]. 22ND INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 89, 2019, 89
  • [9] Local Identifiability of Deep ReLU Neural Networks: the Theory
    Bona-Pellissier, Joachim
    Malgouyres, Francois
    Bachoc, Francois
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022