Parameter identifiability of a deep feedforward ReLU neural network

Cited by: 0
Authors
Joachim Bona-Pellissier
François Bachoc
François Malgouyres
Affiliations
[1] Université de Toulouse, Institut de Mathématiques de Toulouse, UMR 5219
[2] CNRS
[3] UPS IMT
Source
Machine Learning | 2023, Vol. 112(11)
Keywords
ReLU networks; Equivalent parameters; Symmetries; Parameter recovery; Deep learning
DOI: not available
Abstract
Whether the parameters (weights and biases) of a neural network can be recovered from knowledge of the function it implements on a subset of the input space is, depending on the situation, a curse or a blessing. On the one hand, recovering the parameters enables stronger adversarial attacks and could disclose sensitive information about the dataset used to construct the network. On the other hand, if the parameters of a network can be recovered, the user is guaranteed that the features in the latent spaces can be interpreted, and recovery also provides a foundation for obtaining formal guarantees on the performance of the network. It is therefore important to characterize the networks whose parameters can be identified and those whose parameters cannot. In this article, we provide a set of conditions on a deep fully-connected feedforward ReLU neural network under which the parameters of the network are uniquely identified, modulo permutation and positive rescaling, from the function it implements on a subset of the input space.
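To make the permutation and positive-rescaling symmetries mentioned in the abstract concrete, the sketch below (illustrative Python, not taken from the paper; all function and variable names are ours) checks numerically that rescaling a hidden neuron by c > 0, or permuting the neurons of a hidden layer, leaves the function implemented by a small fully-connected ReLU network unchanged. It relies only on the positive homogeneity of ReLU: relu(c * t) = c * relu(t) for c > 0.

```python
# Illustrative sketch of the parameter symmetries of ReLU networks;
# names and architecture are ours, not from the paper.
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

def forward(x, params):
    """Evaluate a fully-connected feedforward ReLU network.

    params is a list of (W, b) pairs; ReLU is applied after every
    layer except the last one.
    """
    h = x
    for W, b in params[:-1]:
        h = relu(W @ h + b)
    W, b = params[-1]
    return W @ h + b

# A small network with two hidden layers and random parameters.
params = [
    (rng.normal(size=(5, 3)), rng.normal(size=5)),
    (rng.normal(size=(4, 5)), rng.normal(size=4)),
    (rng.normal(size=(2, 4)), rng.normal(size=2)),
]
(W1, b1), (W2, b2), (W3, b3) = params

# Positive rescaling: multiply the incoming weights and bias of hidden
# neuron i in layer 1 by c > 0 and divide its outgoing weights by c.
# Since relu(c * t) = c * relu(t) for c > 0, the function is unchanged.
c, i = 3.7, 2
W1s, b1s, W2s = W1.copy(), b1.copy(), W2.copy()
W1s[i, :] *= c
b1s[i] *= c
W2s[:, i] /= c
rescaled = [(W1s, b1s), (W2s, b2), (W3, b3)]

# Permutation: reorder the neurons of layer 1 and permute the outgoing
# weights accordingly; again the implemented function is identical.
perm = rng.permutation(5)
permuted = [(W1[perm, :], b1[perm]), (W2[:, perm], b2), (W3, b3)]

x = rng.normal(size=3)
print(np.allclose(forward(x, params), forward(x, rescaled)))  # True
print(np.allclose(forward(x, params), forward(x, permuted)))  # True
```

The paper's identifiability result says these two operations are, under its conditions, the only sources of ambiguity: any two parameter vectors implementing the same function on the given input subset are related by a permutation and a positive rescaling of this kind.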
Pages: 4431–4493
Page count: 62
Related papers (50 records in total)
  • [1] Local Identifiability of Deep ReLU Neural Networks: the Theory
    Bona-Pellissier, Joachim; Malgouyres, François; Bachoc, François
    Advances in Neural Information Processing Systems 35 (NeurIPS 2022), 2022
  • [2] Constructive Deep ReLU Neural Network Approximation
    Herrmann, Lukas; Opschoor, Joost A. A.; Schwab, Christoph
    Journal of Scientific Computing, 2022, 90(2)
  • [3] Integrating geometries of ReLU feedforward neural networks
    Liu, Yajing; Caglar, Turgay; Peterson, Christopher; Kirby, Michael
    Frontiers in Big Data, 2023, 6
  • [4] Learning algorithm analysis for deep neural network with ReLU activation functions
    Placzek, Stanislaw; Placzek, Aleksander
    Computer Applications in Electrical Engineering (ZKWE'2018), 2018, 19
  • [5] A Dynamic ReLU on Neural Network
    Si, Jiong; Harris, Sarah L.; Yfantis, Evangelos
    Proceedings of the 2018 IEEE 13th Dallas Circuits and Systems Conference (DCAS), 2018
  • [6] Order-parameter evolution in a feedforward neural network
    Wong, K. Y. M.; Campbell, C.; Sherrington, D.
    Journal of Physics A: Mathematical and General, 1995, 28(6): 1603–1614
  • [7] Deep Neural Network Based Predistorter with ReLU Activation for Doherty Power Amplifiers
    Hongyo, Reina; Egashira, Yoshimasa; Yamaguchi, Keiichi
    2018 Asia-Pacific Microwave Conference Proceedings (APMC), 2018: 959–961
  • [8] Deep ReLU neural network approximation in Bochner spaces and applications to parametric PDEs
    Dung, Dinh; Nguyen, Van Kien; Pham, Duong Thanh
    Journal of Complexity, 2023, 79