Parameter identifiability of a deep feedforward ReLU neural network

Cited by: 0
Authors
Joachim Bona-Pellissier
François Bachoc
François Malgouyres
Affiliations
[1] Université de Toulouse, Institut de Mathématiques de Toulouse, UMR5219
[2] CNRS
[3] UPS IMT
Source
Machine Learning | 2023 / Vol. 112
Keywords
ReLU networks; Equivalent parameters; Symmetries; Parameter recovery; Deep learning;
DOI
Not available
Abstract
The possibility of recovering the parameters (weights and biases) of a neural network from knowledge of the function it implements on a subset of the input space can be, depending on the situation, a curse or a blessing. On the one hand, recovering the parameters enables stronger adversarial attacks and could also disclose sensitive information about the dataset used to construct the network. On the other hand, if the parameters of a network can be recovered, the user is guaranteed that the features in the latent spaces can be interpreted. It also provides foundations for obtaining formal guarantees on the performance of the network. It is therefore important to characterize the networks whose parameters can be identified and those whose parameters cannot. In this article, we provide a set of conditions on a deep fully-connected feedforward ReLU neural network under which the parameters of the network are uniquely identified, modulo permutation and positive rescaling, from the function it implements on a subset of the input space.
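The two symmetries the abstract refers to are standard properties of ReLU layers: permuting hidden neurons, or scaling a neuron's incoming weights and bias by c > 0 while dividing its outgoing weights by c, leaves the network function unchanged. A minimal NumPy illustration (the two-layer architecture and sizes here are chosen for the example, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# A small fully-connected ReLU network: x -> W2 @ relu(W1 @ x + b1) + b2,
# with 3 inputs, 5 hidden neurons, and 2 outputs.
W1, b1 = rng.normal(size=(5, 3)), rng.normal(size=5)
W2, b2 = rng.normal(size=(2, 5)), rng.normal(size=2)

def net(x, W1, b1, W2, b2):
    return W2 @ np.maximum(W1 @ x + b1, 0.0) + b2

# Symmetry 1: positive rescaling. Scale hidden neuron i's incoming weights
# and bias by c_i > 0 and its outgoing weights by 1/c_i. Since
# relu(c*t) = c*relu(t) for c > 0, the implemented function is unchanged.
c = rng.uniform(0.5, 2.0, size=5)
W1s, b1s = W1 * c[:, None], b1 * c
W2s = W2 / c[None, :]

# Symmetry 2: permutation. Reorder the hidden neurons (rows of W1, entries
# of b1) and reorder the columns of W2 consistently.
perm = rng.permutation(5)
W1p, b1p, W2p = W1s[perm], b1s[perm], W2s[:, perm]

# The rescaled-and-permuted parameters implement the same function.
x = rng.normal(size=3)
assert np.allclose(net(x, W1, b1, W2, b2), net(x, W1p, b1p, W2p, b2))
```

Identifiability "modulo permutation and positive rescaling" means these are the only ambiguities: under the paper's conditions, any two parameter sets implementing the same function on the given input subset are related by such transformations.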
Pages: 4431–4493
Page count: 62
Related Papers
50 records in total
  • [41] Parameter identification of PEMFC via feedforward neural network-pelican optimization algorithm
    Yang, Bo
    Liang, Boxiao
    Qian, Yucun
    Zheng, Ruyi
    Su, Shi
    Guo, Zhengxun
    Jiang, Lin
    [J]. APPLIED ENERGY, 2024, 361
  • [42] Neural network integral representations with the ReLU activation function
    Petrosyan, Armenak
    Dereventsov, Anton
    Webster, Clayton G.
    [J]. MATHEMATICAL AND SCIENTIFIC MACHINE LEARNING, VOL 107, 2020, 107 : 128 - 143
  • [43] Identification of hybrid orbital angular momentum modes with deep feedforward neural network
    Huang, Zebin
    Wang, Peipei
    Liu, Junmin
    Xiong, Wenjie
    He, Yanliang
    Zhou, Xinxing
    Xiao, Jiangnan
    Li, Ying
    Chen, Shuqing
    Fan, Dianyuan
    [J]. RESULTS IN PHYSICS, 2019, 15
  • [44] Source term inversion of nuclear accident based on deep feedforward neural network
    Cui, Weijie
    Cao, Bo
    Fan, Qingxu
    Fan, Jin
    Chen, Yixue
    [J]. ANNALS OF NUCLEAR ENERGY, 2022, 175
  • [45] Optimizing Deep Feedforward Neural Network Architecture: A Tabu Search Based Approach
    Gupta, Tarun Kumar
    Raza, Khalid
    [J]. NEURAL PROCESSING LETTERS, 2020, 51 (03) : 2855 - 2870
  • [46] Regularization parameter estimation for feedforward neural networks
    Guo, P
    Lyu, MR
    Chen, CLP
    [J]. IEEE TRANSACTIONS ON SYSTEMS MAN AND CYBERNETICS PART B-CYBERNETICS, 2003, 33 (01): : 35 - 44
  • [47] Clustering-Based Interpretation of Deep ReLU Network
    Picchiotti, Nicola
    Gori, Marco
    [J]. AIXIA 2021 - ADVANCES IN ARTIFICIAL INTELLIGENCE, 2022, 13196 : 403 - 412
  • [48] Multiscale computation on feedforward neural network and recurrent neural network
    Li, Bin
    Zhuang, Xiaoying
    [J]. FRONTIERS OF STRUCTURAL AND CIVIL ENGINEERING, 2020, 14 : 1285 - 1298
  • [49] ComPreEND: Computation Pruning through Predictive Early Negative Detection for ReLU in a Deep Neural Network Accelerator
    Kim, Namhyung
    Park, Hanmin
    Lee, Dongwoo
    Kang, Sungbum
    Lee, Jinho
    Choi, Kiyoung
    [J]. IEEE TRANSACTIONS ON COMPUTERS, 2022, 71 (07) : 1537 - 1550