Path classification by stochastic linear recurrent neural networks

Cited by: 2
Authors
Boutaib, Youness [1 ]
Bartolomaeus, Wiebke [1 ]
Nestler, Sandra [2 ,3 ,4 ,5 ]
Rauhut, Holger [1 ]
Affiliations
[1] Rhein Westfal TH Aachen, Chair Math Informat Proc, Pontdriesch 10, D-52062 Aachen, Germany
[2] Julich Res Ctr, Inst Neurosci & Med INM 6, Julich, Germany
[3] Julich Res Ctr, Inst Adv Simulat IAS 6, Julich, Germany
[4] Julich Res Ctr, JARA Inst Brain Struct Funct Relationships INM 10, Julich, Germany
[5] Rhein Westfal TH Aachen, Aachen, Germany
Keywords
Recurrent neural networks; Risk bounds; Agnostic PAC learnability; Empirical risk minimisation; Rademacher complexity; Signatures; SYSTEMS;
DOI
10.1186/s13662-022-03686-9
Chinese Library Classification
O29 [Applied Mathematics]
Discipline Classification Code
070104
Abstract
We investigate the functioning of a classifying biological neural network from the perspective of statistical learning theory, modelled, in a simplified setting, as a continuous-time stochastic recurrent neural network (RNN) with the identity activation function. In the purely stochastic (robust) regime, we give a generalisation error bound that holds with high probability, thus showing that the empirical risk minimiser is the best-in-class hypothesis. We show that RNNs retain a partial signature of the paths they are fed as the unique information exploited for training and classification tasks. We argue that these RNNs are easy to train and robust and support these observations with numerical experiments on both synthetic and real data. We also show a trade-off phenomenon between accuracy and robustness.
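For intuition only, here is a minimal discrete-time sketch of the kind of model the abstract describes: a linear (identity-activation) RNN driven by the increments of an input path, with additive noise in the stochastic regime and a linear readout for binary classification. This is not the authors' implementation; the Euler discretisation, dimensions, noise level, and weight initialisation are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions and parameters (assumptions, not from the paper)
d_in, d_hid, n_steps, dt = 2, 16, 50, 0.02
sigma = 0.1  # noise level in the stochastic (robust) regime

# Recurrent, input, and readout weights; in the paper these are learned/analysed
A = rng.normal(scale=1.0 / np.sqrt(d_hid), size=(d_hid, d_hid))
B = rng.normal(size=(d_hid, d_in))
w = rng.normal(size=d_hid)

def hidden_state(path, noisy=True):
    """Euler step of linear dynamics h' = A h + B x'.

    `path` has shape (n_steps + 1, d_in). With identity activation the
    update is affine in h, so the final state depends on the path only
    through linear functionals of its increments (the 'partial signature'
    retained by the network, in the abstract's terminology).
    """
    h = np.zeros(d_hid)
    for k in range(n_steps):
        dx = path[k + 1] - path[k]
        h = h + dt * (A @ h) + B @ dx
        if noisy:
            h = h + sigma * np.sqrt(dt) * rng.normal(size=d_hid)
    return h

def classify(path):
    """Binary label from the sign of a linear readout of the final state."""
    return int(w @ hidden_state(path) > 0)

# Example: feed a smooth circular path through the network
t = np.linspace(0.0, 1.0, n_steps + 1)
smooth = np.stack([np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)], axis=1)
label = classify(smooth)
assert label in (0, 1)
```

Because the dynamics are linear, the noise-free final state is a fixed linear map applied to the path's increments, which is what makes the risk bounds via Rademacher complexity tractable in the paper's setting.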
Pages: 29
Related Articles
50 records in total
  • [1] Path classification by stochastic linear recurrent neural networks
    Youness Boutaib
    Wiebke Bartolomaeus
    Sandra Nestler
    Holger Rauhut
    Advances in Continuous and Discrete Models, 2022
  • [2] Stability of Stochastic Recurrent Neural Networks with Positive Linear Activation Functions
    Liao, Wudai
    Yang, Xuezhao
    Wang, Zhongsheng
    ADVANCES IN NEURAL NETWORKS - ISNN 2009, PT 1, PROCEEDINGS, 2009, 5551 : 279 - +
  • [3] Stabilization of stochastic recurrent neural networks
    Sanchez, EN
    Perez, JP
    PROCEEDINGS OF THE 2002 IEEE INTERNATIONAL SYMPOSIUM ON INTELLIGENT CONTROL, 2002, : 445 - 447
  • [4] Linear Antisymmetric Recurrent Neural Networks
    Moe, Signe
    Remonato, Filippo
    Grotli, Esten I.
    Gravdahl, Jan Tommy
    LEARNING FOR DYNAMICS AND CONTROL, VOL 120, 2020, 120 : 170 - 178
  • [5] Dynamic analysis of stochastic recurrent neural networks
    Huang, Chuangxia
    He, Yigang
    Chen, Ping
    NEURAL PROCESSING LETTERS, 2008, 27 (03) : 267 - 276
  • [6] Dynamic Analysis of Stochastic Recurrent Neural Networks
    Chuangxia Huang
    Yigang He
    Ping Chen
    Neural Processing Letters, 2008, 27 : 267 - 276
  • [7] Multi-path x-D recurrent neural networks for collaborative image classification
    Gao, Riqiang
    Huo, Yuankai
    Bao, Shunxing
    Tang, Yucheng
    Antic, Sanja L.
    Epstein, Emily S.
    Deppen, Steve
    Paulson, Alexis B.
    Sandler, Kim L.
    Massion, Pierre P.
    Landman, Bennett A.
    NEUROCOMPUTING, 2020, 397 : 48 - 59
  • [8] Recurrent Convolutional Neural Networks for Text Classification
    Lai, Siwei
    Xu, Liheng
    Liu, Kang
    Zhao, Jun
    PROCEEDINGS OF THE TWENTY-NINTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2015, : 2267 - 2273
  • [9] Convolutional Recurrent Neural Networks for Text Classification
    Wang, Ruishuang
    Li, Zhao
    Cao, Jian
    Chen, Tong
    Wang, Lei
    2019 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2019,
  • [10] Convolutional Recurrent Neural Networks for Electrocardiogram Classification
    Zihlmann, Martin
    Perekrestenko, Dmytro
    Tschannen, Michael
    2017 COMPUTING IN CARDIOLOGY (CINC), 2017, 44