On the generalization ability of recurrent networks

Cited: 0
Authors
Hammer, B [1]
Affiliations
[1] Univ Osnabruck, Dept Math Comp Sci, D-49069 Osnabruck, Germany
Keywords
DOI
None available
CLC number
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
The generalization ability of discrete time partially recurrent networks is examined. It is well known that the VC dimension of recurrent networks is infinite in most interesting cases and hence the standard VC analysis cannot be applied directly. We find guarantees for specific situations where the transition function forms a contraction or the probability of long inputs is restricted. For the general case, we derive posterior bounds which take the input data into account. They are obtained via a generalization of the luckiness framework to the agnostic setting. The general formalism allows one to focus on representative parts of the data as well as more general situations such as long term prediction.
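The contraction condition mentioned in the abstract can be illustrated with a small numerical sketch (the architecture, dimensions, and contraction factor below are illustrative assumptions, not taken from the paper): if a recurrent transition h_{t+1} = tanh(W h_t + U x_t) has spectral norm ||W||_2 < 1, then, because tanh is 1-Lipschitz, the transition is a contraction in the hidden state, so the effect of the initial state fades geometrically.

```python
import numpy as np

rng = np.random.default_rng(0)
d_hidden, d_input = 8, 4

# Random recurrent weights, rescaled so the spectral norm of W is 0.9 < 1.
# Since tanh is 1-Lipschitz, ||f(h1, x) - f(h2, x)|| <= ||W||_2 * ||h1 - h2||,
# making the state transition a contraction in the hidden state.
W = rng.normal(size=(d_hidden, d_hidden))
W *= 0.9 / np.linalg.norm(W, 2)          # spectral norm is now 0.9
U = rng.normal(size=(d_hidden, d_input))

def step(h, x):
    """One recurrent transition h_{t+1} = tanh(W h_t + U x_t)."""
    return np.tanh(W @ h + U @ x)

# Two different initial states driven by the same input sequence:
h1 = rng.normal(size=d_hidden)
h2 = rng.normal(size=d_hidden)
gap0 = np.linalg.norm(h1 - h2)

for t in range(20):
    x = rng.normal(size=d_input)
    h1, h2 = step(h1, x), step(h2, x)

# After T steps the gap has contracted by at least a factor 0.9**T.
print(gap0, np.linalg.norm(h1 - h2))
```

Intuitively, this fading memory is why restricting the transition to a contraction recovers finite-sample guarantees despite the infinite VC dimension of unrestricted recurrent networks: long-past inputs have a provably negligible influence on the current state.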
Pages: 731-736
Page count: 6
Related papers
50 records (showing [31]-[40])
  • [31] Can SGD Learn Recurrent Neural Networks with Provable Generalization?
    Allen-Zhu, Zeyuan
    Li, Yuanzhi
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019), 2019, 32
  • [32] Effect of complexity on learning ability of recurrent neural networks
    N. Honma
    K. Kitagawa
    K. Abe
    Artificial Life and Robotics, 1998, 2 (3) : 97 - 101
  • [33] Hybrid pooling for enhancement of generalization ability in deep convolutional neural networks
    Tong, Zhiqiang
    Tanaka, Gouhei
    NEUROCOMPUTING, 2019, 333 : 76 - 85
  • [34] Improving generalization ability of multilayer networks by excluding irrelevant input components
    Ishii, M
    Kumazawa, I
    IEEE 2000 ADAPTIVE SYSTEMS FOR SIGNAL PROCESSING, COMMUNICATIONS, AND CONTROL SYMPOSIUM - PROCEEDINGS, 2000, : 203 - 206
  • [35] Role of function complexity and network size in the generalization ability of feedforward networks
    Franco, L
    Jerez, JM
    Bravo, JM
    COMPUTATIONAL INTELLIGENCE AND BIOINSPIRED SYSTEMS, PROCEEDINGS, 2005, 3512 : 1 - 8
  • [36] USE OF SOME SENSITIVITY CRITERIA FOR CHOOSING NETWORKS WITH GOOD GENERALIZATION ABILITY
    DIMOPOULOS, Y
    BOURRET, P
    LEK, S
    NEURAL PROCESSING LETTERS, 1995, 2 (06) : 1 - 4
  • [37] How to improve the generalization ability of multi-layer neural networks
    Sebesta, V
    6TH WORLD MULTICONFERENCE ON SYSTEMICS, CYBERNETICS AND INFORMATICS, VOL VI, PROCEEDINGS: INDUSTRIAL SYSTEMS AND ENGINEERING I, 2002, : 108 - 113
  • [38] Improving Generalization Ability of Deep Neural Networks for Visual Recognition Tasks
    Okatani, Takayuki
    Liu, Xing
    Suganuma, Masanori
    COMPUTATIONAL COLOR IMAGING, CCIW 2019, 2019, 11418 : 3 - 13
  • [39] Enhancing the generalization ability of neural networks through controlling the hidden layers
    Wan, Weishui
    Mabu, Shingo
    Shimada, Kaoru
    Hirasawa, Kotaro
    Hu, Jinglu
    APPLIED SOFT COMPUTING, 2009, 9 (01) : 404 - 414
  • [40] Improved Generalization in Recurrent Neural Networks Using the Tangent Plane Algorithm
    May, P.
    Zhou, E.
    Lee, C. W.
    INTERNATIONAL JOURNAL OF ADVANCED COMPUTER SCIENCE AND APPLICATIONS, 2014, 5 (03) : 118 - 126