Improving the generalization capability of the binary CMAC

Cited by: 0
Authors
Szabó, T [1 ]
Horváth, G [1 ]
Affiliation
[1] Tech Univ Budapest, Dept Measurement & Informat Syst, H-1521 Budapest, Hungary
Keywords
DOI: not available
Chinese Library Classification
TP18 [Artificial intelligence theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
This paper deals with some important questions concerning binary CMAC neural networks. The CMAC, which belongs to the family of feed-forward networks with a single linear trainable layer, has some attractive features. The most important ones are its extremely fast learning and a special architecture that makes efficient digital hardware implementation possible. Although the CMAC architecture was proposed in the mid-seventies, quite a few questions remain open even today, the most important of which concern its modeling and generalization capabilities. While some essential questions about its modeling capability have been addressed in the literature, no detailed analysis of its generalization properties can be found. This paper shows that the CMAC may have significant generalization error even in the one-dimensional case, where the network can learn any training data set exactly. The paper shows that this generalization error is caused mainly by the training rule of the network. It derives a general expression for the generalization error and proposes a modified training algorithm that reduces this error significantly.
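For readers unfamiliar with the architecture the abstract refers to, the following is a minimal sketch of a one-dimensional binary (Albus) CMAC trained with the usual LMS-style rule that spreads the output error equally over the C active weights. The class name BinaryCMAC1D, the parameters n_levels, generalization and lr, the fit/predict helpers, and the sine toy example are assumptions made here for illustration; the sketch does not reproduce the modified training algorithm proposed in the paper.

```python
import numpy as np

class BinaryCMAC1D:
    """Minimal sketch of a 1-D binary (Albus) CMAC; illustrative, not the authors' code."""

    def __init__(self, n_levels, generalization=8, lr=0.5):
        self.C = generalization                      # number of cells activated by each input
        self.lr = lr                                 # LMS learning rate
        # one weight per association cell; n_levels + C - 1 cells cover a 1-D input
        self.w = np.zeros(n_levels + generalization - 1)

    def _active(self, u):
        # a quantized input index u activates C consecutive association cells
        return np.arange(u, u + self.C)

    def predict(self, u):
        # the output is the sum of the weights of the active cells
        return self.w[self._active(u)].sum()

    def fit(self, inputs, targets, epochs=50):
        # standard LMS-type CMAC rule: distribute the output error
        # equally among the C active weights
        for _ in range(epochs):
            for u, d in zip(inputs, targets):
                idx = self._active(u)
                err = d - self.w[idx].sum()
                self.w[idx] += self.lr * err / self.C

# toy usage: sparse training points of a sine wave on 64 quantization levels,
# then queries between the training points to see the interpolation behaviour
xs = np.arange(0, 64, 8)
ys = np.sin(2 * np.pi * xs / 64)
net = BinaryCMAC1D(n_levels=64)
net.fit(xs, ys)
print([round(net.predict(u), 3) for u in range(0, 64, 4)])
```

Because each quantized input activates C overlapping cells, neighbouring inputs share weights; this overlap is the source of the CMAC's generalization between training points and, as the paper argues, also of the generalization error introduced by the standard training rule.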
Pages: 85-90
Number of pages: 6
Related papers
50 records in total
  • [31] IMPROVING PROCESS CAPABILITY
    CONNELLY, M
    ADVANCED MATERIALS & PROCESSES, 1994, 146 (03): 79 - 81
  • [32] Binary Domain Generalization for Sparsifying Binary Neural Networks
    Schiavone, Riccardo
    Galati, Francesco
    Zuluaga, Maria A.
    MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES: RESEARCH TRACK, ECML PKDD 2023, PT II, 2023, 14170 : 123 - 140
  • [33] SQWA: STOCHASTIC QUANTIZED WEIGHT AVERAGING FOR IMPROVING THE GENERALIZATION CAPABILITY OF LOW-PRECISION DEEP NEURAL NETWORKS
    Shin, Sungho
    Boo, Yoonho
    Sung, Wonyong
    2021 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP 2021), 2021, : 8052 - 8056
  • [34] Dimensionality reduction to maximize prediction generalization capability
    Takuya Isomura
    Taro Toyoizumi
    Nature Machine Intelligence, 2021, 3 : 434 - 446
  • [35] Dimensionality reduction to maximize prediction generalization capability
    Isomura, Takuya
    Toyoizumi, Taro
    NATURE MACHINE INTELLIGENCE, 2021, 3 (05) : 434 - 446
  • [36] Assessing Local Generalization Capability in Deep Models
    Wang, Huan
    Keskar, Nitish Shirish
    Xiong, Caiming
    Socher, Richard
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 108, 2020, 108
  • [37] TUNING CAPABILITY OF BINARY PERCEPTRON
    BISWAS, NN
    BHATTACHARYYA, SK
    ELECTRONICS LETTERS, 1991, 27 (13) : 1206 - 1207
  • [38] New generalization of process capability index Cpk
    Pearn, WL
    Chen, KS
    JOURNAL OF APPLIED STATISTICS, 1998, 25 (06) : 801 - 810
  • [39] On representation and generalization capability of pyramid neural networks
    Hoshino, M
    Chao, JH
    PROCEEDINGS OF THE 2002 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, VOLS 1-3, 2002, : 1166 - 1171
  • [40] Improving generalization by data categorization
    Li, L
    Pratap, A
    Lin, HT
    Abu-Mostafa, YS
    KNOWLEDGE DISCOVERY IN DATABASES: PKDD 2005, 2005, 3721 : 157 - 168