A fast algorithm for training a class of fuzzy neural networks

Cited by: 0
Authors
Li, DM [1 ]
Liu, JQ [1 ]
Hu, HZ [1 ]
Affiliation
[1] Harbin Inst Technol, Harbin 150001, Peoples R China
Keywords
fuzzy neural networks; training algorithm; least square technique; simplex search algorithm;
DOI
Not available
CLC classification number
TP [Automation technology, computer technology];
Subject classification code
0812;
Abstract
A novel fast algorithm for training a class of fuzzy neural networks (FNN) is studied. The proposed algorithm, named Least Square-Simplex (LS-Simplex), achieves global convergence and avoids the local convergence inherent in training an FNN with a gradient-based algorithm. It also accelerates FNN training and can be applied online, which is not possible with a genetic algorithm (GA). Compared with the gradient algorithm and GA, LS-Simplex attains higher precision and faster convergence, and the resulting FNN shows excellent generalization performance.
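The abstract only names the two building blocks (a least-squares step and a simplex search), so the sketch below is an assumption of how such a hybrid scheme could look for a Takagi-Sugeno style FNN, not the authors' actual LS-Simplex algorithm: an outer Nelder-Mead simplex search over the nonlinear premise (membership-function) parameters, with the linear consequent parameters solved exactly by least squares at each candidate point. The network structure, the function names (fit_ls_simplex, firing_strengths, consequents_by_least_squares), and all hyperparameters are hypothetical.

# Minimal sketch of a hybrid least-squares / simplex training scheme for a
# Takagi-Sugeno style fuzzy neural network (illustrative assumption only;
# the paper's exact network and update rules are not given in the abstract).
import numpy as np
from scipy.optimize import minimize


def firing_strengths(X, centers, widths):
    """Normalized rule firing strengths with Gaussian membership functions."""
    # X: (n, d), centers/widths: (r, d) -> strengths: (n, r)
    diff = X[:, None, :] - centers[None, :, :]
    mu = np.exp(-0.5 * (diff / widths[None, :, :]) ** 2)
    w = mu.prod(axis=2)                       # product T-norm over the inputs
    return w / (w.sum(axis=1, keepdims=True) + 1e-12)


def consequents_by_least_squares(X, y, strengths):
    """The consequent (linear) parameters enter the output linearly, so for
    fixed premise parameters they are obtained in one least-squares solve."""
    n, d = X.shape
    Xa = np.hstack([X, np.ones((n, 1))])      # affine consequents w^T x + b
    # Design matrix: each rule contributes strength-weighted affine terms.
    Phi = (strengths[:, :, None] * Xa[:, None, :]).reshape(n, -1)
    theta, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return theta, Phi


def fit_ls_simplex(X, y, n_rules=3, maxiter=400, seed=0):
    """Outer Nelder-Mead simplex search over premise parameters; inner
    least-squares solve for the consequent parameters at each candidate."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    p0 = np.concatenate([rng.normal(size=n_rules * d),          # centers
                         np.full(n_rules * d, 1.0)])            # widths

    def objective(p):
        centers = p[:n_rules * d].reshape(n_rules, d)
        widths = np.abs(p[n_rules * d:]).reshape(n_rules, d) + 1e-3
        s = firing_strengths(X, centers, widths)
        theta, Phi = consequents_by_least_squares(X, y, s)
        return np.mean((Phi @ theta - y) ** 2)                  # training MSE

    return minimize(objective, p0, method="Nelder-Mead",
                    options={"maxiter": maxiter, "xatol": 1e-6, "fatol": 1e-8})


if __name__ == "__main__":
    # Toy usage: approximate a smooth 1-D function from noisy samples.
    X = np.linspace(-3, 3, 200).reshape(-1, 1)
    y = np.sin(X[:, 0]) + 0.05 * np.random.default_rng(1).normal(size=200)
    result = fit_ls_simplex(X, y, n_rules=4)
    print("final training MSE:", result.fun)

Because the inner least-squares step removes the linear parameters from the search space, the derivative-free simplex search only has to explore the (much smaller) set of premise parameters, which is the usual motivation for this kind of hybrid decomposition.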
Pages: 852-856
Number of pages: 5