SELF-ASSOCIATION AND HEBBIAN LEARNING IN LINEAR NEURAL NETWORKS

Cited by: 6
Authors
PALMIERI, F [1]
ZHU, J [1]
Affiliation
[1] UNIV NAPLES FEDERICO II, DIPARTIMENTO INGN ELETTRON, I-80125 NAPLES, ITALY
Funding
U.S. National Science Foundation
DOI
10.1109/72.410360
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
We study Hebbian learning in linear neural networks with emphasis on the self-association information principle. This criterion, in one-layer networks, leads to the space of the principal components and can be generalized to arbitrary architectures. The self-association paradigm appears to be very promising because it accounts for the fundamental features of Hebbian synaptic learning and generalizes the various techniques proposed for adaptive principal component networks. We also include a set of simulations that compare various neural architectures and algorithms.
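As an illustration of the one-layer case mentioned in the abstract, the sketch below uses a generic Hebbian subspace update of the Oja type for a linear network y = Wx; it is an assumption for illustration, not the authors' exact algorithm, and the data, learning rate, and variable names are all hypothetical. After training on zero-mean inputs, the rows of W should span the principal subspace of the input covariance.

# Minimal sketch (assumed setup, not the paper's algorithm): a Hebbian
# subspace rule of the Oja type for a one-layer linear network y = W x.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic zero-mean data with a dominant 2-D principal subspace (illustrative).
n_samples, n_in, n_out = 5000, 5, 2
C = np.diag([5.0, 3.0, 0.5, 0.2, 0.1])          # input covariance, chosen for the example
X = rng.multivariate_normal(np.zeros(n_in), C, size=n_samples)

W = 0.1 * rng.standard_normal((n_out, n_in))     # network weights
eta = 2e-3                                       # learning rate (assumed)

for epoch in range(10):
    for x in X:
        y = W @ x                                # linear network output
        # Hebbian term y x^T plus a decay term that keeps the rows near-orthonormal.
        W += eta * (np.outer(y, x) - np.outer(y, y) @ W)

# Check: the learned rows should span the top-2 eigenvector subspace of the data covariance.
eigvals, eigvecs = np.linalg.eigh(np.cov(X, rowvar=False))
top2 = eigvecs[:, -2:]                           # empirical principal subspace
# Singular values of W @ top2 approach 1 when the (near-orthonormal) rows of W
# align with the principal subspace.
_, s, _ = np.linalg.svd(W @ top2)
print("alignment with principal subspace:", np.round(s, 3))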
Pages: 1165-1184
Page count: 20