A STOCHASTIC MODEL OF NEURAL NETWORK FOR UNSUPERVISED LEARNING

Cited: 5
Author
BENAIM, M
Institution
Source
EUROPHYSICS LETTERS | 1992 / Vol. 19 / No. 03
Keywords
GENERAL, THEORETICAL, AND MATHEMATICAL BIOPHYSICS (INCL. LOGIC OF BIOSYSTEMS; QUANTUM BIOLOGY AND RELEVANT ASPECTS OF THERMODYNAMICS; INFORMATION THEORY; CYBERNETICS AND BIONICS); BIOPHYSICS OF NEUROPHYSIOLOGICAL PROCESSES
DOI
10.1209/0295-5075/19/3/015
Chinese Library Classification
O4 [Physics]
Subject Classification Code
0702
Abstract
A two-layer neural network that organizes itself in response to a set of external stimuli is considered. The output layer is a stochastic, discrete model of a neural network. The synaptic weights between the layers obey a competitive learning law. It is shown that the long-term memory dynamics (weight dynamics) always stabilize by minimizing a Lyapunov function. In the special case where all lateral interactions are uniform, negative, and sufficiently strong, this Lyapunov function is interpreted, in the low-noise limit, as the Kullback discrepancy between the probability distribution of the environment and the network's internal representation.
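The abstract's competitive learning law is not spelled out here; a minimal winner-take-all sketch conveys the general idea, assuming a standard rule in which the most strongly activated output unit moves its incoming weights toward the stimulus (the unit names, learning rate, and toy environment below are illustrative, not the paper's model):

```python
import numpy as np

rng = np.random.default_rng(0)

def competitive_step(W, x, eta=0.1):
    """One winner-take-all competitive learning update (illustrative).

    W: (n_out, n_in) weight matrix; x: input stimulus vector.
    The winning output unit's weights move toward the stimulus.
    """
    winner = np.argmax(W @ x)            # unit with largest activation wins
    W[winner] += eta * (x - W[winner])   # pull winner's weights toward x
    return W, winner

# Toy environment: stimuli drawn from two clusters.
stimuli = np.vstack([
    rng.normal([1.0, 0.0], 0.05, size=(50, 2)),
    rng.normal([0.0, 1.0], 0.05, size=(50, 2)),
])

W = rng.normal(size=(2, 2))
for x in rng.permutation(stimuli):
    W, _ = competitive_step(W, x)
```

Under rules of this kind, the weight vectors tend to settle near the centroids of the stimulus clusters, which is the sense in which the network's internal representation comes to approximate the environment's distribution.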
Pages: 241-246
Page count: 6