Synthesization of Multi-valued Associative High-Capacity Memory Based on Continuous Networks with a Class of Non-smooth Linear Nondecreasing Activation Functions

Cited by: 3
Authors
Sha, Chunlin [1 ]
Zhao, Hongyong [1 ]
Yuan, Yuan [2 ]
Bai, Yuzhen [3 ]
Affiliations
[1] Nanjing Univ Aeronaut & Astronaut, Dept Math, Nanjing 210016, Jiangsu, Peoples R China
[2] Mem Univ Newfoundland, Dept Math & Stat, St John, NF A1C 5S7, Canada
[3] Qufu Normal Univ, Sch Math Sci, Qufu 273165, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Multi-valued associative memories; External inputs; Design methods; Network dynamics; Globally exponential stability; CELLULAR NEURAL-NETWORKS; DESIGN; MULTISTABILITY;
DOI
10.1007/s11063-018-9955-9
CLC Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
This paper presents a novel design method for multi-valued auto-associative and hetero-associative memories based on a continuous neural network (CNN) with a class of non-smooth linear nondecreasing activation functions. The proposed CNN is robust with respect to the selection of design parameters, which depends on a set of inequalities rather than on a learning procedure. Some globally exponential stability criteria are obtained to ensure that the multi-valued associative patterns are retrieved accurately. By constructing a CNN in which the data are fed via external inputs, the methodology avoids spurious memory patterns and achieves a storage capacity of (2r)^n. These analytic results are applied to the associative memory of images. The fault tolerance and effectiveness of the method are validated by illustrative experiments.
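The abstract describes the mechanism only at a high level. As a rough illustration of the kind of recall it refers to, the Python/NumPy sketch below uses a saturating piecewise-linear nondecreasing activation and the continuous dynamics dx/dt = -x + A f(x) + I, with the pattern delivered through the external input I. The particular activation, the toy contraction condition on A, and all parameter values are assumptions made here for illustration; they are not the paper's actual design inequalities or capacity construction.

```python
import numpy as np

def pwl_activation(x, r=2.0):
    """A simple member of the non-smooth, piecewise-linear, nondecreasing class:
    f(x) = 0.5 * (|x + r| - |x - r|), i.e. the identity saturated at +/- r.
    Illustrative choice; the paper's exact activation may differ."""
    return 0.5 * (np.abs(x + r) - np.abs(x - r))

def simulate_cnn(A, I, x0, r=2.0, h=0.01, steps=4000):
    """Forward-Euler integration of the continuous network
        dx/dt = -x + A f(x) + I,
    where the pattern to be recalled enters only through the external input I."""
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(steps):
        x += h * (-x + A @ pwl_activation(x, r) + I)
    return pwl_activation(x, r)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, r = 6, 2.0
    # Small connection matrix: with the activation's Lipschitz constant 1 and
    # ||A|| < 1, the network has a unique, globally exponentially stable
    # equilibrium (a crude sufficient condition standing in for the paper's
    # design inequalities).
    A = 0.1 * rng.standard_normal((n, n))
    pattern = np.array([2.0, -1.0, 0.0, 1.0, -2.0, 2.0])  # multi-valued pattern
    I = pattern - A @ pattern      # external input making `pattern` the equilibrium
    noisy_probe = pattern + 0.4 * rng.standard_normal(n)
    recalled = simulate_cnn(A, I, x0=noisy_probe, r=r)
    print("recalled:", np.round(recalled, 2))  # converges back to `pattern`
```

In this toy setting the corrupted probe is pulled back to the stored multi-valued pattern because the external input pins the unique equilibrium; the paper's contribution is the design of the parameters and inputs so that this holds for a family of patterns with (2r)^n capacity.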
Pages: 911-932
Number of pages: 22