The multistability of delayed competitive neural networks with piecewise non-monotonic activation functions

Cited by: 2
Authors
Zhang, Yan [1 ]
Qiao, Yuanhua [1 ]
Duan, Lijuan [2 ]
Miao, Jun [3 ]
Affiliations
[1] Beijing Univ Technol, Fac Sci, Beijing 100124, Peoples R China
[2] Beijing Univ Technol, Fac Informat Technol, Beijing 100124, Peoples R China
[3] Beijing Informat Sci & Technol Univ, Sch Comp Sci, Beijing 100101, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
competitive neural networks; multistability; non-monotonic piecewise nonlinear activation functions; time-varying delays; GLOBAL EXPONENTIAL STABILITY; ASSOCIATIVE MEMORY; INSTABILITY; CAPACITY;
DOI
10.1002/mma.8368
CLC number
O29 [Applied Mathematics];
Subject classification code
070104;
Abstract
This paper addresses the multistability of competitive neural networks with nonlinear, non-monotonic piecewise activation functions and time-varying delays. Several sufficient conditions are proposed to guarantee the existence of (2K+1)^n equilibrium points and the local exponential stability of (K+1)^n equilibrium points, where K is a positive integer determined by the properties of the activation functions and the parameters of the neural network. The quantitative relationship between the equilibrium points of the system and the zeros of the bounding functions is given. In addition, the attraction basins of the exponentially stable equilibrium points are obtained. Finally, a numerical simulation is given to illustrate the effectiveness of the obtained results.
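The counting stated in the abstract can be checked on a toy system. The sketch below is only an illustration under assumed choices (a smooth non-monotonic activation f, self-connection weight a = 1.5, and a decoupled network), not the construction used in the paper: each neuron's equilibrium condition g(x) = -x + a*f(x) = 0 has 2K+1 zeros, K+1 of them with negative slope, so n such neurons give (2K+1)^n equilibria of which (K+1)^n are locally exponentially stable.

    import numpy as np

    # Minimal numerical sketch (not the authors' construction): for one neuron
    # dx/dt = -x + a*f(x), equilibria are the zeros of g(x) = -x + a*f(x).
    # Zeros with g'(x*) < 0 are locally exponentially stable; a decoupled
    # n-neuron network then has (2K+1)^n equilibria and (K+1)^n stable ones.
    # The activation f and the weight a below are illustrative assumptions.

    def f(x):
        # smooth non-monotonic stand-in for a piecewise non-monotonic activation
        return np.tanh(x) + 0.3 * np.sin(3.0 * x)

    a = 1.5                                  # assumed self-connection weight
    g = lambda x: -x + a * f(x)              # equilibrium condition g(x) = 0

    xs = np.linspace(-5.0, 5.0, 200000)      # even count, so x = 0 is not a grid point
    vals = g(xs)
    roots = xs[:-1][vals[:-1] * vals[1:] < 0]      # sign changes bracket the zeros
    dg = np.gradient(vals, xs)                     # numerical derivative of g
    stable = roots[np.interp(roots, xs, dg) < 0]   # stable zeros: g'(x*) < 0

    n = 2  # illustrative network size
    print(f"scalar equilibria: {roots.size} (= 2K+1), stable: {stable.size} (= K+1)")
    print(f"n = {n} neurons: {roots.size**n} equilibria, {stable.size**n} stable")

With these assumed parameters the scalar system has 3 equilibria (K = 1), 2 of them stable, so the two-neuron network has 9 equilibria with 4 stable, matching the (2K+1)^n and (K+1)^n counts.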
Pages: 10295-10311
Number of pages: 17
Related papers
50 records in total
  • [31] Non-monotonic Explanation Functions
    Amgoud, Leila
    SYMBOLIC AND QUANTITATIVE APPROACHES TO REASONING WITH UNCERTAINTY, ECSQARU 2021, 2021, 12897 : 19 - 31
  • [32] αSechSig and αTanhSig: two novel non-monotonic activation functions
    Közkurt, Cemil
    Kiliçarslan, Serhat
    Baş, Selçuk
    Elen, Abdullah
    SOFT COMPUTING, 2023, 27 : 18451 - 18467
  • [33] ErfAct and Pserf: Non-monotonic Smooth Trainable Activation Functions
    Biswas, Koushik
    Kumar, Sandeep
    Banerjee, Shilpak
    Pandey, Ashish Kumar
    THIRTY-SIXTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FOURTH CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE / THE TWELFTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2022: 6097 - 6105
  • [34] αSechSig and αTanhSig: two novel non-monotonic activation functions
    Kozkurt, Cemil
    Kilicarslan, Serhat
    Bas, Selcuk
    Elen, Abdullah
    SOFT COMPUTING, 2023, 27 (24) : 18451 - 18467
  • [35] Stochastic Neural Networks with Monotonic Activation Functions
    Ravanbakhsh, Siamak
    Poczos, Barnabas
    Schneider, Jeff
    Schuurmans, Dale
    Greiner, Russell
    ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 51, 2016, 51 : 809 - 818
  • [36] Multistability and multiperiodicity of high-order competitive neural networks with a general class of activation functions
    Nie, Xiaobing
    Huang, Zhenkun
    NEUROCOMPUTING, 2012, 82 : 1 - 13
  • [37] Multistability of Quaternion-Valued Recurrent Neural Networks with Discontinuous Nonmonotonic Piecewise Nonlinear Activation Functions
    Du, Weihao
    Xiang, Jianglian
    Tan, Manchun
    NEURAL PROCESSING LETTERS, 2023, 55 (05) : 5855 - 5884
  • [38] Multistability of state-dependent switching neural networks with discontinuous nonmonotonic piecewise linear activation functions
    Zhang, Jiahui
    Zhu, Song
    Lu, Nannan
    Wen, Shiping
    NEUROCOMPUTING, 2021, 437 : 300 - 311
  • [39] Multistability of Switched Neural Networks With Piecewise Linear Activation Functions Under State-Dependent Switching
    Guo, Zhenyuan
    Liu, Linlin
    Wang, Jun
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2019, 30 (07) : 2052 - 2066
  • [40] Propositional non-monotonic reasoning and inconsistency in symmetric neural networks
    Pinkas, Gadi
    Morgan Kaufmann Publ Inc, San Mateo, CA, USA