Chimera state in a feed-forward neuronal network

Cited: 6
Authors
Feng, Peihua [1 ]
Yang, Jiayi [1 ]
Wu, Ying [1 ]
Affiliation
[1] Xi An Jiao Tong Univ, Sch Aerosp Engn, State Key Lab Strength & Vibrat Mech Struct, Xian 710049, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Chimera state; Feed-forward effect; FHN neuron model; COMPUTATION; BRAIN; DELAY;
DOI
10.1007/s11571-022-09928-6
CLC classification
Q189 [Neuroscience];
Discipline code
071006 ;
Abstract
The feed-forward effect gives rise to synchronized neuron firing in the deep layers of a multilayer neuronal network, but complete synchronization means a loss of encoding ability. To avoid this contradiction, we ask whether partial synchronization (the coexistence of disordered and synchronized neuron firing, also called a chimera state) can emerge in a feed-forward multilayer network as a compromise strategy. The answer is yes. To support this argument, we design a multilayer neuronal network in which the neurons in every layer are arranged in a ring topology and neuron firing propagates both within (intra-) and across (inter-) layers. The emergence of chimera states and other patterns depends strongly on the initial conditions of the network and on the strength of the feed-forward effect. When the initial values are chosen carefully so that a chimera state emerges in the first layer, chimera states, cluster states, and synchronization appear in sequence through the layers, both intra- and inter-layer. All patterns except the chimera state propagate down toward deeper layers at speeds that vary with the strength of the feed-forward effect. If a chimera state already exists in every layer, a feed-forward effect of strong or moderate strength destroys the chimera states in deep layers, so they survive only in the first few layers; when the effect is small enough, chimera states propagate down toward deeper layers. Thus chimera states can exist in, and transfer to, deeper layers of a regular multilayer network only under very strict conditions. These results help us better understand firing propagation and encoding schemes in feed-forward neuronal networks.
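The ring-plus-feed-forward architecture described in the abstract can be sketched numerically. Below is a minimal illustrative simulation of FitzHugh-Nagumo (FHN) rings stacked in layers, with nonlocal intra-layer coupling and a one-way inter-layer drive. All parameter values (`eps`, `a`, `sigma`, `R`, `g_ff`) and the explicit Euler integration scheme are assumptions chosen for illustration, not the paper's exact equations.

```python
import numpy as np

# Illustrative sketch: L stacked rings of N FHN neurons each.
# Intra-layer coupling is nonlocal (mean over the R nearest
# neighbours on each side); inter-layer coupling is feed-forward
# only (layer k drives layer k+1). Parameters are assumptions.

N, L = 100, 3            # neurons per ring, number of layers
eps, a = 0.05, 0.5       # FHN time-scale separation and excitability
sigma, R = 0.4, 35       # intra-layer coupling strength and radius
g_ff = 0.1               # feed-forward (inter-layer) coupling strength
dt, steps = 0.005, 4000  # Euler step and number of iterations

rng = np.random.default_rng(0)
v = rng.uniform(-1, 1, (L, N))   # membrane potentials, all layers
w = rng.uniform(-1, 1, (L, N))   # recovery variables

# ring-index offsets for the R neighbours on each side
offsets = np.concatenate([np.arange(1, R + 1), -np.arange(1, R + 1)])

for _ in range(steps):
    # nonlocal intra-layer coupling: mean over the 2R ring neighbours
    neigh = np.mean([np.roll(v, k, axis=1) for k in offsets], axis=0)
    coup = sigma * (neigh - v)
    # feed-forward drive: each layer (except the first) is driven
    # by the activity of the layer above it
    coup[1:] += g_ff * (v[:-1] - v[1:])
    dv = (v - v**3 / 3 - w + coup) / eps
    dw = v + a
    v, w = v + dt * dv, w + dt * dw

# crude per-layer inhomogeneity measure: a fully synchronized ring
# has near-zero spatial spread, a chimera-like state does not
print(np.std(v, axis=1))
```

A chimera state would show up here as a ring whose neurons split into a coherent arc and an incoherent arc; detecting one properly requires a local order parameter over windows of the ring, which this sketch omits for brevity.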
Pages: 1119-1130 (12 pages)
Related papers (50 total; items 31-40 shown)
  • [31] Modeling parallel feed-forward based compression network
    Shalinie, S. Mercy
    INTERNATIONAL JOURNAL OF PARALLEL EMERGENT AND DISTRIBUTED SYSTEMS, 2006, 21 (04) : 227 - 237
  • [32] Study of Full Interval Feed-forward Neural Network
    Guan Shou-ping
    Liang Rong-ye
    PROCEEDINGS OF THE 28TH CHINESE CONTROL AND DECISION CONFERENCE (2016 CCDC), 2016, : 2652 - 2655
  • [33] A Feed-Forward Neural Network for Solving Stokes Problem
    Baymani, M.
    Effati, S.
    Kerayechian, A.
    ACTA APPLICANDAE MATHEMATICAE, 2011, 116 (01) : 55 - 64
  • [35] A hierarchical feed-forward network for object detection tasks
    Bax, I
    Heidemann, G
    Ritter, H
    INDEPENDENT COMPONENT ANALYSES, WAVELETS, UNSUPERVISED SMART SENSORS, AND NEURAL NETWORKS III, 2005, 5818 : 144 - 152
  • [36] Using feed-forward networks to infer the activity of feed-back neuronal networks
    Xinxian Huang
    Farzan Nadim
    Amitabha Bose
    BMC Neuroscience, 11 (Suppl 1)
  • [37] Weak Quasiperiodic Signal Propagation through Multilayer Feed-Forward Hodgkin-Huxley Neuronal Network
    Yao, Yuangen
    Gong, Bowen
    Lu, Daxiang
    Gui, Rong
    COMPLEXITY, 2020, 2020
  • [38] Limits to the development of feed-forward structures in large recurrent neuronal networks
    Kunkel, Susanne
    Diesmann, Markus
    Morrison, Abigail
FRONTIERS IN COMPUTATIONAL NEUROSCIENCE, 2010, 4
  • [39] Intrinsic excitability state of local neuronal population modulates signal propagation in feed-forward neural networks
    Han, Ruixue
    Wang, Jiang
    Yu, Haitao
    Deng, Bin
    Wei, Xilei
    Qin, Yingmei
    Wang, Haixu
    CHAOS, 2015, 25 (04)