Unsupervised learning in second-order neural networks for motion analysis

Cited by: 4
Authors
Maul, Tomas [1 ]
Baba, Sapiyan [2 ]
Affiliations
[1] Univ Nottingham, Sch Comp Sci, Semenyih, Malaysia
[2] Univ Malaya, Fac Comp Sci & IT, Kuala Lumpur, Malaysia
Keywords
Second-order neural networks; Motion analysis; Unsupervised learning; Dendritic computation; Feature correspondences; Selective ganglion cells; Direction selectivity; Bipolar cells; Retinal waves; Mechanisms; Information; Dendrites; Units
DOI
10.1016/j.neucom.2010.09.023
CLC number
TP18 [Theory of Artificial Intelligence]
Discipline codes
081104; 0812; 0835; 1405
Abstract
This paper demonstrates how unsupervised learning based on Hebb-like mechanisms is sufficient for training second-order neural networks to perform different types of motion analysis. The paper studies the convergence properties of the network under several conditions, including different levels of noise and motion coherence and different network configurations. We demonstrate the effectiveness of a novel variability-dependent learning mechanism, which allows the network to learn with large feature-similarity thresholds, a property that is crucial for noise robustness. The paper demonstrates the particular relevance of second-order neural networks, and therefore of correlation-based approaches, as contributing mechanisms for directional selectivity in the retina. (C) 2010 Elsevier B.V. All rights reserved.
Pages: 884-895
Number of pages: 12
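The abstract describes Hebb-like (correlation-based) unsupervised learning in a second-order network as a mechanism for directional selectivity. The following is a minimal sketch of that idea, not the authors' implementation: a single second-order unit whose response multiplies pairs of inputs taken at two successive time steps, trained with a plain Hebbian rule on a drifting stimulus. The stimulus generator, the learning rate, and the normalization step are assumptions made purely for illustration.

```python
# Minimal sketch (not the paper's implementation) of Hebb-like unsupervised
# learning in a second-order (multiplicative) unit, showing how correlation-
# based learning can produce direction selectivity for 1-D motion.
import numpy as np

rng = np.random.default_rng(0)
n = 8            # number of spatial positions on a 1-D "retina"
eta = 0.05       # assumed Hebbian learning rate

# Second-order weights: w[i, j] couples activity at position i at time t-1
# with activity at position j at time t; response y = sum_ij w_ij x_prev_i x_j.
w = np.full((n, n), 1.0 / n**2)

def moving_dot(direction, steps):
    """Yield successive 1-D frames of a single bright dot drifting one
    position per frame in the given direction (+1 right, -1 left)."""
    pos = rng.integers(n)
    for _ in range(steps):
        frame = np.zeros(n)
        frame[pos] = 1.0
        yield frame
        pos = (pos + direction) % n

# Unsupervised training on rightward motion only.
prev = None
for frame in moving_dot(direction=+1, steps=2000):
    if prev is not None:
        # Hebb-like update: strengthen the pairwise weight whenever its two
        # inputs (position i at t-1, position j at t) are co-active.
        w += eta * np.outer(prev, frame)
        w /= w.sum()            # simple normalization keeps the weights bounded
    prev = frame

def mean_response(direction, steps=200):
    """Mean second-order response of the unit to a drifting dot."""
    total, prev = 0.0, None
    for frame in moving_dot(direction, steps):
        if prev is not None:
            total += prev @ w @ frame   # sum_ij w_ij * x_prev_i * x_j
        prev = frame
    return total / (steps - 1)

print("response to rightward motion:", mean_response(+1))
print("response to leftward  motion:", mean_response(-1))
```

After training, the weights concentrate on input pairs whose spatial offset matches the trained motion, so the rightward response clearly exceeds the leftward one: the unit has become direction-selective purely through correlation-based, unsupervised learning.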