Progressive Operational Perceptrons

Cited by: 36
Authors
Kiranyaz, Serkan [1 ]
Ince, Turker [2 ]
Iosifidis, Alexandros [3 ]
Gabbouj, Moncef [3 ]
Affiliations
[1] Qatar Univ, Dept Elect Engn, Doha, Qatar
[2] Izmir Univ Econ, Dept Elect Engn, Izmir, Turkey
[3] Tampere Univ Technol, Dept Signal Proc, Tampere, Finland
Keywords
Artificial neural networks; Multi-layer perceptrons; Progressive operational perceptrons; Diversity; Scalability; NETWORK
DOI
10.1016/j.neucom.2016.10.044
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
There are well-known limitations and drawbacks in the performance and robustness of feed-forward, fully connected Artificial Neural Networks (ANNs), the so-called Multi-Layer Perceptrons (MLPs). In this study we address them with Generalized Operational Perceptrons (GOPs), which consist of neurons with distinct (non-)linear operators, to achieve a generalized model of the biological neuron and ultimately a superior diversity. We modified the conventional back-propagation (BP) algorithm to train GOPs and, furthermore, proposed Progressive Operational Perceptrons (POPs) to achieve self-organized, depth-adaptive GOPs tailored to the learning problem. The most crucial property of POPs is their ability to simultaneously search for the optimal operator set and train each layer individually; the final POP is therefore formed layer by layer. In this paper we show that this ability enables POPs with minimal network depth to attack the most challenging learning problems, which cannot be learned by conventional ANNs even with a deeper and significantly more complex configuration. Experimental results show that POPs scale up very well with the problem size and have the potential to achieve a superior generalization performance on real benchmark problems with a significant gain.
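The GOP neuron described in the abstract replaces the MLP's fixed multiply-sum-activate pipeline with a choice of nodal, pool, and activation operators. A minimal sketch of that idea, assuming illustrative operator sets (the specific operator choices below are assumptions for demonstration, not the paper's exact library):

```python
import numpy as np

# Hedged sketch of a Generalized Operational Perceptron (GOP) neuron.
# A GOP computes y = act( pool( nodal(x_i, w_i) ) + b ); the operator
# dictionaries below are illustrative examples of such sets.

NODAL = {
    "mult": lambda x, w: w * x,          # classical MLP nodal operator
    "sine": lambda x, w: np.sin(w * x),
    "expo": lambda x, w: np.exp(w * x) - 1.0,
}

POOL = {
    "sum":    lambda z: z.sum(axis=-1),  # classical MLP pool operator
    "median": lambda z: np.median(z, axis=-1),
    "max":    lambda z: z.max(axis=-1),
}

ACT = {
    "tanh":   np.tanh,
    "linear": lambda y: y,
}

def gop_neuron(x, w, b, nodal="mult", pool="sum", act="tanh"):
    """One GOP neuron: y = act( pool( nodal(x, w) ) + b )."""
    z = NODAL[nodal](x, w)   # element-wise nodal operation on each input
    y = POOL[pool](z) + b    # pooling generalizes the plain summation
    return ACT[act](y)

# With (mult, sum, tanh) the GOP reduces to a standard perceptron:
x = np.array([0.5, -1.0, 2.0])
w = np.array([1.0, 0.5, -0.25])
classic = gop_neuron(x, w, b=0.1)
assert np.isclose(classic, np.tanh(np.dot(x, w) + 0.1))
```

A POP would then, per the abstract, grow such a network layer by layer, greedily selecting the best-performing operator triple for each new layer while training it, before deciding whether further depth is needed.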
Pages: 142-154
Page count: 13
Related papers
50 records in total
  • [1] Progressive Operational Perceptrons with Memory
    Tran, Dat Thanh
    Kiranyaz, Serkan
    Gabbouj, Moncef
    Iosifidis, Alexandros
    NEUROCOMPUTING, 2020, 379 : 172 - 181
  • [2] Generalized Model of Biological Neural Networks: Progressive Operational Perceptrons
    Kiranyaz, Serkan
    Ince, Turker
    Iosifidis, Alexandros
    Gabbouj, Moncef
    2017 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2017, : 2477 - 2485
  • [3] KNOWLEDGE TRANSFER FOR FACE VERIFICATION USING HETEROGENEOUS GENERALIZED OPERATIONAL PERCEPTRONS
    Tran, Dat Thanh
    Kiranyaz, Serkan
    Gabbouj, Moncef
    Iosifidis, Alexandros
    2019 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP), 2019, : 1168 - 1172
  • [4] Operational Turbidity Forecast Using Both Recurrent and Feed-Forward Based Multilayer Perceptrons
    Savary, Michael
    Johannet, Anne
    Massei, Nicolas
    Dupont, Jean-Paul
    Hauchard, Emmanuel
    ADVANCES IN TIME SERIES ANALYSIS AND FORECASTING, 2017, : 243 - 256
  • [5] PHYSICALLY OPERATIONAL FORMULATION OF CAGNIARD'S PROGRESSIVE WAVE THEORY
    HEWITTDI.C
    GEOPHYSICS, 1972, 37 (04) : 694 - &
  • [6] A progressive approach for processing satellite data by operational research
    Kuter, Semih
    Weber, Gerhard-Wilhelm
    Akyurek, Zuhal
    OPERATIONAL RESEARCH, 2017, 17 (02) : 371 - 393
  • [7] Convex perceptrons
    Garcia, Daniel
    Gonzalez, Ana
    Dorronsoro, Jose R.
    INTELLIGENT DATA ENGINEERING AND AUTOMATED LEARNING - IDEAL 2006, PROCEEDINGS, 2006, 4224 : 578 - 585
  • [8] QUANTUM PERCEPTRONS
    LEWENSTEIN, M
    JOURNAL OF MODERN OPTICS, 1994, 41 (12) : 2491 - 2501
  • [9] ARITHMETIC PERCEPTRONS
    CANNAS, SA
    NEURAL COMPUTATION, 1995, 7 (01) : 173 - 181