Progressive Operational Perceptrons

Cited by: 36
Authors
Kiranyaz, Serkan [1 ]
Ince, Turker [2 ]
Iosifidis, Alexandros [3 ]
Gabbouj, Moncef [3 ]
Affiliations
[1] Qatar Univ, Dept Elect Engn, Doha, Qatar
[2] Izmir Univ Econ, Dept Elect Engn, Izmir, Turkey
[3] Tampere Univ Technol, Dept Signal Proc, Tampere, Finland
Keywords
Artificial neural networks; Multi-layer perceptrons; Progressive operational perceptrons; Diversity; Scalability; Network
DOI
10.1016/j.neucom.2016.10.044
Chinese Library Classification
TP18 [Artificial intelligence theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
There are well-known limitations and drawbacks in the performance and robustness of feed-forward, fully connected Artificial Neural Networks (ANNs), the so-called Multi-Layer Perceptrons (MLPs). In this study we address them with Generalized Operational Perceptrons (GOPs), which consist of neurons with distinct (non-)linear operators, yielding a generalized model of the biological neuron and ultimately a superior diversity. We modify the conventional back-propagation (BP) algorithm to train GOPs and further propose Progressive Operational Perceptrons (POPs) to achieve self-organized, depth-adaptive GOPs tailored to the learning problem. The most crucial property of POPs is their ability to search for the optimal operator set while training each layer individually. The final POP is therefore formed layer by layer, and we show that this ability enables POPs of minimal network depth to attack the most challenging learning problems, which cannot be learned by conventional ANNs even with deeper and significantly more complex configurations. Experimental results show that POPs scale up very well with the problem size and have the potential to achieve superior generalization performance on real benchmark problems with a significant gain.
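The abstract describes a GOP neuron as generalizing the classical MLP neuron's fixed multiply/sum/activate triple into a searchable set of nodal, pool, and activation operators. A minimal sketch of a single GOP neuron under that decomposition; the specific operator names, choices, and function signatures here are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

# Candidate operator sets. The classical MLP neuron corresponds to the
# triple (multiplication, summation, tanh); POP training would search
# over these sets layer by layer. The entries below are assumed examples.
NODAL = {
    "multiplication": lambda x, w: x * w,                 # classical MLP nodal op
    "exponential":    lambda x, w: np.exp(x * w) - 1.0,
    "sinusoid":       lambda x, w: np.sin(x * w),
}

POOL = {
    "summation": np.sum,                                  # classical MLP pool op
    "maximum":   np.max,
    "median":    np.median,
}

ACTIVATION = {
    "tanh":    np.tanh,
    "sigmoid": lambda z: 1.0 / (1.0 + np.exp(-z)),
}

def gop_neuron(x, w, bias, nodal="multiplication",
               pool="summation", act="tanh"):
    """Output of one GOP neuron: activation(pool(nodal(x, w)) + bias).
    With the default operator triple this reduces to an MLP neuron."""
    z = POOL[pool](NODAL[nodal](x, w)) + bias
    return ACTIVATION[act](z)

x = np.array([0.5, -1.0, 2.0])
w = np.array([0.1, 0.3, -0.2])
y_mlp = gop_neuron(x, w, bias=0.0)                        # classical perceptron
y_gop = gop_neuron(x, w, bias=0.0,
                   nodal="sinusoid", pool="median")        # alternative operators
```

With the default operators the function is exactly tanh(w·x + b), which is why the GOP family strictly contains the MLP as a special case; the diversity the abstract refers to comes from letting each layer pick a different triple.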
Pages: 142-154 (13 pages)
Related Papers
50 records in total
  • [41] Damage Detection Under Progressive Operational Degradation of Structures in Real Time
    Bhowmik, Basuraj
    Tripura, Tapas
    Hazra, Budhaditya
    Pakrashi, Vikram
    8th IOMAC International Operational Modal Analysis Conference, 2019: 137-145
  • [42] Discriminant parallel perceptrons
    González, A
    Cantador, I
    Dorronsoro, JR
    Artificial Neural Networks: Formal Models and Their Applications - ICANN 2005, Pt 2, Proceedings, 2005, 3697: 13-18
  • [43] Evolving Multilayer Perceptrons
    P. A. Castillo
    J. Carpio
    J. J. Merelo
    A. Prieto
    V. Rivas
    G. Romero
    Neural Processing Letters, 2000, 12: 115-128
  • [44] Perceptrons Above Saturation
    Majer, P.
    Engel, A.
    Zippelius, A.
    Journal of Physics A: Mathematical and General, 1993, 26(24): 7405-7416
  • [45] Perceptrons of large weight
    Podolskii, Vladimir V.
    Computer Science - Theory and Applications, 2007, 4649: 328-336
  • [46] Perceptrons and Multilayer Perceptrons in Speech Recognition - Improvements from Temporal Warping of the Training Material
    Kammerer, B.
    Kupper, W.
    Neural Networks from Models to Applications, 1989: 531-540
  • [47] Enhancing perceptrons with contrastive biclusters
    Coelho, A. L. V.
    de Franca, F. O.
    Electronics Letters, 2016, 52(24): 1974-1975
  • [48] Techniques for the Minimization of Multilayer Perceptrons
    Mirzai, A. R.
    Higgins, A.
    Tsaptsinos, D.
    Engineering Applications of Artificial Intelligence, 1993, 6(03): 265-277
  • [49] On the Decision Regions of Multilayer Perceptrons
    Gibson, G. J.
    Cowan, C. F. N.
    Proceedings of the IEEE, 1990, 78(10): 1590-1594
  • [50] Learning Unlearnable Problems with Perceptrons
    Watkin, T. L. H.
    Rau, A.
    Physical Review A, 1992, 45(06): 4102-4110