Estimating the number of components in a mixture of multilayer perceptrons

Cited by: 5
Authors:
Olteanu, M. [1]
Rynkiewicz, J. [1]
Affiliations:
[1] Univ Paris 01, SAMOS MATISSE CES, UMR 8174, F-75013 Paris, France
Keywords:
penalized likelihood; Bayesian information criterion (BIC); mixture models; multilayer perceptrons
DOI:
10.1016/j.neucom.2007.12.022
Chinese Library Classification (CLC):
TP18 [Artificial Intelligence Theory]
Discipline codes:
081104; 0812; 0835; 1405
Abstract
The Bayesian information criterion (BIC) is widely used by the neural-network community for model selection tasks, although its convergence properties are not always theoretically established. In this paper we focus on estimating the number of components in a mixture of multilayer perceptrons and prove the convergence of the BIC criterion in this framework. The penalized marginal-likelihood criterion introduced for mixture models by Keribin [Consistent estimation of the order of mixture models, Sankhya Indian J. Stat. 62 (2000) 49-66] and for hidden Markov models by Gassiat [Likelihood ratio inequalities with applications to various mixtures, Ann. Inst. Henri Poincare 38 (2002) 897-906] is extended to mixtures of multilayer perceptrons, for which a penalized-likelihood criterion is proposed. We prove its convergence under hypotheses which essentially involve the bracketing entropy of the generalized score-function class, and illustrate the result with numerical examples. (c) 2008 Elsevier B.V. All rights reserved.
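As a purely illustrative aside (not part of the original record, nor the paper's own estimator): the criterion described above is of the familiar BIC type, i.e. choose the number of components k minimizing -2 * log-likelihood(k) + dim(k) * log(n), where dim(k) is the number of free parameters and n the sample size. The minimal sketch below shows this selection rule in Python, using scikit-learn's GaussianMixture and its bic() method as a stand-in for a mixture of multilayer perceptrons; the synthetic data, the candidate range k = 1..5, and the random seeds are assumptions made only for illustration.

```python
# Illustrative sketch only: BIC-type penalized-likelihood selection of the
# number of mixture components. A Gaussian mixture stands in for the paper's
# mixture of multilayer perceptrons; the selection rule has the same form:
#   BIC(k) = -2 * max log-likelihood(k) + dim(k) * log(n), keep the k minimizing it.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Synthetic 1-D data drawn from two well-separated components (assumption).
X = np.vstack([rng.normal(-2.0, 1.0, size=(200, 1)),
               rng.normal(3.0, 0.5, size=(200, 1))])

bic_scores = {}
for k in range(1, 6):
    gm = GaussianMixture(n_components=k, random_state=0).fit(X)
    bic_scores[k] = gm.bic(X)  # -2*log-lik + n_params*log(n); lower is better

best_k = min(bic_scores, key=bic_scores.get)
print("BIC per candidate number of components:", bic_scores)
print("Selected number of components:", best_k)
```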
Citation:
Pages: 1321 - 1329 (9 pages)
Related papers
50 entries in total
  • [41] DYNAMIC SIZING OF MULTILAYER PERCEPTRONS
    APOLLONI, B
    RONCHINI, G
    BIOLOGICAL CYBERNETICS, 1994, 71 (01) : 49 - 63
  • [42] ON THE INITIALIZATION AND OPTIMIZATION OF MULTILAYER PERCEPTRONS
    WEYMAERE, N
    MARTENS, JP
    IEEE TRANSACTIONS ON NEURAL NETWORKS, 1994, 5 (05): : 738 - 751
  • [43] Multilayer perceptrons on Splash 2
    Ratha, NK
    Jain, AK
    CAMP'97 - FOURTH IEEE INTERNATIONAL WORKSHOP ON COMPUTER ARCHITECTURE FOR MACHINE PERCEPTION, PROCEEDINGS, 1997, : 138 - 142
  • [44] On the weight sparsity of multilayer perceptrons
    Drakopoulos, Georgios
    Megalooikonomou, Vasileios
    2015 6TH INTERNATIONAL CONFERENCE ON INFORMATION, INTELLIGENCE, SYSTEMS AND APPLICATIONS (IISA), 2015,
  • [45] Fast training of multilayer perceptrons
    Verma, B
    IEEE TRANSACTIONS ON NEURAL NETWORKS, 1997, 8 (06): : 1314 - 1320
  • [46] DETECTING THE NUMBER OF COMPONENTS IN A FINITE MIXTURE HAVING NORMAL COMPONENTS
    MAINE, M
    BOULLION, T
    RIZZUTO, GT
    COMMUNICATIONS IN STATISTICS-THEORY AND METHODS, 1991, 20 (02) : 611 - 620
  • [47] Infinite-dimensional multilayer perceptrons
    Middle East Technical Univ, Ankara, Turkey
    IEEE TRANSACTIONS ON NEURAL NETWORKS, 4 (889-896)
  • [48] Entropy minimization algorithm for multilayer perceptrons
    Erdogmus, D
    Principe, JC
    IJCNN'01: INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, VOLS 1-4, PROCEEDINGS, 2001, : 3003 - 3008
  • [49] Parameter by parameter algorithm for multilayer perceptrons
    Li, YL
    Zhang, D
    Wang, KQ
    NEURAL PROCESSING LETTERS, 2006, 23 (02) : 229 - 242
  • [50] Statistical active learning in multilayer perceptrons
    Fukumizu, K
    IEEE TRANSACTIONS ON NEURAL NETWORKS, 2000, 11 (01): : 17 - 26