Fast Bootstrapping Nonparametric Maximum Likelihood for Latent Mixture Models

Cited by: 0
Authors
Wang, Shijie [1 ]
Shin, Minsuk [2 ]
Bai, Ray [1 ]
Affiliations
[1] Univ South Carolina, Dept Stat, Columbia, SC 29208 USA
[2] Gauss Labs, Palo Alto, CA 94301 USA
Funding
US National Science Foundation
Keywords
Bootstrap/resampling; deep neural network; generative process; mixing density estimation; nonparametric maximum likelihood estimation; two-stage algorithm; convex optimization; estimator
DOI
10.1109/LSP.2024.3376205
CLC Classification Number
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology]
Discipline Classification Code
0808; 0809
Abstract
Estimating the mixing density of a latent mixture model is an important task in signal processing. Nonparametric maximum likelihood estimation (NPMLE) is one popular approach to this problem. If the latent variable distribution is assumed to be continuous, then bootstrapping can be used to approximate it. However, traditional bootstrapping requires repeated evaluations on resampled data and is not scalable. In this letter, we construct a generative process to rapidly produce NPMLE bootstrap estimates. Our method requires only a single evaluation of a novel two-stage optimization algorithm. Simulations and real data analyses demonstrate that our procedure accurately estimates the mixing density with little computational cost, even when there are a hundred thousand observations.
Pages: 870-874 (5 pages)
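
The abstract describes the algorithmic idea (NPMLE of a mixing density, with bootstrap uncertainty delivered by a generative process) but not the details, which live in the letter itself. For context only, the snippet below is a minimal sketch of the classical grid-based NPMLE for a Gaussian location mixture, fit by the EM fixed-point iteration; it is a standard baseline, not the authors' two-stage generative method. The function name npmle_em, the grid, the noise scale sigma, and the simulated two-point mixing distribution are all illustrative assumptions.

```python
import numpy as np

def npmle_em(x, grid, sigma=1.0, n_iter=200):
    """Grid-based NPMLE of a mixing density for a Gaussian location mixture:
    maximize sum_i log( sum_j w_j * N(x_i | grid_j, sigma^2) ) over the
    mixing weights w via the EM fixed-point iteration."""
    # Likelihood matrix L[i, j] proportional to N(x_i | grid_j, sigma^2);
    # the normalizing constant cancels in the row-wise responsibilities.
    L = np.exp(-0.5 * ((x[:, None] - grid[None, :]) / sigma) ** 2)
    w = np.full(grid.size, 1.0 / grid.size)       # uniform starting weights
    for _ in range(n_iter):
        post = L * w                              # E-step (unnormalized)
        post /= post.sum(axis=1, keepdims=True)   # responsibilities
        w = post.mean(axis=0)                     # M-step: new mixing weights
    return w

# Illustrative use: recover a two-point mixing distribution at +/- 2
# from a hundred thousand observations, as in the abstract's scale.
rng = np.random.default_rng(0)
theta = rng.choice([-2.0, 2.0], size=100_000)    # latent location parameters
x = theta + rng.normal(size=theta.size)          # observed x_i ~ N(theta_i, 1)
grid = np.linspace(-5.0, 5.0, 101)
w = npmle_em(x, grid)
print(grid[w > 0.01], w[w > 0.01])               # mass concentrates near +/- 2
```

Once the support grid is fixed, the log-likelihood is concave in the weights, which is the convex-optimization connection flagged in the keywords; the letter's contribution, per the abstract, is producing bootstrap replicates of such estimates without repeatedly refitting on resampled data.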