Fast Bootstrapping Nonparametric Maximum Likelihood for Latent Mixture Models

Cited: 0
Authors
Wang, Shijie [1 ]
Shin, Minsuk [2 ]
Bai, Ray [1 ]
Affiliations
[1] Univ South Carolina, Dept Stat, Columbia, SC 29208 USA
[2] Gauss Labs, Palo Alto, CA 94301 USA
Funding
U.S. National Science Foundation
Keywords
Bootstrap/resampling; deep neural network; generative process; mixing density estimation; nonparametric maximum likelihood estimation; two-stage algorithm; CONVEX-OPTIMIZATION; ESTIMATOR;
DOI
10.1109/LSP.2024.3376205
Chinese Library Classification (CLC)
TM [Electrical engineering]; TN [Electronics and communication technology]
Discipline codes
0808; 0809
Abstract
Estimating the mixing density of a latent mixture model is an important task in signal processing. Nonparametric maximum likelihood estimation is one popular approach to this problem. If the latent variable distribution is assumed to be continuous, then bootstrapping can be used to approximate it. However, traditional bootstrapping requires repeated evaluations on resampled data and is not scalable. In this letter, we construct a generative process to rapidly produce nonparametric maximum likelihood bootstrap estimates. Our method requires only a single evaluation of a novel two-stage optimization algorithm. Simulations and real data analyses demonstrate that our procedure accurately estimates the mixing density with little computational cost even when there are a hundred thousand observations.
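To make the computational bottleneck concrete, the sketch below implements a standard baseline, not the letter's method: a fixed-grid EM solver for the Kiefer-Wolfowitz NPMLE of a Gaussian mixing density, wrapped in a traditional bootstrap that refits the NPMLE on every resample. The function names, the unit-variance Gaussian kernel, and the grid choice are illustrative assumptions; the paper's contribution is precisely to replace the `B` repeated refits with a single evaluation of a two-stage generative algorithm.

```python
import numpy as np

def npmle_em(y, grid, n_iter=200):
    """Fixed-grid EM for the NPMLE mixing weights, assuming y_i ~ N(theta_i, 1)
    with theta_i drawn from an unknown mixing distribution G.
    This is a baseline (not the paper's two-stage algorithm)."""
    # Likelihood matrix: L[i, j] = standard normal density of y_i at grid point j.
    L = np.exp(-0.5 * (y[:, None] - grid[None, :]) ** 2) / np.sqrt(2.0 * np.pi)
    w = np.full(grid.size, 1.0 / grid.size)  # uniform initial weights on the grid
    for _ in range(n_iter):
        post = L * w                              # unnormalized posterior over grid points
        post /= post.sum(axis=1, keepdims=True)   # normalize each observation's posterior
        w = post.mean(axis=0)                     # EM update: average posterior mass
    return w

def bootstrap_npmle(y, grid, B=20, seed=0):
    """Traditional bootstrap: B full NPMLE refits on resampled data.
    Each iteration repeats the entire EM fit, which is the scalability
    problem the letter's generative approach is designed to avoid."""
    rng = np.random.default_rng(seed)
    return np.stack([
        npmle_em(rng.choice(y, size=y.size, replace=True), grid)
        for _ in range(B)
    ])
```

Each bootstrap replicate costs a full EM run over the data, so the total cost scales as `B` times the single-fit cost; the letter's generative process amortizes this into one optimization whose output can be sampled cheaply.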
Pages: 870-874
Page count: 5