Maximum Likelihood Estimation of a Low-Rank Probability Mass Tensor From Partial Observations

Cited by: 11
Authors
Yeredor, Arie [1 ]
Haardt, Martin [2 ]
Affiliations
[1] Tel Aviv Univ, Sch Elect Engn, IL-69978 Tel Aviv, Israel
[2] Ilmenau Univ Technol, Commun Res Lab, D-98684 Ilmenau, Germany
Keywords
Probability Mass Functions (PMF); Maximum Likelihood (ML); Coupled Tensor Factorization; Kullback-Leibler Divergence; Expectation-Maximization (EM)
DOI
10.1109/LSP.2019.2938663
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronics and Communication Technology]
Discipline Classification Code
0808; 0809
Abstract
We consider the problem of estimating the Probability Mass Function (PMF) of a discrete random vector (RV) from partial observations, namely when some elements in each observed realization may be missing. Since the PMF takes the form of a multi-way tensor, under certain model assumptions the problem becomes closely associated with tensor factorization. Indeed, in recent studies it was shown that a low-rank PMF tensor can be fully recovered (under some mild conditions) by applying a low-rank (approximate) joint factorization to all estimated joint PMFs of subsets of fixed cardinality larger than two (e.g., triplets). The joint factorization is based on a Least Squares (LS) fit to the estimated lower-order sub-tensors. In this letter we take a different estimation approach by fitting the factorization directly to the observed partial data in the sense of the Kullback-Leibler divergence (KLD). Consequently, we avoid the need for selection and direct estimation of sub-tensors of a particular order, as we inherently apply proper weighting to all the available partial data. We show that our approach essentially attains the Maximum Likelihood estimate of the full PMF tensor (under the low-rank model) and therefore enjoys its well-known properties of consistency and asymptotic efficiency. In addition, based on the Bayesian interpretation of the low-rank model, we propose an Expectation-Maximization (EM) based approach, which is computationally cheap per iteration. Simulation results demonstrate the advantages of our proposed KLD-based hybrid approach (combining alternating-directions minimization with EM) over LS fitting of sub-tensors.
Pages: 1551-1555
Number of pages: 5
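
The abstract describes an EM-based approach built on the Bayesian (latent-variable) interpretation of the low-rank PMF model, in which the tensor factors as a mixture of rank-one product PMFs. Below is a minimal NumPy sketch of such EM updates for partially observed samples. It is an illustration under stated assumptions, not the letter's implementation: the function name `em_lowrank_pmf`, the NaN encoding of missing entries, and all parameter names are hypothetical. The E-step computes the posterior of the latent component from whichever entries of each sample are observed; the M-step re-estimates the mixing weights and the per-variable conditional PMFs from reweighted counts.

```python
import numpy as np

def em_lowrank_pmf(X, card, R, n_iter=200, seed=None, eps=1e-12):
    """Illustrative EM sketch (not the paper's code) for a rank-R PMF model
    P(x_1..x_N) = sum_h lam[h] * prod_n A[n][x_n, h],
    fitted to samples X (T x N) where NaN marks a missing entry."""
    rng = np.random.default_rng(seed)
    T, N = X.shape
    lam = np.full(R, 1.0 / R)                        # mixing weights (latent prior)
    A = [rng.dirichlet(np.ones(card[n]), size=R).T   # card[n] x R conditional PMFs
         for n in range(N)]
    obs = ~np.isnan(X)                               # observation mask

    for _ in range(n_iter):
        # E-step: posterior over the latent component given observed entries only
        Q = np.tile(lam, (T, 1))
        for n in range(N):
            rows = np.where(obs[:, n])[0]
            Q[rows] *= A[n][X[rows, n].astype(int), :]
        Q /= Q.sum(axis=1, keepdims=True) + eps

        # M-step: reweighted empirical frequencies
        lam = Q.mean(axis=0)
        for n in range(N):
            rows = np.where(obs[:, n])[0]
            counts = np.zeros((card[n], R))
            np.add.at(counts, X[rows, n].astype(int), Q[rows])
            A[n] = counts / (counts.sum(axis=0, keepdims=True) + eps)

    return lam, A
```

Usage would look like `lam, A = em_lowrank_pmf(X, card=[5, 5, 5, 5], R=3)` for a four-way PMF with alphabet size 5 per variable and assumed rank 3; each EM iteration costs only simple weighted counting, which reflects the "computationally cheap per iteration" property mentioned in the abstract.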