Inference for determinantal point processes without spectral knowledge

Cited by: 0
Authors
Bardenet, Remi [1 ,2 ]
Titsias, Michalis K. [3 ]
Affiliations
[1] Univ Lille, CNRS, Lille, France
[2] Univ Lille, CRIStAL, UMR 9189, Lille, France
[3] Athens Univ Econ & Business, Dept Informat, Athens, Greece
Funding
UK Engineering and Physical Sciences Research Council (EPSRC)
Keywords
DOI
Not available
CLC Classification Number
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Determinantal point processes (DPPs) are point process models that naturally encode diversity between the points of a given realization, through a positive definite kernel K. DPPs possess desirable properties, such as exact sampling or analyticity of the moments, but learning the parameters of kernel K through likelihood-based inference is not straightforward. First, the kernel that appears in the likelihood is not K, but another kernel L related to K through an often intractable spectral decomposition. This issue is typically bypassed in machine learning by directly parametrizing the kernel L, at the price of some interpretability of the model parameters. We follow this approach here. Second, the likelihood has an intractable normalizing constant, which takes the form of a large determinant in the case of a DPP over a finite set of objects, and the form of a Fredholm determinant in the case of a DPP over a continuous domain. Our main contribution is to derive bounds on the likelihood of a DPP, both for finite and continuous domains. Unlike previous work, our bounds are cheap to evaluate since they do not rely on approximating the spectrum of a large matrix or an operator. Through usual arguments, these bounds thus yield cheap variational inference and moderately expensive exact Markov chain Monte Carlo inference methods for DPPs.
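For readers unfamiliar with the quantities mentioned in the abstract, the sketch below illustrates the finite-domain case: with an L-ensemble kernel L, the likelihood of observing a subset A is det(L_A)/det(L + I), and the marginal kernel is recovered as K = L(I + L)^{-1}. This is a minimal illustration of why the normalizing constant is a large determinant, not the paper's bounding method; the RBF parametrization of L and all names below are illustrative assumptions.

```python
# Minimal sketch (not the paper's bounds): exact log-likelihood of a finite
# L-ensemble DPP, showing that the normalizing constant is the determinant
# det(L + I) of a large matrix. The RBF parametrization of L and all names
# here are illustrative assumptions, not taken from the paper.
import numpy as np

def dpp_log_likelihood(L, subset):
    """log P(Y = subset) = log det(L[subset, subset]) - log det(L + I)."""
    L_A = L[np.ix_(subset, subset)]
    _, logdet_A = np.linalg.slogdet(L_A)
    _, logdet_Z = np.linalg.slogdet(L + np.eye(L.shape[0]))
    return logdet_A - logdet_Z

# Toy ground set: 200 points in the unit square, with L parametrized directly
# (as the abstract describes) by an RBF kernel with lengthscale 0.1.
rng = np.random.default_rng(0)
X = rng.uniform(size=(200, 2))
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
L = np.exp(-sq_dists / (2 * 0.1 ** 2))
print(dpp_log_likelihood(L, subset=[3, 17, 42, 58]))
```

The cubic cost of the det(L + I) term is what becomes prohibitive for large ground sets; in the continuous case it is replaced by a Fredholm determinant, which is the difficulty the paper's likelihood bounds are designed to avoid.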
Pages: 9