Efficient and accurate inference for mixtures of Mallows models with Spearman distance

Cited by: 0
Authors
Marta Crispino
Cristina Mollica
Valerio Astuti
Luca Tardella
Affiliations
[1] Bank of Italy, DG of Economics, Statistics and Research
[2] Sapienza University of Rome, Department of Statistical Sciences
Source
Statistics and Computing | 2023, Vol. 33
Keywords
Ranking data; Distance-based models; Model-based clustering; EM algorithm; Censoring;
DOI: not available
Abstract
The Mallows model (MM) occupies a central role in parametric modelling of ranking data to learn the preferences of a population of judges. Despite the wide range of metrics for rankings that can be considered in the model specification, the choice is typically limited to the Kendall, Cayley or Hamming distances, due to the closed-form expression of the related model normalizing constant. This work instead focuses on the Mallows model with Spearman distance (MMS). A novel approximation of the normalizing constant is introduced to allow inference even with a large number of items. This allows us to develop and implement an efficient and accurate EM algorithm for estimating finite mixtures of MMS aimed at (i) enlarging the applicability to samples drawn from heterogeneous populations, and (ii) dealing with partial rankings affected by diverse forms of censoring. These novelties encompass the critical inferential steps that traditionally limited the use of this distance in practice, and render the MMS comparable (or even preferable) to the MMs with other metrics in terms of computational burden. The inferential ability of the EM scheme and the effectiveness of the approximation are assessed by extensive simulation studies. Finally, applications to three real-world datasets support our proposals, also in comparison with competing mixtures of ranking models.
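The abstract centers on the Spearman distance and its normalizing constant, which lacks a closed form and must otherwise be computed by enumerating all n! rankings. As a minimal illustrative sketch (not the paper's approximation; the function names are my own), the following Python code computes the Spearman distance and evaluates the MMS density via brute-force enumeration, which is only feasible for small n and thus shows exactly the bottleneck the paper addresses:

```python
import itertools
import math

import numpy as np

def spearman_distance(r, rho):
    """Spearman distance: sum of squared rank differences."""
    r, rho = np.asarray(r), np.asarray(rho)
    return int(np.sum((r - rho) ** 2))

def normalizing_constant(theta, n):
    """Exact MMS normalizing constant Z(theta, n) by enumerating all
    n! rankings. By right-invariance of the Spearman distance, Z does
    not depend on the consensus ranking, so the identity is used.
    Feasible only for small n -- the motivation for an approximation."""
    identity = np.arange(1, n + 1)
    return sum(
        math.exp(-theta * spearman_distance(perm, identity))
        for perm in itertools.permutations(range(1, n + 1))
    )

def mms_density(r, rho, theta):
    """MMS probability of ranking r given consensus rho and concentration theta."""
    return math.exp(-theta * spearman_distance(r, rho)) / normalizing_constant(theta, len(r))
```

For example, the reversed ranking of n = 3 items attains the maximum distance from the identity, (1-3)^2 + (2-2)^2 + (3-1)^2 = 8, and at theta = 0 the model reduces to the uniform distribution on the n! rankings.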
Related papers (50 total)
  • [1] Efficient and accurate inference for mixtures of Mallows models with Spearman distance
    Crispino, Marta
    Mollica, Cristina
    Astuti, Valerio
    Tardella, Luca
    [J]. STATISTICS AND COMPUTING, 2023, 33 (05)
  • [2] Sampling and Learning Mallows and Generalized Mallows Models Under the Cayley Distance
    Irurozki, Ekhine
    Calvo, Borja
    Lozano, Jose A.
    [J]. METHODOLOGY AND COMPUTING IN APPLIED PROBABILITY, 2018, 20 (01) : 1 - 35
  • [3] Efficiently Learning Mixtures of Mallows Models
    Liu, Allen X.
    Moitra, Ankur
    [J]. 2018 IEEE 59TH ANNUAL SYMPOSIUM ON FOUNDATIONS OF COMPUTER SCIENCE (FOCS), 2018, : 627 - 638
  • [5] Mixtures of Generalized Mallows Models for Solving the Quadratic Assignment Problem
    Ceberio, Josu
    Santana, Roberto
    Mendiburu, Alexander
    Lozano, Jose A.
    [J]. 2015 IEEE CONGRESS ON EVOLUTIONARY COMPUTATION (CEC), 2015, : 2050 - 2057
  • [6] Efficient and Accurate Learning of Mixtures of Plackett-Luce Models
    Nguyen, Duc
    Zhang, Anderson Y.
    [J]. THIRTY-SEVENTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 37 NO 8, 2023, : 9294 - 9301
  • [7] Kernels of Mallows Models under the Hamming Distance for solving the Quadratic Assignment Problem
    Arza, Etor
    Perez, Aritz
    Irurozki, Ekhine
    Ceberio, Josu
    [J]. SWARM AND EVOLUTIONARY COMPUTATION, 2020, 59
  • [8] Approaching the Quadratic Assignment Problem with Kernels of Mallows Models under the Hamming Distance
    Arza, Etor
    Ceberio, Josu
    Perez, Aritz
    Irurozki, Ekhine
    [J]. PROCEEDINGS OF THE 2019 GENETIC AND EVOLUTIONARY COMPUTATION CONFERENCE COMPANION (GECCO '19 COMPANION), 2019, : 141 - 142
  • [9] Accurate Inference for Adaptive Linear Models
    Deshpande, Yash
    Mackey, Lester
    Syrgkanis, Vasilis
    Taddy, Matt
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 80, 2018, 80
  • [10] Concentric Mixtures of Mallows Models for Top-k Rankings: Sampling and Identifiability
    Collas, Fabien
    Irurozki, Ekhine
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139, 2021, 139