Small-Variance Asymptotics for Dirichlet Process Mixtures of SVMs

Cited by: 0
|
Authors
Wang, Yining [1 ]
Zhu, Jun [2 ]
Affiliations
[1] Tsinghua Univ, Inst Theoret Comp Sci, Inst Interdisciplinary Informat Sci, Beijing, Peoples R China
[2] Tsinghua Univ, TNList Lab, Dept Comp Sci & Technol, State Key Lab Intel Tech & Syst, Beijing, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
DISTRIBUTIONS; MODELS;
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
Infinite SVM (iSVM) is a Dirichlet process (DP) mixture of large-margin classifiers. Though flexible in learning nonlinear classifiers and discovering latent clustering structures, iSVM poses a difficult inference task, and existing methods can hinder its applicability to large-scale problems. This paper presents a small-variance asymptotic analysis to derive a simple and efficient algorithm, which monotonically optimizes a max-margin DP-means (M²DPM) problem, an extension of DP-means for both predictive learning and descriptive clustering. Our analysis is built on Gibbs infinite SVMs, an alternative DP mixture of large-margin machines, which admits a partially collapsed Gibbs sampler without truncation by exploring data augmentation techniques. Experimental results show that M²DPM runs much faster than similar algorithms without sacrificing prediction accuracy.
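For intuition, the sketch below implements plain DP-means (Kulis & Jordan, 2012), the descriptive-clustering objective that M²DPM extends. The max-margin component contributed by the large-margin classifiers is omitted, and the function name, parameter names, and the choice of the penalty lam are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

# Minimal DP-means sketch (Kulis & Jordan, 2012). M2DPM augments this
# objective with a per-cluster max-margin (hinge-loss) term; that part is
# NOT implemented here. Names and parameters are illustrative.
def dp_means(X, lam, max_iter=100):
    """Hard clustering: open a new cluster whenever a point's squared
    distance to every existing center exceeds the penalty `lam`."""
    n = X.shape[0]
    centers = [X.mean(axis=0)]          # start with a single global cluster
    z = np.zeros(n, dtype=int)          # hard assignments
    for _ in range(max_iter):
        changed = False
        # Assignment step: nearest center, or open a new cluster.
        for i in range(n):
            d2 = np.array([np.sum((X[i] - c) ** 2) for c in centers])
            k = int(np.argmin(d2))
            if d2[k] > lam:             # cheaper to pay lam for a new cluster
                centers.append(X[i].copy())
                k = len(centers) - 1
            if z[i] != k:
                z[i] = k
                changed = True
        # Update step: recompute the mean of every non-empty cluster.
        for k in range(len(centers)):
            members = X[z == k]
            if len(members):
                centers[k] = members.mean(axis=0)
        if not changed:
            break                       # objective cannot decrease further
    return np.array(centers), z

# Usage: two well-separated Gaussian blobs typically yield two clusters.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(8, 1, (50, 2))])
    centers, z = dp_means(X, lam=9.0)
    print(len(centers), "clusters found")
```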
Pages: 2135 - 2141
Number of pages: 7
Related Papers
50 records in total
  • [1] Combinatorial Topic Models using Small-Variance Asymptotics
    Jiang, Ke
    Sra, Suvrit
    Kulis, Brian
    [J]. ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 54, 2017, 54 : 421 - 429
  • [2] Small-Variance Asymptotics for Bayesian Nonparametric Models with Constraints
    Li, Cheng
    Rana, Santu
    Dinh Phung
    Venkatesh, Svetha
    [J]. ADVANCES IN KNOWLEDGE DISCOVERY AND DATA MINING, PART II, 2015, 9078 : 92 - 105
  • [3] Small-Variance Asymptotics for Nonparametric Bayesian Overlapping Stochastic Blockmodels
    Arora, Gundeep
    Porwal, Anupreet
    Agarwal, Kanupriya
    Samdariya, Avani
    Rai, Piyush
    [J]. PROCEEDINGS OF THE TWENTY-SEVENTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2018, : 2000 - 2006
  • [4] Bayesian Hierarchical Clustering with Exponential Family: Small-Variance Asymptotics and Reducibility
    Lee, Juho
    Choi, Seungjin
    [J]. ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 38, 2015, 38 : 581 - 589
  • [5] JUMP-Means: Small-Variance Asymptotics for Markov Jump Processes
    Huggins, Jonathan H.
    Narasimhan, Karthik
    Saeedi, Ardavan
    Mansinghka, Vikash K.
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 37, 2015, 37 : 693 - 701
  • [6] Small-variance asymptotics for non-parametric online robot learning
    Tanwani, Ajay Kumar
    Calinon, Sylvain
    [J]. INTERNATIONAL JOURNAL OF ROBOTICS RESEARCH, 2019, 38 (01): : 3 - 22
  • [7] DP-space: Bayesian Nonparametric Subspace Clustering with Small-variance Asymptotics
    Wang, Yining
    Zhu, Jun
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 37, 2015, 37 : 862 - 870
  • [8] Posterior Asymptotics for Boosted Hierarchical Dirichlet Process Mixtures
    Catalano, Marta
    De Blasi, Pierpaolo
    Lijoi, Antonio
    Prünster, Igor
[J]. JOURNAL OF MACHINE LEARNING RESEARCH, 2022, 23
  • [9] Small-Variance Asymptotics of Hidden Potts-MRFs: Application to Fast Bayesian Image Segmentation
    Pereyra, Marcelo
    McLaughlin, Steve
    [J]. 2014 PROCEEDINGS OF THE 22ND EUROPEAN SIGNAL PROCESSING CONFERENCE (EUSIPCO), 2014, : 1597 - 1601