Robust subspace clustering via penalized mixture of Gaussians

Cited by: 22
Authors
Yao, Jing
Cao, Xiangyong
Zhao, Qian [1 ]
Meng, Deyu
Xu, Zongben
Affiliations
[1] Xi An Jiao Tong Univ, Sch Math & Stat, Xian 710049, Shaanxi, Peoples R China
Keywords
Subspace clustering; Low-rank representation; Mixture of Gaussians; Expectation maximization; Segmentation; Factorization; Framework
DOI
10.1016/j.neucom.2017.05.102
CLC number
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Many problems in computer vision and pattern recognition can be posed as learning low-dimensional subspace structures from high-dimensional data. Subspace clustering represents a commonly utilized subspace learning strategy. Existing subspace clustering models mainly adopt a deterministic loss function to describe a certain noise type between an observed data matrix and its self-expressed form. However, the noise embedded in practical high-dimensional data is generally non-Gaussian and has a much more complex structure. To address this issue, this paper proposes a robust subspace clustering model by embedding the Mixture of Gaussians (MoG) noise modeling strategy into the low-rank representation (LRR) subspace clustering model. The proposed MoG-LRR model is capable of adapting to a wider range of noise distributions than current methods, owing to the universal approximation capability of MoG. Additionally, a penalized likelihood method is encoded into this model to facilitate selecting the number of mixture components automatically. A modified Expectation Maximization (EM) algorithm is also designed to infer the parameters involved in the proposed PMoG-LRR model. The superiority of our method is demonstrated by extensive experiments on face clustering and motion segmentation datasets. (C) 2017 Elsevier B.V. All rights reserved.
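The MoG noise-modeling idea summarized in the abstract can be illustrated with a minimal sketch: fit a mixture of Gaussians to scalar residuals via EM, so that a small-variance component captures dense noise and a large-variance component absorbs gross outliers. Everything here (the `mog_em` function, its initialization, the synthetic residuals) is a hypothetical illustration, not the paper's implementation; the actual PMoG-LRR alternates such an update with the LRR self-expression step and uses a penalized likelihood to prune mixture components automatically, which this sketch omits.

```python
import numpy as np

def mog_em(x, k=2, iters=100):
    """Fit a zero-centered 1-D mixture of Gaussians to residuals x via EM.

    Illustrative sketch only: means start at zero (noise residuals) and
    variances start at distinct values to break symmetry between components.
    """
    n = len(x)
    pi = np.full(k, 1.0 / k)                                # mixing weights
    mu = np.zeros(k)                                        # component means
    var = np.var(x) * np.linspace(0.5, 1.5, k) + 1e-6       # component variances
    for _ in range(iters):
        # E-step: responsibilities gamma[i, j] = P(component j | x_i)
        dens = np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        gamma = pi * dens
        gamma /= gamma.sum(axis=1, keepdims=True)
        # M-step: closed-form updates of weights, means, and variances
        nk = gamma.sum(axis=0)
        pi = nk / n
        mu = (gamma * x[:, None]).sum(axis=0) / nk
        var = (gamma * (x[:, None] - mu) ** 2).sum(axis=0) / nk + 1e-6
    return pi, mu, var

# Synthetic residuals: dense small Gaussian noise plus sparse large outliers,
# the kind of complex non-Gaussian noise a single deterministic loss misses.
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0, 0.1, 900), rng.normal(0, 3.0, 100)])
pi, mu, var = mog_em(x, k=2)
```

After fitting, one component should model the dense low-magnitude noise and the other the sparse outliers, which is how a two-component MoG subsumes both the Frobenius-norm (Gaussian) and sparse-outlier loss assumptions of earlier subspace clustering models.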
Pages: 4-11
Number of pages: 8
Related papers
50 records in total
  • [31] Multiview Subspace Clustering via Co-Training Robust Data Representation
    Liu, Jiyuan
    Liu, Xinwang
    Yang, Yuexiang
    Guo, Xifeng
    Kloft, Marius
    He, Liangzhong
    [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2022, 33 (10) : 5177 - 5189
  • [32] Robust sequential subspace clustering via ℓ1-norm temporal graph
    Hu, Wenyu
    Li, Shenghao
    Zheng, Weidong
    Lu, Yao
    Yu, Gaohang
    [J]. NEUROCOMPUTING, 2022, 383 : 380 - 395
  • [33] Robust Normal Estimation of Point Cloud with Sharp Features via Subspace Clustering
    Luo, Pei
    Wu, Zhuangzhi
    Xia, Chunhe
    Feng, Lu
    Jia, Bo
    [J]. FIFTH INTERNATIONAL CONFERENCE ON GRAPHIC AND IMAGE PROCESSING (ICGIP 2013), 2014, 9069
  • [34] Multi-view spectral clustering via robust local subspace learning
    Feng, Lin
    Cai, Lei
    Liu, Yang
    Liu, Shenglan
    [J]. SOFT COMPUTING, 2017, 21 (08) : 1937 - 1948
  • [35] Robust Affine Subspace Clustering via Smoothed l0-Norm
    Dong, Wenhua
    Wu, Xiao-jun
    [J]. NEURAL PROCESSING LETTERS, 2019, 50 (01) : 785 - 797
  • [36] SUBSPACE CLUSTERING VIA THRESHOLDING AND SPECTRAL CLUSTERING
    Heckel, Reinhard
    Boelcskei, Helmut
    [J]. 2013 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2013, : 3263 - 3267
  • [37] Hybrid minimal spanning tree and mixture of gaussians based clustering algorithm
    Vathy-Fogarassy, A
    Kiss, A
    Abonyi, J
    [J]. FOUNDATIONS OF INFORMATION AND KNOWLEDGE SYSTEMS, PROCEEDINGS, 2006, 3861 : 313 - 330
  • [38] SUBSPACE CLUSTERING VIA INDEPENDENT SUBSPACE ANALYSIS NETWORK
    Su, Chunchen
    Wu, Zongze
    Yin, Ming
    Li, KaiXin
    Sun, Weijun
    [J]. 2017 24TH IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP), 2017, : 4217 - 4221
  • [39] Robust Point Matching Using Mixture of Asymmetric Gaussians for Nonrigid Transformation
    Wang, Gang
    Wang, Zhicheng
    Zhao, Weidong
    Zhou, Qiangqiang
    [J]. COMPUTER VISION - ACCV 2014, PT IV, 2015, 9006 : 433 - 444
  • [40] Maximum weighted likelihood via rival penalized EM for density mixture clustering with automatic model selection
    Cheung, YM
    [J]. IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2005, 17 (06) : 750 - 761