Non-convex low-rank representation combined with rank-one matrix sum for subspace clustering

Cited: 0
Authors
Liu, Xiaofang [2 ]
Wang, Jun [1 ]
Cheng, Dansong [1 ]
Shi, Daming [3 ]
Zhang, Yongqiang [1 ]
Affiliations
[1] Harbin Inst Technol, Sch Comp Sci & Technol, 92 West Dazhi St, Harbin, Peoples R China
[2] Harbin Inst Technol, Sch Elect Engn & Automat, 92 West Dazhi St, Harbin, Peoples R China
[3] Shenzhen Univ, Coll Comp & Software, 3688 Nanhai Ave, Shenzhen, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Subspace clustering; Non-convex low-rank representation; Block coordinate descent; Rank-one matrix; DECISION-MAKING; ALGORITHM;
DOI
10.1007/s00500-020-04865-0
CLC number
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Exploring the multiple subspace structures of data, for example via low-rank representation, is effective for subspace clustering. Non-convex low-rank representation (NLRR) via matrix factorization is one of the state-of-the-art techniques for subspace clustering. However, NLRR cannot scale to problems with large n (the number of samples), as it requires solving an n×n linear system at each iteration. To address this issue, we propose a novel approach, NLRR++, which reformulates NLRR as a sum of rank-one components and applies column-wise block coordinate descent to update each component iteratively. NLRR++ reduces the time complexity per iteration from O(n^3) to O(mnd) and the memory complexity from O(n^2) to O(mn), where m is the dimensionality and d is the target rank (usually d ≪ m ≪ n). Experimental results on simulations and real datasets demonstrate the efficiency and effectiveness of NLRR++: it is not only much faster than NLRR, but also scalable to large datasets such as ImageNet with 120K samples.
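The core idea described in the abstract, writing the representation as a sum of rank-one components and refitting one component at a time by column-wise block coordinate descent, can be illustrated with a minimal sketch. The sketch below is an assumption-laden simplification, not the authors' implementation: it fits a plain least-squares factorization X ≈ Σ_i u_i v_i^T instead of the full NLRR++ objective (which also involves the self-expressive term and a non-convex low-rank penalty), and the function name rank_one_bcd and its parameters are hypothetical.

import numpy as np

def rank_one_bcd(X, d, n_iters=20, seed=0):
    # Approximate X (m x n) by a sum of d rank-one terms u_i v_i^T,
    # sweeping over the terms and refitting each one against the
    # residual while the other terms are held fixed (column-wise
    # block coordinate descent).  Each block update touches O(mn) data.
    rng = np.random.default_rng(seed)
    m, n = X.shape
    U = np.zeros((m, d))
    V = rng.standard_normal((n, d))
    R = X - U @ V.T                      # residual; equals X at the start
    for _ in range(n_iters):
        for i in range(d):
            # Put component i back into the residual before refitting it.
            R += np.outer(U[:, i], V[:, i])
            # Closed-form least-squares updates for this rank-one block.
            vi = V[:, i]
            U[:, i] = R @ vi / max(vi @ vi, 1e-12)
            ui = U[:, i]
            V[:, i] = R.T @ ui / max(ui @ ui, 1e-12)
            # Subtract the refreshed component again.
            R -= np.outer(U[:, i], V[:, i])
    return U, V

# Usage example: for an exactly rank-5 matrix the relative error should be small.
rng = np.random.default_rng(1)
X = rng.standard_normal((50, 5)) @ rng.standard_normal((5, 200))
U, V = rank_one_bcd(X, d=5)
print(np.linalg.norm(X - U @ V.T) / np.linalg.norm(X))

One full sweep over the d components costs O(mnd), and the memory footprint is dominated by X and the residual, i.e. O(mn), which is consistent with the complexity discussion above.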
Pages: 15317-15326
Page count: 10
Related papers
(50 records in total)
  • [31] Online learning for low-rank representation and its application in subspace clustering
    Li, Lingzhi
    Zou, Beiji
    Zhang, Xiaoyun
    Journal of Computational Information Systems, 2014, 10 (16): 7125 - 7135
  • [32] A subspace clustering algorithm based on simultaneously sparse and low-rank representation
    Liu, Xiaolan
    Yi, Miao
    Han, Le
    Deng, Xue
    JOURNAL OF INTELLIGENT & FUZZY SYSTEMS, 2017, 33 (01) : 621 - 633
  • [33] Bilateral Fast Low-Rank Representation With Equivalent Transformation for Subspace Clustering
    Shen, Qiangqiang
    Yi, Shuangyan
    Liang, Yongsheng
    Chen, Yongyong
    Liu, Wei
    IEEE TRANSACTIONS ON MULTIMEDIA, 2023, 25 : 6371 - 6383
  • [34] Dual Graph Regularized Latent Low-Rank Representation for Subspace Clustering
    Yin, Ming
    Gao, Junbin
    Lin, Zhouchen
    Shi, Qinfeng
    Guo, Yi
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2015, 24 (12) : 4918 - 4933
  • [35] Adaptive low-rank kernel block diagonal representation subspace clustering
    Liu, Maoshan
    Wang, Yan
    Sun, Jun
    Ji, Zhicheng
    Applied Intelligence, 2022, 52 : 2301 - 2316
  • [36] GRAPH-LAPLACIAN CORRELATED LOW-RANK REPRESENTATION FOR SUBSPACE CLUSTERING
    Wang, Tuo
    Cai, Huayue
    Zhang, Xiang
    Lan, Long
    Huang, Xuhui
    Luo, Zhigang
    2018 25TH IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP), 2018: 3748 - 3752
  • [37] Tensor subspace clustering using consensus tensor low-rank representation
    Cai, Bing
    Lu, Gui-Fu
    INFORMATION SCIENCES, 2022, 609 : 46 - 59
  • [38] Multimodal sparse and low-rank subspace clustering
    Abavisani, Mahdi
    Patel, Vishal M.
    INFORMATION FUSION, 2018, 39 : 168 - 177
  • [39] Low-Rank and Structured Sparse Subspace Clustering
    Zhang, Junjian
    Li, Chun-Guang
    Zhang, Honggang
    Guo, Jun
    2016 30TH ANNIVERSARY OF VISUAL COMMUNICATION AND IMAGE PROCESSING (VCIP), 2016
  • [40] Low-Rank Sparse Subspace for Spectral Clustering
    Zhu, Xiaofeng
    Zhang, Shichao
    Li, Yonggang
    Zhang, Jilian
    Yang, Lifeng
    Fang, Yue
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2019, 31 (08) : 1532 - 1543