Subspace clustering via adaptive-loss regularized representation learning with latent affinities

Cited by: 3
Authors
Jiang, Kun [1 ,3 ]
Zhu, Lei [1 ,3 ]
Liu, Zheng [1 ,3 ]
Sun, Qindong [2 ,3 ]
Affiliations
[1] Xian Univ Technol, Sch Comp Sci & Engn, Xian, Peoples R China
[2] Xi An Jiao Tong Univ, Sch Cyber Sci & Engn, Xian, Peoples R China
[3] Xian Univ Technol, Shaanxi Key Lab Network Comp & Secur, Xian, Peoples R China
Keywords
Subspace clustering; Adaptive reconstruction loss; Latent affinity approximation; Robust neighborhood assignment; NONNEGATIVE LOW-RANK; ROBUST; GRAPH; SPARSE;
DOI
10.1007/s10044-024-01226-7
CLC number
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
High-dimensional data that lie on several subspaces tend to be highly correlated and contaminated by various noises, and their affinities across different subspaces are not always reliable, which impedes the effectiveness of subspace clustering. To alleviate these deficiencies, we propose a novel subspace learning model via adaptive-loss regularized representation learning with latent affinities (ALRLA). Specifically, a robust least squares regression with a nonnegative constraint is first proposed to generate more interpretable reconstruction coefficients in the low-dimensional subspace, and an adaptive loss norm specifies the weighted self-representation capability for better robustness and discrimination. Moreover, an adaptive latent graph learning regularizer with an initialized affinity approximation is adopted to provide a more accurate and robust neighborhood assignment for the low-dimensional representations. Finally, the objective model is solved by an alternating optimization algorithm, with theoretical analyses of its convergence and computational complexity. Extensive experiments on benchmark databases demonstrate that the ALRLA model produces more clearly structured representations in redundant and noisy data environments and achieves competitive clustering performance compared with state-of-the-art clustering models.
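The self-expressive representation step that the abstract builds on can be sketched generically. The following minimal NumPy example is illustrative only and is not the authors' ALRLA model: it keeps a plain Frobenius reconstruction loss with a nonnegative constraint, while the paper's adaptive loss norm and latent graph regularizer are omitted, and the helper name `self_expressive_affinity` is hypothetical.

```python
import numpy as np

def self_expressive_affinity(X, lam=0.1, n_iter=500):
    """Nonnegative self-expressive coefficients via projected gradient descent:

        min_Z ||X - X Z||_F^2 + lam * ||Z||_F^2   s.t.  Z >= 0, diag(Z) = 0

    X holds one data point per column. This is a generic sketch of the
    self-representation idea, not the ALRLA objective itself.
    """
    n = X.shape[1]
    G = X.T @ X                                      # Gram matrix
    L = 2.0 * (np.linalg.eigvalsh(G).max() + lam)    # gradient Lipschitz constant
    Z = np.zeros((n, n))
    for _ in range(n_iter):
        grad = 2.0 * (G @ Z - G) + 2.0 * lam * Z
        Z = np.maximum(Z - grad / L, 0.0)            # project onto Z >= 0
        np.fill_diagonal(Z, 0.0)                     # forbid trivial self-representation
    return 0.5 * (Z + Z.T)                           # symmetrize into an affinity matrix

# Toy data: 8 points on two orthogonal 1-D subspaces (lines) in R^2.
t = np.array([1.0, 2.0, 3.0, 4.0])
X = np.hstack([np.outer([1.0, 0.0], t),   # subspace 1: the x-axis
               np.outer([0.0, 1.0], t)])  # subspace 2: the y-axis
W = self_expressive_affinity(X)
# Points reconstruct each other only within their own subspace, so W is
# (near) block-diagonal: within-block affinity mass dominates cross-block mass.
within = W[:4, :4].sum() + W[4:, 4:].sum()
cross = W[:4, 4:].sum() + W[4:, :4].sum()
```

Spectral clustering on the symmetrized affinity `W` then recovers the subspace memberships; in the full model, the latent graph regularizer would additionally refine the neighborhood assignment during the alternating optimization.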
Pages: 15
Related Papers (50 records)
  • [31] Discriminative latent subspace learning with adaptive metric learning
    Ma, Jiajun
    Tang, Yuan Yan
    Shang, Zhaowei
    NEURAL COMPUTING & APPLICATIONS, 2024, 36 (04): : 2049 - 2066
  • [33] Convex Subspace Clustering by Adaptive Block Diagonal Representation
    Lin, Yunxia
    Chen, Songcan
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2023, 34 (12) : 10065 - 10078
  • [34] Deep document clustering via adaptive hybrid representation learning
    Ren, Lina
    Qin, Yongbin
    Chen, Yanping
    Lin, Chuan
    Huang, Ruizhang
    KNOWLEDGE-BASED SYSTEMS, 2023, 281
  • [35] Adaptive graph-regularized fixed rank representation for subspace segmentation
    Wei, Lai
    Zhou, Rigui
    Zhu, Changming
    Zhang, Xiafen
    Yin, Jun
    PATTERN ANALYSIS AND APPLICATIONS, 2020, 23 (01) : 443 - 453
  • [37] Unsupervised Cross-View Subspace Clustering via Adaptive Contrastive Learning
    Zhang, Zihao
    Wang, Qianqian
    Gao, Quanxue
    Pei, Chengquan
    Feng, Wei
    IEEE TRANSACTIONS ON BIG DATA, 2024, 10 (05) : 609 - 619
  • [38] Tensor clustering via adaptive subspace iteration
    Peng, Wei
    Li, Tao
    INTELLIGENT DATA ANALYSIS, 2011, 15 (05) : 695 - 713
  • [39] Robust subspace clustering via symmetry constrained latent low rank representation with converted nuclear norm
    Fang, Xian
    Tie, Zhixin
    Song, Feiyang
    Yang, Jialiang
    NEUROCOMPUTING, 2019, 340 : 211 - 221
  • [40] Subspace Clustering via Structured Sparse Relation Representation
    Wei, Lai
    Ji, Fenfen
    Liu, Hao
    Zhou, Rigui
    Zhu, Changming
    Zhang, Xiafen
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2022, 33 (09) : 4610 - 4623