Concept Factorization by Joint Locality-constrained and l2,1-norm Regularization for Image Representation

Cited by: 0
Authors
Jiang, Wei [1 ]
Zhang, Jie [1 ,2 ]
Zhang, Yongqing [1 ]
Affiliations
[1] Liaoning Normal Univ, Sch Math, Dalian 116029, Peoples R China
[2] Dalian Univ Technol, Sch Control Sci & Engn, Dalian 116024, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Concept factorization; feature selection; local constraint; clustering; l(2,1)-norm; row sparsity; NONNEGATIVE MATRIX FACTORIZATION; SPARSE REPRESENTATION;
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Concept Factorization (CF) is a variant of Nonnegative Matrix Factorization (NMF) in which each basis vector is expressed as a linear combination of the data points, and each data point in turn is represented as a linear combination of all the bases. However, existing techniques cannot accurately control sparseness. To address this issue, we propose a unified criterion, called Concept Factorization by Joint Locality-constrained and l(2,1)-norm regularization (CF2L), which simultaneously performs concept factorization with a locality constraint and achieves row sparsity. We reformulate the non-negative local coordinate factorization problem and impose the l(2,1)-norm on the coefficient matrix to achieve row sparsity, which leads to the selection of relevant features. An efficient multiplicative updating procedure is derived, and its convergence is guaranteed theoretically. Experiments on benchmark face recognition data sets demonstrate the effectiveness of the proposed algorithm in comparison with state-of-the-art approaches.
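To make the row-sparsity mechanism in the abstract concrete, the following is a minimal sketch (not the authors' implementation) of the l(2,1)-norm of a coefficient matrix: the sum of the Euclidean norms of its rows. Penalizing this quantity drives entire rows of the coefficient matrix toward zero, which is exactly why it acts as a feature-selection regularizer.

```python
import numpy as np

def l21_norm(V):
    """l2,1-norm of a matrix: sum of the Euclidean (l2) norms of its rows."""
    return float(np.sum(np.sqrt(np.sum(V ** 2, axis=1))))

# A row of zeros contributes nothing to the norm, so minimizing the
# l2,1-norm encourages whole rows to vanish (row sparsity): features
# corresponding to zero rows are effectively deselected.
V = np.array([[3.0, 4.0],   # row norm 5.0
              [0.0, 0.0],   # row norm 0.0 -- this feature is "switched off"
              [0.0, 1.0]])  # row norm 1.0
print(l21_norm(V))  # 5.0 + 0.0 + 1.0 = 6.0
```

Unlike the elementwise l1-norm, the l2,1-norm couples the entries within a row, so sparsity emerges at the level of whole rows (features) rather than individual coefficients.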
Pages: 85-103 (19 pages)