L1-Norm Sparse Learning and Its Application

Cited by: 2
Authors
Zhu Xin-feng [1 ,2 ]
Li Bin [1 ]
Wang Jian-dong [2 ]
Affiliations
[1] Yangzhou Univ, Coll Informat Technol, Yangzhou, Peoples R China
[2] Nanjing Univ Aeronaut & Astronaut, Coll Comp Sci & Technol, Nanjing, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
sparse learning; L1-norm; Bayesian learning; L1-norm Bayesian learning; MCR; BSMCR; REGRESSION; SELECTION;
DOI
10.4028/www.scientific.net/AMM.88-89.379
Chinese Library Classification (CLC)
T [Industrial Technology];
Subject Classification Code
08;
Abstract
The need to find sparse representations has attracted growing research interest. Researchers have developed many approaches for encouraging sparse solutions (such as nonnegative constraints, L1-norm sparsity regularization, and sparse Bayesian learning with an independent Gaussian prior) and have established conditions under which feasible solutions can be found by those approaches. This paper combines L1-norm regularization with Bayesian learning, yielding what we call L1-norm sparse Bayesian learning, inspired by the RVM (relevance vector machine). L1-norm sparse Bayesian learning has found applications in many fields, such as MCR (multivariate curve resolution). We propose a new method called BSMCR (Bayesian sparse MCR) to improve the quality of the resolution result.
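The record does not reproduce the paper's derivation of L1-norm sparse Bayesian learning or the BSMCR algorithm, so the snippet below is only a minimal, generic sketch of the mechanism the abstract relies on: an L1-norm penalty driving regression coefficients to exactly zero, solved here with the standard ISTA (iterative soft-thresholding) scheme. The function names, penalty weight, and synthetic data are illustrative assumptions, not the paper's method.

```python
# Minimal sketch of L1-norm sparse regression (the Lasso) solved by ISTA.
# This is a generic illustration of L1-induced sparsity, NOT the paper's
# L1-norm sparse Bayesian learning or BSMCR algorithm.
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of the L1 norm: shrink each entry toward zero by t."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def lasso_ista(X, y, lam=0.1, n_iter=500):
    """Minimize 0.5*||y - Xw||^2 + lam*||w||_1 by iterative soft-thresholding."""
    n, d = X.shape
    w = np.zeros(d)
    # Step size 1/L, where L bounds the Lipschitz constant of the gradient.
    L = np.linalg.norm(X, 2) ** 2
    for _ in range(n_iter):
        grad = X.T @ (X @ w - y)              # gradient of the squared-error term
        w = soft_threshold(w - grad / L, lam / L)
    return w

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 20))
    w_true = np.zeros(20)
    w_true[[2, 7, 15]] = [1.5, -2.0, 0.8]     # only 3 of 20 coefficients are nonzero
    y = X @ w_true + 0.05 * rng.standard_normal(100)
    w_hat = lasso_ista(X, y, lam=0.5)
    print("nonzero indices:", np.flatnonzero(np.abs(w_hat) > 1e-3))
```

The soft-thresholding step is what sets most coefficients exactly to zero, which is the sparsity effect the abstract refers to; a Bayesian treatment in the spirit of the RVM would instead place priors over the coefficients rather than fixing a single penalty weight.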
Pages: 379+
Page count: 2
Related Papers
50 records in total
  • [1] Bayesian L1-norm sparse learning
    Lin, Yuanqing
    Lee, Daniel D.
    [J]. 2006 IEEE International Conference on Acoustics, Speech and Signal Processing, Vols 1-13, 2006: 5463 - 5466
  • [2] DICTIONARY LEARNING FOR SPARSE REPRESENTATION USING WEIGHTED l1-NORM
    Zhao, Haoli
    Ding, Shuxue
    Li, Yujie
    Li, Zhenni
    Li, Xiang
    Tan, Benying
    [J]. 2016 IEEE GLOBAL CONFERENCE ON SIGNAL AND INFORMATION PROCESSING (GLOBALSIP), 2016: 292 - 296
  • [3] Supporting vectors for the l1-norm and the l∞-norm and an application
    Sanchez-Alzola, Alberto
    Garcia-Pacheco, Francisco Javier
    Naranjo-Guerra, Enrique
    Moreno-Pulido, Soledad
    [J]. MATHEMATICAL SCIENCES, 2021, 15 (02) : 173 - 187
  • [4] Reweighted l1-norm penalized LMS for sparse channel estimation and its analysis
    Taheri, Omid
    Vorobyov, Sergiy A.
    [J]. SIGNAL PROCESSING, 2014, 104 : 70 - 79
  • [5] Sparse portfolio selection via the sorted l1-Norm
    Kremer, Philipp J.
    Lee, Sangkyun
    Bogdan, Malgorzata
    Paterlini, Sandra
    [J]. JOURNAL OF BANKING & FINANCE, 2020, 110
  • [6] BEYOND l1-NORM MINIMIZATION FOR SPARSE SIGNAL RECOVERY
    Mansour, Hassan
    [J]. 2012 IEEE STATISTICAL SIGNAL PROCESSING WORKSHOP (SSP), 2012: 337 - 340
  • [7] Sparse index clones via the sorted l1-Norm
    Kremer, Philipp J.
    Brzyski, Damian
    Bogdan, Malgorzata
    Paterlini, Sandra
    [J]. QUANTITATIVE FINANCE, 2022, 22 (02) : 349 - 366
  • [8] Improve robustness of sparse PCA by L1-norm maximization
    Meng, Deyu
    Zhao, Qian
    Xu, Zongben
    [J]. PATTERN RECOGNITION, 2012, 45 (01) : 487 - 497
  • [9] L1-norm plus L2-norm sparse parameter for image recognition
    Feng, Qingxiang
    Zhu, Qi
    Tang, Lin-Lin
    Pan, Jeng-Shyang
    [J]. OPTIK, 2015, 126 (23): 4078 - 4082
  • [10] Robust Dictionary Learning with Capped l1-Norm
    Jiang, Wenhao
    Nie, Feiping
    Huang, Heng
    [J]. PROCEEDINGS OF THE TWENTY-FOURTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE (IJCAI), 2015: 3590 - 3596