αl1 - βl2 regularization for sparse recovery

Cited by: 9
Authors
Ding, Liang [1 ]
Han, Weimin [2 ]
Affiliations
[1] Northeast Forestry Univ, Dept Math, Harbin 150040, Heilongjiang, Peoples R China
[2] Univ Iowa, Dept Math, Iowa City, IA 52242 USA
Funding
National Natural Science Foundation of China
Keywords
sparsity regularization; non-convex; non-smooth; generalized conditional gradient; soft thresholding algorithm; minimization
DOI
10.1088/1361-6420/ab34b5
Chinese Library Classification (CLC): O29 [Applied Mathematics]
Discipline code: 070104
Abstract
This paper presents a novel regularization method with a non-convex, non-smooth penalty term of the form αl1 − βl2, with parameters α and β, to solve ill-posed linear problems with sparse solutions. We investigate the existence, stability and convergence of the regularized solution, and show that this type of regularization is well-posed and yields sparse solutions. Under an appropriate source condition, we obtain convergence rates for a priori and a posteriori parameter choice rules, respectively. A numerical algorithm based on an iterative thresholding strategy combined with the generalized conditional gradient method is proposed and analyzed. We prove its convergence even though the regularization term is non-smooth and non-convex. The algorithm has a simple structure and is therefore easy to implement. Numerical experiments test the efficiency of the proposed approach and show that αl1 − βl2 regularization performs better than classical l1 sparsity regularization and can be used as an alternative to the l1 regularizer.
Pages: 26