Variables selection using L0 penalty

Times cited: 0
Author
Zhang, Tonglin [1 ]
Affiliation
[1] Purdue Univ, Dept Stat, 150 North Univ St, W Lafayette, IN 47907 USA
Keywords
Consistency; Generalized information criterion; Generalized linear models; High-dimensional data; Model size; Penalized maximum likelihood; CENTRAL LIMIT-THEOREMS; TUNING PARAMETER SELECTION; REGRESSION; REGULARIZATION; SUBSET; MODELS; LASSO;
DOI
10.1016/j.csda.2023.107860
CLC number
TP39 [Computer applications];
Discipline code
081203 ; 0835 ;
Abstract
The determination of a tuning parameter by the generalized information criterion (GIC) is an important issue in variable selection. This article shows that the GIC and the L0 penalized objective functions are equivalent, leading to a new L0 penalized maximum likelihood method for high-dimensional generalized linear models. Drawing on techniques from well-known discrete optimization problems in theoretical computer science, a two-step algorithm for local solutions is proposed. The first step optimizes the L0 penalized objective function for a given model size, requiring only a maximum likelihood algorithm. The second step optimizes the L0 penalized objective function over a candidate set of model sizes, requiring only the GIC. Because the tuning parameter can be fixed, its selection can be ignored in the proposed method. The theoretical study shows that the algorithm is polynomial-time and that any resulting local solution is consistent, so the global solution is not needed in practice. Numerical studies show that the proposed method generally outperforms its competitors.
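The two-step structure described in the abstract can be illustrated with a minimal sketch. This is not the paper's algorithm: it assumes a Gaussian linear model (the simplest GLM) and uses brute-force subset search per model size, whereas the paper proposes a polynomial-time local-search maximum-likelihood step. The GIC here takes the standard form -2*loglik + a_n * (model size), with a_n a user-chosen penalty rate (e.g. log n for a BIC-type criterion); all function names are hypothetical.

```python
import itertools
import numpy as np

def loglik_ls(X, y, support):
    """Gaussian log-likelihood (up to constants) of the least-squares
    fit restricted to the given support."""
    Xs = X[:, list(support)]
    beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
    resid = y - Xs @ beta
    n = len(y)
    # Profile log-likelihood: -(n/2) * log(RSS / n)
    return -0.5 * n * np.log(np.mean(resid ** 2))

def two_step_l0(X, y, max_size, a_n):
    """Step 1: for each model size s, find the support maximizing the
    likelihood (brute force here, for illustration only).
    Step 2: minimize GIC(s) = -2*loglik + a_n * s over candidate sizes."""
    n, p = X.shape
    best = {}
    for s in range(1, max_size + 1):
        # Step 1: best support of size s
        sup = max(itertools.combinations(range(p), s),
                  key=lambda c: loglik_ls(X, y, c))
        best[s] = (sup, loglik_ls(X, y, sup))
    # Step 2: GIC over the candidate model sizes
    gic = {s: -2 * ll + a_n * s for s, (sup, ll) in best.items()}
    s_star = min(gic, key=gic.get)
    return best[s_star][0], s_star
```

Because the tuning parameter a_n is fixed in advance, no data-driven tuning-parameter search is performed; the only search is over model sizes, mirroring the method's stated advantage.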
Pages: 18