Sparse functional linear models via calibrated concave-convex procedure

Cited: 0
Authors
Lee, Young Joo [1 ]
Jeon, Yongho [1 ]
Affiliations
[1] Yonsei Univ, Dept Appl Stat, 50 Yonsei Ro, Seoul 03722, South Korea
Funding
National Research Foundation of Singapore;
Keywords
Functional regression; Variable selection; High-dimensional regression; CCCP-SCAD; Gene expression data; VARIABLE SELECTION; CELL-CYCLE; GENE-EXPRESSION; ADAPTIVE LASSO; REGRESSION; GCR2;
DOI
10.1007/s42952-023-00242-3
Chinese Library Classification (CLC)
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Discipline codes
020208; 070103; 0714;
Abstract
In this paper, we propose a calibrated ConCave-Convex Procedure (CCCP) for variable selection in high-dimensional functional linear models. The calibrated CCCP approach for the Smoothly Clipped Absolute Deviation (SCAD) penalty is known to produce a consistent solution path with probability converging to one in linear models. We incorporate the SCAD penalty into function-on-scalar regression models and, via a basis expansion, formulate the estimation as a group-penalized problem. We then apply the calibrated CCCP method to solve the resulting nonconvex group-penalized problem. For tuning, we use the Extended Bayesian Information Criterion (EBIC) to ensure selection consistency in high-dimensional settings. In simulation studies, we compare the proposed method with two existing convex-penalized estimators in terms of variable selection consistency and prediction accuracy. Lastly, we apply the method to a gene expression dataset, sparsely estimating the time-varying effects of transcription factors on the regulation of yeast cell cycle genes.
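The CCCP idea in the abstract can be illustrated in the simpler scalar (non-group) setting: the concave part of the SCAD penalty is replaced by its tangent line at the current iterate, so each step reduces to a convex weighted lasso with weights p'_λ(|β_j|), and initializing at β = 0 makes the first step an ordinary lasso. The following NumPy sketch is for illustration only, under these assumptions; the function names and settings are hypothetical and do not reproduce the authors' group-penalized implementation or EBIC tuning.

```python
import numpy as np

def scad_deriv(t, lam, a=3.7):
    # Derivative p'_lambda(t) of the SCAD penalty for t >= 0 (a = 3.7 is the
    # conventional choice); large coefficients get weight 0, i.e. no shrinkage.
    t = np.abs(t)
    return np.where(t <= lam, lam,
                    np.where(t <= a * lam,
                             np.maximum(a * lam - t, 0.0) / (a - 1), 0.0))

def weighted_lasso_cd(X, y, w, beta0, n_iter=200, tol=1e-8):
    # Coordinate descent for 0.5/n * ||y - X b||^2 + sum_j w_j |b_j|.
    n, p = X.shape
    beta = beta0.copy()
    col_norm = (X ** 2).sum(axis=0) / n
    r = y - X @ beta                      # full residual, kept up to date
    for _ in range(n_iter):
        max_delta = 0.0
        for j in range(p):
            bj_old = beta[j]
            rho = X[:, j] @ r / n + col_norm[j] * bj_old
            # soft-thresholding with coordinate-specific penalty w_j
            bj_new = np.sign(rho) * max(abs(rho) - w[j], 0.0) / col_norm[j]
            if bj_new != bj_old:
                r += X[:, j] * (bj_old - bj_new)
                beta[j] = bj_new
                max_delta = max(max_delta, abs(bj_new - bj_old))
        if max_delta < tol:
            break
    return beta

def cccp_scad(X, y, lam, a=3.7, n_outer=10):
    # CCCP for SCAD: each outer step linearizes the concave part of the
    # penalty, yielding a reweighted lasso with weights p'_lambda(|beta_j|).
    # Starting from beta = 0 makes the first step a plain lasso.
    beta = np.zeros(X.shape[1])
    for _ in range(n_outer):
        w = scad_deriv(beta, lam, a)
        beta = weighted_lasso_cd(X, y, w, beta)
    return beta
```

On a well-conditioned sparse problem, the first (lasso) step screens variables and later reweighted steps remove the lasso's shrinkage bias from the surviving large coefficients, which is the behavior the SCAD penalty is designed to deliver.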
Pages: 189-207
Page count: 19