Estimating sparse models from multivariate discrete data via transformed Lasso

Cited by: 0
Authors
Roos, Teemu [1]
Yu, Bin [2]
Affiliations
[1] Univ Helsinki, HIIT, FIN-00014 Helsinki, Finland
[2] Univ Calif Berkeley, Dept Stat, Berkeley, CA 94720 USA
Funding
Academy of Finland
Keywords
REGRESSION; SELECTION
DOI
Not available
Chinese Library Classification
TP301 [Theory, Methods]
Subject Classification Code
081202
Abstract
The type of ℓ1-norm regularization used in the Lasso and related methods typically yields sparse parameter estimates in which most of the estimates are equal to zero. We study a class of estimators obtained by applying a linear transformation to the parameter vector before evaluating the ℓ1 norm. The resulting "transformed Lasso" yields estimates that are "smooth" in a way that depends on the applied transformation. The optimization problem is convex and can be solved efficiently using existing tools. We present two examples: the Haar transform, which corresponds to variable-length Markov chain (context-tree) models, and the Walsh-Hadamard transform, which corresponds to linear combinations of XOR (parity) functions of binary input features.
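As a rough illustration of the construction described in the abstract, the sketch below fits a squared-error variant of the transformed Lasso, minimizing (1/2n)·||y − Xθ||² + λ·||Tθ||₁ with T a Walsh-Hadamard matrix, as a generic convex program. The simulated data, the squared-error loss, the orthonormal scaling of T, the penalty weight `lam`, and the use of cvxpy are illustrative assumptions, not the paper's exact formulation.

```python
# Minimal sketch of a "transformed Lasso": an l1 penalty applied to T @ theta
# rather than to theta itself (assumptions noted in the text above).
import numpy as np
import cvxpy as cp
from scipy.linalg import hadamard

rng = np.random.default_rng(0)
n, p = 200, 8                                      # p must be a power of two for hadamard()
X = rng.integers(0, 2, size=(n, p)).astype(float)  # binary input features
theta_true = np.zeros(p)
theta_true[:2] = [1.5, -2.0]
y = X @ theta_true + 0.1 * rng.standard_normal(n)

T = hadamard(p) / np.sqrt(p)   # Walsh-Hadamard transform (orthonormal scaling assumed)
lam = 0.5                      # regularization weight, chosen arbitrarily for the demo

theta = cp.Variable(p)
objective = cp.Minimize(cp.sum_squares(y - X @ theta) / (2 * n)
                        + lam * cp.norm1(T @ theta))
cp.Problem(objective).solve()  # convex problem; a generic solver handles it
print(np.round(theta.value, 3))
```

Replacing T with a Haar-type transform would, per the abstract, correspond to the variable-length Markov chain (context-tree) setting; with T equal to the identity the penalty reduces to the ordinary Lasso.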
Pages: 287+
Page count: 2
Related Papers
50 records in total
  • [1] Sparse Markov Source Estimation via Transformed Lasso
    Roos, Teemu
    Yu, Bin
    ITW: 2009 IEEE INFORMATION THEORY WORKSHOP ON NETWORKING AND INFORMATION THEORY, 2009 : 241 - +
  • [2] Variable selection in multivariate linear models for functional data via sparse regularization
    Matsui, Hidetoshi
    Umezu, Yuta
    JAPANESE JOURNAL OF STATISTICS AND DATA SCIENCE, 2020, 3 (02) : 453 - 467
  • [3] The Discrete Dantzig Selector: Estimating Sparse Linear Models via Mixed Integer Linear Optimization
    Mazumder, Rahul
    Radchenko, Peter
    IEEE TRANSACTIONS ON INFORMATION THEORY, 2017, 63 (05) : 3053 - 3075
  • [4] Estimating Continuous-Time Models on the Basis of Discrete Data via an Exact Discrete Analog
    McCrorie, J. Roderick
    ECONOMETRIC THEORY, 2009, 25 (04) : 1120 - 1137
  • [5] The Contextual Lasso: Sparse Linear Models via Deep Neural Networks
    Thompson, Ryan
    Dezfouli, Amir
    Kohn, Robert
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023
  • [6] Convex clustering method for compositional data via sparse group lasso
    Wang, Xiaokang
    Wang, Huiwen
    Wang, Shanshan
    Yuan, Jidong
    NEUROCOMPUTING, 2021, 425 : 23 - 36
  • [7] Seagull: lasso, group lasso and sparse-group lasso regularization for linear regression models via proximal gradient descent
    Klosa, Jan
    Simon, Noah
    Westermark, Pål Olof
    Liebscher, Volkmar
    Wittenburg, Dörte
    BMC BIOINFORMATICS, 2020, 21 (01)
  • [8] Estimating Discrete Choice Models with Incomplete Data
    Newman, Jeffrey
    Ferguson, Mark E.
    Garrow, Laurie A.
    TRANSPORTATION RESEARCH RECORD, 2012, (2302) : 130 - 137