Multiplicative updates for L1-regularized linear and logistic regression

Cited by: 0
Authors
Sha, Fei [1]
Park, Y. Albert [2]
Saul, Lawrence K. [2]
Affiliations
[1] Univ Calif Berkeley, Div Comp Sci, Berkeley, CA 94720 USA
[2] Univ Calif San Diego, Dept Comp Sci & Engn, La Jolla, CA 92093 USA
Funding
US National Science Foundation
DOI
Not available
Chinese Library Classification (CLC)
TP [Automation Technology; Computer Technology]
Discipline code
0812
Abstract
Multiplicative update rules have proven useful in many areas of machine learning. Simple to implement, guaranteed to converge, they account in part for the widespread popularity of algorithms such as nonnegative matrix factorization and Expectation-Maximization. In this paper, we show how to derive multiplicative updates for problems in L1-regularized linear and logistic regression. For L1-regularized linear regression, the updates are derived by reformulating the required optimization as a problem in nonnegative quadratic programming (NQP). The dual of this problem, itself an instance of NQP, can also be solved using multiplicative updates; moreover, the observed duality gap can be used to bound the error of intermediate solutions. For L1-regularized logistic regression, we derive similar updates using an iteratively reweighted least squares approach. We present illustrative experimental results and describe efficient implementations for large-scale problems of interest (e.g., with tens of thousands of examples and over one million features).
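The NQP reformulation sketched in the abstract can be made concrete. Below is a minimal NumPy sketch (not the authors' implementation) of the standard multiplicative NQP update, v_i <- v_i * (-b_i + sqrt(b_i^2 + 4*(A+ v)_i*(A- v)_i)) / (2*(A+ v)_i), applied to the lasso objective through the usual split w = u - v with u, v >= 0; the function names, iteration budget, and eps guard here are illustrative assumptions.

```python
import numpy as np

def nqp_multiplicative(A, b, num_iters=500, eps=1e-12):
    """Multiplicative updates for min_{v >= 0} 0.5*v'Av + b'v.

    Splits A = A_plus - A_minus (elementwise) and applies
    v_i <- v_i * (-b_i + sqrt(b_i^2 + 4*(A_plus v)_i*(A_minus v)_i))
                / (2*(A_plus v)_i),
    which preserves nonnegativity of v and decreases the objective.
    """
    A_plus = np.maximum(A, 0.0)
    A_minus = np.maximum(-A, 0.0)
    v = np.ones(A.shape[0])              # strictly positive start
    for _ in range(num_iters):
        p = A_plus @ v + eps             # eps guards the division below
        m = A_minus @ v
        v = v * (-b + np.sqrt(b * b + 4.0 * p * m)) / (2.0 * p)
    return v

def lasso_via_nqp(X, y, lam, num_iters=500):
    """L1-regularized least squares: min_w 0.5*||Xw - y||^2 + lam*||w||_1.

    Writing w = u - v with u, v >= 0 yields an NQP in z = [u; v]:
    A = [[Q, -Q], [-Q, Q]] with Q = X'X, b = [lam*1 - X'y; lam*1 + X'y].
    """
    Q = X.T @ X
    c = X.T @ y
    d = X.shape[1]
    A = np.block([[Q, -Q], [-Q, Q]])
    b = np.concatenate([lam - c, lam + c])
    z = nqp_multiplicative(A, b, num_iters)
    return z[:d] - z[d:]                 # recover w = u - v

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 20))
    w_true = np.zeros(20)
    w_true[:3] = [2.0, -1.5, 1.0]
    y = X @ w_true + 0.05 * rng.standard_normal(100)
    print(np.round(lasso_via_nqp(X, y, lam=1.0), 3))
```

As the abstract notes, the logistic case reduces to a sequence of reweighted least-squares problems of this same form, so a solver like the above would sit inside an outer IRLS loop with Q and c recomputed from the current weights at each outer iteration.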
Pages: 13+
Number of pages: 2
Related papers
50 records in total
  • [1] An Improved GLMNET for L1-regularized Logistic Regression
    Yuan, Guo-Xun
    Ho, Chia-Hua
    Lin, Chih-Jen
    JOURNAL OF MACHINE LEARNING RESEARCH, 2012, 13 : 1999 - 2030
  • [2] Distributed Coordinate Descent for L1-regularized Logistic Regression
    Trofimov, Ilya
    Genkin, Alexander
    ANALYSIS OF IMAGES, SOCIAL NETWORKS AND TEXTS, AIST 2015, 2015, 542 : 243 - 254
  • [3] Tuning parameter calibration for l1-regularized logistic regression
    Li, Wei
    Lederer, Johannes
    JOURNAL OF STATISTICAL PLANNING AND INFERENCE, 2019, 202 : 80 - 98
  • [4] A Safe Feature Elimination Rule for L1-Regularized Logistic Regression
    Pan, Xianli
    Xu, Yitian
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2022, 44 (09) : 4544 - 4554
  • [5] l1-regularized linear regression: persistence and oracle inequalities
    Bartlett, Peter L.
    Mendelson, Shahar
    Neeman, Joseph
    PROBABILITY THEORY AND RELATED FIELDS, 2012, 154 (1-2) : 193 - 224
  • [6] Introducing l1-regularized Logistic Regression in Markov Networks based EDAs
    Malago, Luigi
    Matteucci, Matteo
    Valentini, Gabriele
    2011 IEEE CONGRESS ON EVOLUTIONARY COMPUTATION (CEC), 2011, : 1581 - 1588
  • [7] L1-regularized Logistic Regression for Event-driven Stock Market Prediction
    Luo, Si-Shu
    Weng, Yang
    Wang, Wei-Wei
    Hong, Wen-Xing
    2017 12TH INTERNATIONAL CONFERENCE ON COMPUTER SCIENCE AND EDUCATION (ICCSE 2017), 2017, : 536 - 541
  • [8] A Fast Hybrid Algorithm for Large-Scale l1-Regularized Logistic Regression
    Shi, Jianing
    Yin, Wotao
    Osher, Stanley
    Sajda, Paul
    JOURNAL OF MACHINE LEARNING RESEARCH, 2010, 11 : 713 - 741
  • [9] HIGH-DIMENSIONAL ISING MODEL SELECTION USING l1-REGULARIZED LOGISTIC REGRESSION
    Ravikumar, Pradeep
    Wainwright, Martin J.
    Lafferty, John D.
    ANNALS OF STATISTICS, 2010, 38 (03): : 1287 - 1319
  • [10] An interior-point method for large-scale l1-regularized logistic regression
    Koh, Kwangmoo
    Kim, Seung-Jean
    Boyd, Stephen
    JOURNAL OF MACHINE LEARNING RESEARCH, 2007, 8 : 1519 - 1555