A Scalable Projective Scaling Algorithm for lp Loss With Convex Penalizations

Cited: 3
Authors
Zhou, Hongbo [1 ]
Cheng, Qiang [1 ]
Affiliations
[1] So Illinois Univ, Dept Comp Sci, Carbondale, IL 62901 USA
Funding
US National Science Foundation;
Keywords
Convex function; Karmarkar's projective scaling condition; l(p) loss function; message passing algorithm (MPA); minimization-majorization (MM); nonconvex; scalability; FACE RECOGNITION; SPARSE; REGRESSION; CONVERGENCE; SHRINKAGE; SELECTION;
DOI
10.1109/TNNLS.2014.2314129
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
This paper presents an accurate, efficient, and scalable algorithm for minimizing a special family of convex functions that have an l(p) loss function as an additive component. For this problem, well-known learning algorithms often have well-established results on accuracy and efficiency, but explicit linear scalability with respect to the problem size is rarely reported. The proposed approach starts by developing a second-order learning procedure with iterative descent for general convex penalization functions, and then builds efficient algorithms for a restricted family of functions that satisfy Karmarkar's projective scaling condition. Under this condition, a lightweight, scalable message passing algorithm (MPA) is further developed by constructing a series of simpler equivalent problems. The proposed MPA is intrinsically scalable because it involves only matrix-vector multiplications and avoids matrix inversion operations. The MPA is proven to be globally convergent for convex formulations; for nonconvex situations, it converges to a stationary point. The accuracy, efficiency, scalability, and applicability of the proposed method are verified through extensive experiments on sparse signal recovery, face image classification, and over-complete dictionary learning problems.
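The scalability argument in the abstract — a majorization-minimization (MM) outer loop whose inner solves use only matrix-vector products, never matrix inversion — can be sketched in a few lines. The following is a hypothetical illustration, not the paper's actual MPA: the function name `mm_lp_sketch`, the choice of an l1 penalty, and all parameter values are assumptions made for the example. It majorizes the l(p) loss by a reweighted quadratic and minimizes the majorizer with proximal-gradient steps.

```python
import numpy as np

def mm_lp_sketch(A, b, p=1.5, lam=0.01, n_outer=100, n_inner=25, eps=1e-8):
    """Sketch: minimize ||Ax - b||_p^p + lam * ||x||_1 by MM.

    Each outer step majorizes the l_p loss at the current iterate by a
    weighted quadratic; each inner step is a proximal-gradient update
    built entirely from matrix-vector products (no matrix inversion).
    """
    m, n = A.shape
    x = np.zeros(n)
    L0 = np.linalg.norm(A, 2) ** 2              # spectral-norm bound on A^T A
    for _ in range(n_outer):
        r = A @ x - b
        # MM weights so that w_i * r_i^2 + const majorizes |r_i|^p (1 < p <= 2)
        w = (p / 2.0) * (np.abs(r) + eps) ** (p - 2)
        step = 1.0 / (2.0 * np.max(w) * L0)     # safe step for the majorizer
        for _ in range(n_inner):
            g = 2.0 * (A.T @ (w * (A @ x - b)))  # gradient of weighted LS term
            z = x - step * g
            # soft-thresholding = proximal operator of lam * ||.||_1
            x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)
    return x
```

Because every inner update costs one multiplication by A and one by A^T, the per-iteration work is linear in the number of nonzeros of A, which is the kind of explicit scalability the abstract claims for the MPA.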
Pages: 265-276
Page count: 12