In the literature, the idea of kernel machines has been introduced to quantile regression, resulting in the kernel quantile regression (KQR) model, which can flexibly fit nonlinear relationships. However, the standard formulation of KQR leads to a quadratic program that is computationally expensive to solve. This paper proposes a fast training algorithm for KQR based on the majorization-minimization (MM) approach, in which an upper bound of the objective function that is easier to minimize is constructed at each iteration. The proposed approach is easy to implement and requires no special computing packages beyond basic linear algebra operations. Numerical studies on simulated and real-world datasets show that, compared with the original quadratic-programming-based KQR, the proposed approach achieves essentially the same prediction accuracy with substantially higher training efficiency.
Authors: Zhang, Weiping; Lin, Hongmei; Lian, Heng
Affiliations: USTC, Dept Stat, Hefei, Peoples R China; Shanghai Univ Int Business & Econ, Sch Stat & Informat, Shanghai, Peoples R China; City Univ Hong Kong, Dept Math, Hong Kong, Peoples R China; Shenzhen Univ, Coll Math & Stat, Inst Stat Sci, Shenzhen, Peoples R China
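To make the idea in the abstract concrete, the sketch below fits KQR by a majorization-minimization loop using the classical Hunter-Lange style quadratic upper bound on the (perturbed) check loss, so that each iteration reduces to solving a weighted ridge-type linear system with basic linear algebra. This is an illustrative assumption about the form of the majorizer, not necessarily the exact bound derived in the paper; the function name kqr_mm, the RBF kernel choice, and all tuning parameters are hypothetical.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    """Gaussian (RBF) kernel matrix between rows of X and Z."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kqr_mm(X, y, tau=0.5, lam=1.0, gamma=1.0, eps=1e-4,
           max_iter=200, tol=1e-8):
    """Kernel quantile regression fitted by majorization-minimization.

    Each iteration replaces the check loss rho_tau(r) with the quadratic
    upper bound (Hunter-Lange style, illustrative assumption)
        rho_tau(r) <= r^2 / (4*(eps + |r_t|)) + (tau - 1/2) * r + const,
    so the surrogate objective is minimized by one linear solve.
    """
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    alpha = np.zeros(n)
    b = np.quantile(y, tau)          # crude initial intercept

    for _ in range(max_iter):
        r = y - K @ alpha - b                     # current residuals
        w = 1.0 / (2.0 * (eps + np.abs(r)))       # majorizer weights

        # Stationarity conditions of the quadratic surrogate in (alpha, b):
        #   (W K + lam I) alpha + W 1 b   = W y + (tau - 1/2) 1
        #   1'W K alpha    + 1'W 1 b      = 1'W y + n (tau - 1/2)
        A = np.zeros((n + 1, n + 1))
        A[:n, :n] = w[:, None] * K + lam * np.eye(n)
        A[:n, n] = w
        A[n, :n] = w @ K
        A[n, n] = w.sum()
        rhs = np.concatenate([w * y + (tau - 0.5),
                              [w @ y + n * (tau - 0.5)]])

        sol = np.linalg.solve(A, rhs)
        alpha_new, b_new = sol[:n], sol[n]
        converged = np.max(np.abs(alpha_new - alpha)) + abs(b_new - b) < tol
        alpha, b = alpha_new, b_new
        if converged:
            break

    return alpha, b, K

# Usage: fit the conditional 0.9-quantile of a noisy sine curve.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + rng.standard_normal(200) * 0.3
alpha, b, K = kqr_mm(X, y, tau=0.9, lam=0.5, gamma=0.5)
fitted = K @ alpha + b
print("share of points below fitted 0.9-quantile:", np.mean(y <= fitted))
```

Each pass of the loop only builds and solves an (n+1)-by-(n+1) linear system, which matches the abstract's claim that the method needs nothing beyond basic linear algebra routines, in contrast to a general-purpose quadratic-programming solver.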