An Improved Quantum Algorithm for Ridge Regression

Cited by: 71
Authors
Yu, Chao-Hua [1, 2, 3, 4]
Gao, Fei [1, 5]
Wen, Qiao-Yan [1]
Affiliations
[1] Beijing Univ Posts & Telecommun, State Key Lab Networking & Switching Technol, Beijing 100876, Peoples R China
[2] State Key Lab Cryptol, POB 5159, Beijing 100878, Peoples R China
[3] Jiangxi Univ Finance & Econ, Sch Informat Technol, Nanchang 330032, Jiangxi, Peoples R China
[4] Univ Western Australia, Sch Phys, Nedlands, WA 6009, Australia
[5] Peng Cheng Lab, Ctr Quantum Comp, Shenzhen 518055, Peoples R China
Keywords
Quantum algorithm; ridge regression; regularization hyperparameter; parallel Hamiltonian simulation; quantum K-fold cross validation;
DOI
10.1109/TKDE.2019.2937491
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Ridge regression (RR) is an important machine learning technique which introduces a regularization hyperparameter α into ordinary multiple linear regression for analyzing data suffering from multicollinearity. In this paper, we present a quantum algorithm for RR, in which the technique of parallel Hamiltonian simulation, for simulating a number of Hermitian matrices in parallel, is proposed and used to develop a quantum version of the K-fold cross-validation approach, which can efficiently estimate the predictive performance of RR. Our algorithm consists of two phases: (1) using quantum K-fold cross-validation to efficiently determine a good α with which RR achieves good predictive performance, and then (2) generating a quantum state encoding the optimal fitting parameters of RR with this α, which can be further utilized to predict new data. Since indefinite dense Hamiltonian simulation is adopted as a key subroutine, our algorithm can efficiently handle non-sparse data matrices. We show that our algorithm achieves an exponential speedup over its classical counterpart for (low-rank) data matrices with low condition numbers, but only a polynomial speedup when the condition numbers of the data matrices are large, i.e., when the data matrices are of full or approximately full rank.
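For reference, the following is a minimal classical sketch (not the quantum algorithm itself) of the two-phase procedure described in the abstract: K-fold cross-validation over candidate values of α to select a good regularization strength, followed by a refit with the selected α. The function names, the candidate grid of α values, and the toy multicollinear data are illustrative assumptions, not taken from the paper.

# Classical analogue of the two-phase procedure the paper quantizes.
# All names and data below are illustrative assumptions, not from the paper.
import numpy as np

def ridge_fit(X, y, alpha):
    """Closed-form ridge solution w = (X^T X + alpha I)^(-1) X^T y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(d), X.T @ y)

def kfold_cv_error(X, y, alpha, k=5, seed=0):
    """Mean squared prediction error of ridge regression under K-fold CV."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(y))
    folds = np.array_split(idx, k)
    errors = []
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        w = ridge_fit(X[train], y[train], alpha)
        errors.append(np.mean((X[test] @ w - y[test]) ** 2))
    return float(np.mean(errors))

if __name__ == "__main__":
    # Toy multicollinear data: the second feature nearly duplicates the first.
    rng = np.random.default_rng(1)
    x1 = rng.normal(size=200)
    X = np.column_stack([x1, x1 + 1e-3 * rng.normal(size=200), rng.normal(size=200)])
    y = X @ np.array([1.0, 1.0, -2.0]) + 0.1 * rng.normal(size=200)

    # Phase 1 (classical analogue): pick the alpha with the lowest CV error.
    alphas = [1e-4, 1e-3, 1e-2, 1e-1, 1.0]
    best_alpha = min(alphas, key=lambda a: kfold_cv_error(X, y, a))
    # Phase 2: refit on all data with the chosen alpha.
    w_opt = ridge_fit(X, y, best_alpha)
    print("best alpha:", best_alpha, "fitted parameters:", w_opt)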
Pages: 858-866
Page count: 9