Sparse quantum Gaussian processes to counter the curse of dimensionality

Cited by: 0
Authors
Gaweł I. Kuś
Sybrand van der Zwaag
Miguel A. Bessa
Affiliations
[1] Delft University of Technology, Novel Aerospace Materials, Faculty of Aerospace Engineering
[2] Delft University of Technology, Materials Science and Engineering
Keywords
Gaussian processes; Low-rank approximation; Design of materials; Data-driven design;
DOI
Not available
Abstract
Gaussian processes are well-established Bayesian machine learning algorithms with significant merits, despite a strong limitation: lack of scalability. Clever solutions address this issue by inducing sparsity through low-rank approximations, often based on the Nyström method. Here, we propose a different method that achieves better scalability and higher accuracy using quantum computing, significantly outperforming classical Bayesian neural networks on large datasets. Unlike other approaches to quantum machine learning, the computationally expensive linear algebra operations are not simply replaced with their quantum counterparts. Instead, we start from a recent study that proposed a quantum circuit for implementing quantum Gaussian processes, and we then use quantum phase estimation to induce a low-rank approximation analogous to that in classical sparse Gaussian processes. We provide evidence through numerical tests, mathematical error-bound estimation, and complexity analysis that the method can address the “curse of dimensionality,” such that each additional input parameter no longer leads to exponential growth of the computational cost. This is also demonstrated by applying the algorithm in a practical setting: the data-driven design of a recently proposed metamaterial. The algorithm, however, requires significant quantum computing hardware improvements before quantum advantage can be achieved.
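The classical baseline the abstract refers to — a sparse Gaussian process built from a Nyström-style low-rank approximation of the kernel matrix — can be sketched as follows. This is a minimal, illustrative implementation (subset-of-regressors predictive mean with randomly chosen inducing points); all names (`rbf_kernel`, `Z`, the toy data) are assumptions for the example, not code from the paper, and the quantum-phase-estimation variant the paper proposes is not reproduced here.

```python
import numpy as np

def rbf_kernel(X, Y, lengthscale=1.0):
    # Squared-exponential (RBF) kernel matrix between two point sets.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))             # n = 200 training inputs
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)

# m inducing points drawn from the data (m << n).
m = 15
Z = X[rng.choice(len(X), m, replace=False)]

Kmm = rbf_kernel(Z, Z)     # m x m
Knm = rbf_kernel(X, Z)     # n x m
noise = 0.1**2

# Nystrom approximation K ~= Knm Kmm^{-1} Kmn reduces the cost of the
# posterior mean from O(n^3) to O(n m^2): solve an m x m system instead
# of inverting the full n x n kernel matrix.
A = noise * Kmm + Knm.T @ Knm

Xs = np.linspace(-3, 3, 100)[:, None]             # test inputs
Ksm = rbf_kernel(Xs, Z)
mean = Ksm @ np.linalg.solve(A, Knm.T @ y)        # SoR predictive mean
```

Each additional inducing point refines the approximation at quadratic rather than cubic cost, which is the scalability lever the quantum low-rank construction in the paper is designed to improve upon.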
Related papers (50 total)
  • [41] The curse of dimensionality in data quality
    Jayawardene, Vimukthi
    Sadiq, Shazia
    Indulska, Marta
    Proceedings of the 24th Australasian Conference on Information Systems, 2013
  • [42] Can a Hebbian-like learning rule be avoiding the curse of dimensionality in sparse distributed data?
    Osorio, Maria
    Sa-Couto, Luis
    Wichert, Andreas
    BIOLOGICAL CYBERNETICS, 2024, 118 (5-6) : 267 - 276
  • [43] Digital medicine and the curse of dimensionality
    Visar Berisha
    Chelsea Krantsevich
    P. Richard Hahn
    Shira Hahn
    Gautam Dasarathy
    Pavan Turaga
    Julie Liss
    npj Digital Medicine, 4
  • [44] Mitigating the curse of dimensionality: sparse grid characteristics method for optimal feedback control and HJB equations
    Kang, Wei
    Wilcox, Lucas C.
    COMPUTATIONAL OPTIMIZATION AND APPLICATIONS, 2017, 68 (02) : 289 - 315
  • [45] QUANTUM STOCHASTIC CALCULUS AND QUANTUM GAUSSIAN PROCESSES
    Parthasarathy, K. R.
    INDIAN JOURNAL OF PURE & APPLIED MATHEMATICS, 2015, 46 (06): : 781 - 807
  • [47] Adaptive Dimensionality Reduction for Fast Sequential Optimization With Gaussian Processes
    Ghoreishi, Seyede Fatemeh
    Friedman, Samuel
    Allaire, Douglas L.
    JOURNAL OF MECHANICAL DESIGN, 2019, 141 (07)
  • [48] SINGLE-TASK AND MULTITASK SPARSE GAUSSIAN PROCESSES
    Zhu, Jiang
    Sun, Shiliang
    PROCEEDINGS OF 2013 INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND CYBERNETICS (ICMLC), VOLS 1-4, 2013, : 1033 - 1038
  • [49] Validation Based Sparse Gaussian Processes for Ordinal Regression
    Srijith, P. K.
    Shevade, Shirish
    Sundararajan, S.
    NEURAL INFORMATION PROCESSING, ICONIP 2012, PT II, 2012, 7664 : 409 - 416
  • [50] Differentially Private Regression and Classification with Sparse Gaussian Processes
    Smith, Michael Thomas
    Alvarez, Mauricio A.
    Lawrence, Neil D.
    JOURNAL OF MACHINE LEARNING RESEARCH, 2021, 22