Sparse quantum Gaussian processes to counter the curse of dimensionality

Cited by: 0
Authors
Gaweł I. Kuś
Sybrand van der Zwaag
Miguel A. Bessa
Institutions
[1] Delft University of Technology, Novel Aerospace Materials, Faculty of Aerospace Engineering
[2] Delft University of Technology, Materials Science and Engineering
Keywords
Gaussian processes; Low-rank approximation; Design of materials; Data-driven design
Abstract
Gaussian processes are well-established Bayesian machine learning algorithms with significant merits, despite a strong limitation: lack of scalability. Clever solutions address this issue by inducing sparsity through low-rank approximations, often based on the Nyström method. Here, we propose a different method that achieves better scalability and higher accuracy using quantum computing, significantly outperforming classical Bayesian neural networks for large datasets. Unlike other approaches to quantum machine learning, the computationally expensive linear algebra operations are not simply replaced with their quantum counterparts. Instead, we start from a recent study that proposed a quantum circuit for implementing quantum Gaussian processes, and then use quantum phase estimation to induce a low-rank approximation analogous to that in classical sparse Gaussian processes. We provide evidence through numerical tests, mathematical error-bound estimation, and complexity analysis that the method can address the “curse of dimensionality,” where each additional input parameter no longer leads to an exponential growth of the computational cost. We also demonstrate this in a practical setting by applying the algorithm to the data-driven design of a recently proposed metamaterial. The algorithm, however, requires significant quantum computing hardware improvements before quantum advantage can be achieved.
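For context, the classical low-rank sparse GP approximation the abstract refers to can be sketched in a few lines. Below is a minimal subset-of-regressors example with an RBF kernel and m inducing points, which reduces the cost from O(n³) to O(nm²); the kernel, data, and variable names here are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def rbf(a, b, ls=1.0):
    # Squared-exponential kernel matrix between point sets a and b.
    d2 = np.sum(a**2, 1)[:, None] + np.sum(b**2, 1)[None, :] - 2 * a @ b.T
    return np.exp(-0.5 * d2 / ls**2)

rng = np.random.default_rng(0)
n, m = 500, 30                       # n training points, m inducing points
X = rng.uniform(-3, 3, (n, 1))
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(n)
Z = np.linspace(-3, 3, m)[:, None]   # inducing inputs (fixed on a grid here)

noise = 0.1**2                       # assumed observation-noise variance
Kmm = rbf(Z, Z) + 1e-8 * np.eye(m)   # jitter for numerical stability
Knm = rbf(X, Z)

# Subset-of-regressors predictive mean:
#   mu(x*) = K*m (sigma^2 Kmm + Kmn Knm)^{-1} Kmn y
# Only an m x m system is solved, instead of the full n x n one.
A = noise * Kmm + Knm.T @ Knm
Xs = np.linspace(-3, 3, 200)[:, None]
Ksm = rbf(Xs, Z)
mean = Ksm @ np.linalg.solve(A, Knm.T @ y)

rmse = np.sqrt(np.mean((mean - np.sin(Xs).ravel()) ** 2))
```

The same low-rank structure (approximating the full kernel matrix through a small set of inducing variables) is what the proposed quantum algorithm induces via quantum phase estimation, rather than via the Nyström construction shown here.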