AUTOMATIC DIFFERENTIATION FOR RIEMANNIAN OPTIMIZATION ON LOW-RANK MATRIX AND TENSOR-TRAIN MANIFOLDS

Times Cited: 0
Authors
Novikov, Alexander [1,2]
Rakhuba, Maxim [2]
Oseledets, Ivan [3,4]
Affiliations
[1] Russian Acad Sci, Marchuk Inst Numer Math, Moscow 119333, Russia
[2] HSE Univ, Pokrovsky Blvd 11, Moscow 109028, Russia
[3] Skolkovo Inst Sci & Technol, Skolkovo Innovat Ctr, Moscow 121205, Russia
[4] Russian Acad Sci, Marchuk Inst Numer Math, Moscow 119333, Russia
Source
SIAM JOURNAL ON SCIENTIFIC COMPUTING | 2022, Vol. 44, Issue 02
Keywords
automatic differentiation; Riemannian optimization; low-rank approximation; tensor-train decomposition;
DOI
10.1137/20M1356774
CLC Number
O29 [Applied Mathematics]
Subject Classification Code
070104
Abstract
In scientific computing and machine learning applications, matrices and more general multidimensional arrays (tensors) can often be approximated with the help of low-rank decompositions. Since matrices and tensors of fixed rank form smooth Riemannian manifolds, one of the popular tools for finding low-rank approximations is to use Riemannian optimization. Nevertheless, efficient implementation of Riemannian gradients and Hessians, required in Riemannian optimization algorithms, can be a nontrivial task in practice. Moreover, in some cases, analytic formulas are not even available. In this paper, we build upon automatic differentiation and propose a method that, given an implementation of the function to be minimized, efficiently computes Riemannian gradients and products of an approximate Riemannian Hessian with a given vector.
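To illustrate the idea the abstract describes, the sketch below combines automatic differentiation with a tangent-space projection on the manifold of fixed-rank matrices: the Euclidean gradient is obtained by reverse-mode AD and projected onto the tangent space at X = U diag(S) V^T, and an approximate Riemannian Hessian-by-vector product is formed by projecting a forward-over-reverse directional derivative of the gradient (manifold curvature terms are dropped). This is a minimal sketch of the general approach, not the authors' algorithm, which avoids forming dense gradients and extends to tensor-train manifolds; all function and variable names here are assumptions made for this example, and JAX is used only as a convenient AD backend.

```python
import jax
import jax.numpy as jnp


def tangent_projection(U, V, Z):
    """Project Z onto the tangent space of rank-r matrices at X = U S V^T:
    P_X(Z) = U U^T Z + Z V V^T - U U^T Z V V^T, with U, V column-orthonormal."""
    UtZ = U.T @ Z
    ZV = Z @ V
    return U @ UtZ + ZV @ V.T - U @ (UtZ @ V) @ V.T


def riemannian_gradient(f, U, S, V):
    """Riemannian gradient of f at X = U diag(S) V^T: AD gradient + projection."""
    X = (U * S) @ V.T                       # dense matrix, for illustration only
    euclid_grad = jax.grad(f)(X)            # Euclidean gradient via reverse-mode AD
    return tangent_projection(U, V, euclid_grad)


def approx_hessian_vector_product(f, U, S, V, xi):
    """Approximate Riemannian Hessian applied to a tangent vector xi:
    project the directional derivative of the Euclidean gradient
    (curvature terms of the manifold are neglected in this sketch)."""
    X = (U * S) @ V.T
    _, hvp = jax.jvp(jax.grad(f), (X,), (xi,))   # forward-over-reverse HVP
    return tangent_projection(U, V, hvp)


if __name__ == "__main__":
    # Toy usage: rank-r approximation of A, i.e. f(X) = 0.5 * ||X - A||_F^2.
    key_a, key_x = jax.random.split(jax.random.PRNGKey(0))
    A = jax.random.normal(key_a, (50, 40))
    r = 5
    U, S, Vt = jnp.linalg.svd(jax.random.normal(key_x, (50, 40)), full_matrices=False)
    U, S, V = U[:, :r], S[:r], Vt[:r, :].T

    f = lambda X: 0.5 * jnp.sum((X - A) ** 2)
    g = riemannian_gradient(f, U, S, V)               # tangent vector, shape (50, 40)
    h = approx_hessian_vector_product(f, U, S, V, g)
    print(g.shape, h.shape)
```

Note that this sketch materializes the full matrix X and its dense Euclidean gradient, which is exactly the inefficiency the paper's factor-based formulation is designed to avoid.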
Pages: A843-A869
Page count: 27
Related Papers
50 records in total
  • [1] Tensor Completion using Low-Rank Tensor Train Decomposition by Riemannian Optimization
    Wang, Junli
    Zhao, Guangshe
    Wang, Dingheng
    Li, Guoqi
    2019 CHINESE AUTOMATION CONGRESS (CAC2019), 2019: 3380-3384
  • [2] Low-rank tensor completion by Riemannian optimization
    Kressner, Daniel
    Steinlechner, Michael
    Vandereycken, Bart
    BIT NUMERICAL MATHEMATICS, 2014, 54 (02): 447-468
  • [3] Nonnegative Tensor-Train Low-Rank Approximations of the Smoluchowski Coagulation Equation
    Manzini, Gianmarco
    Skau, Erik
    Truong, Duc P.
    Vangara, Raviteja
    LARGE-SCALE SCIENTIFIC COMPUTING (LSSC 2021), 2022, 13127: 342-350
  • [4] Auto-weighted robust low-rank tensor completion via tensor-train
    Chen, Chuan
    Wu, Zhe-Bin
    Chen, Zi-Tai
    Zheng, Zi-Bin
    Zhang, Xiong-Jun
    INFORMATION SCIENCES, 2021, 567: 100-115
  • [5] Provable Tensor-Train Format Tensor Completion by Riemannian Optimization
    Cai, Jian-Feng
    Li, Jingyang
    Xia, Dong
    JOURNAL OF MACHINE LEARNING RESEARCH, 2022, 23: 1-77
  • [6] LOW-RANK MATRIX COMPLETION BY RIEMANNIAN OPTIMIZATION
    Vandereycken, Bart
    SIAM JOURNAL ON OPTIMIZATION, 2013, 23 (02): 1214-1236
  • [7] Graph Regularized Low-Rank Tensor-Train for Robust Principal Component Analysis
    Sofuoglu, Seyyid Emre
    Aviyente, Selin
    IEEE SIGNAL PROCESSING LETTERS, 2022, 29: 1152-1156
  • [8] Exploiting Low-Rank Tensor-Train Deep Neural Networks Based on Riemannian Gradient Descent With Illustrations of Speech Processing
    Qi, Jun
    Yang, Chao-Han Huck
    Chen, Pin-Yu
    Tejedor, Javier
    IEEE-ACM TRANSACTIONS ON AUDIO SPEECH AND LANGUAGE PROCESSING, 2023, 31: 633-642
  • [9] ROBUST LOW-RANK MATRIX COMPLETION BY RIEMANNIAN OPTIMIZATION
    Cambier, Leopold
    Absil, P-A.
    SIAM JOURNAL ON SCIENTIFIC COMPUTING, 2016, 38 (05): S440-S460