Guaranteed Matrix Completion via Nonconvex Factorization

Cited by: 53
Authors
Sun, Ruoyu [1]
Luo, Zhi-Quan [1,2]
Affiliations
[1] Univ Minnesota, Minneapolis, MN 55455 USA
[2] Chinese Univ Hong Kong, Shenzhen, Peoples R China
Keywords
matrix completion; matrix factorization; nonconvex optimization; perturbation analysis; GRADIENT METHODS; CONVERGENCE; MINIMIZATION; ALGORITHM; POWER
DOI
10.1109/FOCS.2015.25
CLC classification
TP301 [Theory and Methods]
Discipline code
081202
Abstract
Matrix factorization is a popular approach for large-scale matrix completion. In this approach, the unknown low-rank matrix is expressed as the product of two much smaller matrices, so the low-rank property is enforced automatically. The resulting optimization problem, even at very large scale, can be solved (to stationary points) very efficiently by standard optimization algorithms such as alternating minimization and stochastic gradient descent (SGD). However, due to the non-convexity introduced by the factorization model, there is only limited theoretical understanding of whether these algorithms produce a good solution. In this paper, we establish a theoretical guarantee that the factorization-based formulation correctly recovers the underlying low-rank matrix. In particular, we show that under conditions similar to those in previous works, many standard optimization algorithms converge to a global optimum of the factorization-based formulation and recover the true low-rank matrix. A major difference between our work and existing results is that we do not need resampling (i.e., using independent samples at each iteration) in either the algorithm or its analysis. To the best of our knowledge, our result is the first that provides an exact recovery guarantee for many standard algorithms such as gradient descent, SGD, and block coordinate gradient descent.
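As a rough illustration of the factorization-based formulation described in the abstract, the sketch below (illustrative only, not the authors' algorithm or experimental setup; the sampling rate, rank, step size, and iteration count are assumed values) recovers a synthetic low-rank matrix from a random subset of its entries by running plain gradient descent on the two factors.

```python
import numpy as np

# Illustrative sketch only -- not the paper's algorithm or code.
# Recover a low-rank matrix M from a subset Omega of observed entries by
# factorizing M ~= X @ Y.T and minimizing the squared error on Omega with
# plain gradient descent on both factors.

rng = np.random.default_rng(0)
n, m, r = 100, 80, 5                                  # sizes and rank (assumed values)
M = rng.standard_normal((n, r)) @ rng.standard_normal((r, m))  # ground-truth low-rank matrix
mask = rng.random((n, m)) < 0.3                       # Omega: ~30% of entries observed

X = 0.1 * rng.standard_normal((n, r))                 # small random initialization
Y = 0.1 * rng.standard_normal((m, r))
step = 0.005                                          # constant step size (assumed)

for _ in range(3000):
    R = mask * (X @ Y.T - M)                          # residual restricted to observed entries
    X, Y = X - step * (R @ Y), Y - step * (R.T @ X)   # simultaneous gradient step on both factors

rel_err = np.linalg.norm(X @ Y.T - M) / np.linalg.norm(M)
print(f"relative recovery error: {rel_err:.2e}")      # small if recovery succeeds
```

Alternating minimization or SGD over the observed entries could replace the gradient step above; the paper's guarantees concern such standard algorithms applied to this formulation without resampling.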
Pages: 270-289
Number of pages: 20
Related papers
50 records in total
  • [21] Imbalanced low-rank tensor completion via latent matrix factorization
    Qiu, Yuning
    Zhou, Guoxu
    Zeng, Junhua
    Zhao, Qibin
    Xie, Shengli
    NEURAL NETWORKS, 2022, 155 : 369 - 382
  • [22] A Nonconvex Method to Low-Rank Matrix Completion
    He, Haizhen
    Cui, Angang
    Yang, Hong
    Wen, Meng
    IEEE ACCESS, 2022, 10 : 55226 - 55234
  • [23] Multichannel Hankel Matrix Completion Through Nonconvex Optimization
    Zhang, Shuai
    Hao, Yingshuai
    Wang, Meng
    Chow, Joe H.
    IEEE JOURNAL OF SELECTED TOPICS IN SIGNAL PROCESSING, 2018, 12 (04) : 617 - 632
  • [24] Tensor Factorization via Matrix Factorization
    Kuleshov, Volodymyr
    Chaganty, Arun Tejasvi
    Liang, Percy
    ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 38, 2015, 38 : 507 - 516
  • [25] Image Completion with Smooth Nonnegative Matrix Factorization
    Sadowski, Tomasz
    Zdunek, Rafal
    ARTIFICIAL INTELLIGENCE AND SOFT COMPUTING (ICAISC 2018), PT II, 2018, 10842 : 62 - 72
  • [26] Knowledge Base Completion Using Matrix Factorization
    He, Wenqiang
    Feng, Yansong
    Zou, Lei
    Zhao, Dongyan
    WEB TECHNOLOGIES AND APPLICATIONS (APWEB 2015), 2015, 9313 : 256 - 267
  • [27] An Efficient Matrix Factorization Method for Tensor Completion
    Liu, Yuanyuan
    Shang, Fanhua
    IEEE SIGNAL PROCESSING LETTERS, 2013, 20 (04) : 307 - 310
  • [28] Nonconvex Rectangular Matrix Completion via Gradient Descent Without l2,∞ Regularization
    Chen, Ji
    Liu, Dekai
    Li, Xiaodong
    IEEE TRANSACTIONS ON INFORMATION THEORY, 2020, 66 (09) : 5806 - 5841
  • [29] Elastic adversarial deep nonnegative matrix factorization for matrix completion
    Seyedi, Seyed Amjad
    Tab, Fardin Akhlaghian
    Lotfi, Abdulrahman
    Salahian, Navid
    Chavoshinejad, Jovan
    INFORMATION SCIENCES, 2023, 621 : 562 - 579
  • [30] Matrix Completion via Sparse Factorization Solved by Accelerated Proximal Alternating Linearized Minimization
    Fan, Jicong
    Zhao, Mingbo
    Chow, Tommy W. S.
    IEEE TRANSACTIONS ON BIG DATA, 2020, 6 (01) : 119 - 130