Guaranteed Matrix Completion via Nonconvex Factorization

Cited by: 53
Authors
Sun, Ruoyu [1 ]
Luo, Zhi-Quan [1 ,2 ]
Affiliations
[1] Univ Minnesota, Minneapolis, MN 55455 USA
[2] Chinese Univ Hong Kong, Shenzhen, Peoples R China
Keywords
matrix completion; matrix factorization; nonconvex optimization; perturbation analysis; GRADIENT METHODS; CONVERGENCE; MINIMIZATION; ALGORITHM; POWER;
DOI
10.1109/FOCS.2015.25
Chinese Library Classification (CLC)
TP301 [Theory, Methods];
Discipline Code
081202
Abstract
Matrix factorization is a popular approach for large-scale matrix completion. In this approach, the unknown low-rank matrix is expressed as the product of two much smaller matrices, so that the low-rank property is automatically fulfilled. The resulting optimization problem, even at very large scale, can be solved (to stationary points) very efficiently through standard optimization algorithms such as alternating minimization and stochastic gradient descent (SGD). However, due to the non-convexity caused by the factorization model, there is limited theoretical understanding of whether these algorithms will generate a good solution. In this paper, we establish a theoretical guarantee that the factorization-based formulation correctly recovers the underlying low-rank matrix. In particular, we show that under conditions similar to those in previous works, many standard optimization algorithms converge to the global optimum of the factorization-based formulation and recover the true low-rank matrix. A major difference between our work and existing results is that we do not need resampling (i.e., using independent samples at each iteration) in either the algorithm or its analysis. To the best of our knowledge, our result is the first to provide an exact recovery guarantee for many standard algorithms such as gradient descent, SGD, and block coordinate gradient descent.
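The factorization-based formulation the abstract describes can be sketched as follows: recover a rank-r matrix M from a subset of observed entries Omega by minimizing f(X, Y) = ½‖P_Ω(XYᵀ − M)‖_F² over the two small factors, using plain gradient descent (one of the standard algorithms analyzed in the paper). This is a minimal illustrative sketch, not the paper's exact algorithm or parameter settings; the dimensions, step size, and initialization below are assumptions chosen for a small synthetic example.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, r = 30, 20, 2

# Ground-truth rank-r matrix and a random observation mask P_Omega.
M = rng.standard_normal((n, r)) @ rng.standard_normal((r, m))
mask = rng.random((n, m)) < 0.5          # True where the entry is observed

# Small random initialization of the two factors (illustrative choice).
X = 0.1 * rng.standard_normal((n, r))
Y = 0.1 * rng.standard_normal((m, r))

step = 0.01
for _ in range(2000):
    R = mask * (X @ Y.T - M)             # residual on observed entries only
    # Gradients of f: df/dX = R @ Y, df/dY = R.T @ X (both factors updated
    # simultaneously from the current iterate).
    X, Y = X - step * (R @ Y), Y - step * (R.T @ X)

# Relative error on the observed entries after optimization.
rel_err = np.linalg.norm(mask * (X @ Y.T - M)) / np.linalg.norm(mask * M)
```

With a fully convex relaxation this problem would be solved via nuclear-norm minimization; the point of the factorization model is that the iterates above only ever touch the two n×r and m×r factors, which is what makes the approach scale.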
Pages: 270-289
Number of pages: 20
Related Papers
50 records
  • [1] Guaranteed Matrix Completion via Non-Convex Factorization
    Sun, Ruoyu
    Luo, Zhi-Quan
    IEEE TRANSACTIONS ON INFORMATION THEORY, 2016, 62 (11) : 6535 - 6579
  • [2] Exact matrix completion via smooth matrix factorization
    Luo, Xiaohu
    Zhang, Zili
    Wang, Wendong
    Wang, Jianjun
    JOURNAL OF ELECTRONIC IMAGING, 2024, 33 (05)
  • [3] Tensor completion via nonconvex tensor ring rank minimization with guaranteed convergence
    Ding, Meng
    Huang, Ting-Zhu
    Zhao, Xi-Le
    Ma, Tian-Hui
    SIGNAL PROCESSING, 2022, 194
  • [5] Matrix Completion in the Unit Hypercube via Structured Matrix Factorization
    Bugliarello, Emanuele
    Jain, Swayambhoo
    Rakesh, Vineeth
    PROCEEDINGS OF THE TWENTY-EIGHTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2019, : 2038 - 2044
  • [6] Differentially Private Matrix Completion via Distributed Matrix Factorization
    Zhou, Haotian
    Liu, Xiao-Yang
    Fu, Cai
    Shang, Chen
    Chang, Xinyi
    2018 17TH IEEE INTERNATIONAL CONFERENCE ON TRUST, SECURITY AND PRIVACY IN COMPUTING AND COMMUNICATIONS (IEEE TRUSTCOM) / 12TH IEEE INTERNATIONAL CONFERENCE ON BIG DATA SCIENCE AND ENGINEERING (IEEE BIGDATASE), 2018, : 1628 - 1631
  • [7] Binary Matrix Factorization and Completion via Integer Programming
    Gunluk, Oktay
    Hauser, Raphael Andreas
    Kovacs, Reka Agnes
    MATHEMATICS OF OPERATIONS RESEARCH, 2024, 49 (02) : 1278 - 1302
  • [8] Matrix completion by deep matrix factorization
    Fan, Jicong
    Cheng, Jieyu
    NEURAL NETWORKS, 2018, 98 : 34 - 41
  • [9] Boolean Matrix Factorization and Noisy Completion via Message Passing
    Ravanbakhsh, Siamak
    Poczos, Barnabas
    Greiner, Russell
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 48, 2016, 48
  • [10] Binary Matrix Completion With Nonconvex Regularizers
    Liu, Chunsheng
    Shan, Hong
    IEEE ACCESS, 2019, 7 : 65415 - 65426