The Power of Convex Relaxation: Near-Optimal Matrix Completion

Times Cited: 1294
Authors
Candes, Emmanuel J. [1]
Tao, Terence [2]
Affiliations
[1] CALTECH, Dept Appl & Computat Math, Pasadena, CA 91125 USA
[2] Univ Calif Los Angeles, Dept Math, Los Angeles, CA 90095 USA
Keywords
Duality in optimization; free probability; low-rank matrices; matrix completion; nuclear norm minimization; random matrices and techniques from random matrix theory; semidefinite programming; INEQUALITIES; INFORMATION
DOI
10.1109/TIT.2010.2044061
Chinese Library Classification
TP [Automation Technology; Computer Technology]
Discipline Classification Code
0812
Abstract
This paper is concerned with the problem of recovering an unknown matrix from a small fraction of its entries. This is known as the matrix completion problem, and comes up in a great number of applications, including the famous Netflix Prize and other similar questions in collaborative filtering. In general, accurate recovery of a matrix from a small number of entries is impossible, but the knowledge that the unknown matrix has low rank radically changes this premise, making the search for solutions meaningful. This paper presents optimality results quantifying the minimum number of entries needed to recover a matrix of rank r exactly by any method whatsoever (the information theoretic limit). More importantly, the paper shows that, under certain incoherence assumptions on the singular vectors of the matrix, recovery is possible by solving a convenient convex program as soon as the number of entries is on the order of the information theoretic limit (up to logarithmic factors). This convex program simply finds, among all matrices consistent with the observed entries, the one with minimum nuclear norm. As an example, we show that on the order of nr log(n) samples are needed to recover a random n x n matrix of rank r by any method, and to be sure, nuclear norm minimization succeeds as soon as the number of entries is of the form nr polylog(n).
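The convex program described in the abstract is: minimize ||X||_* subject to X_ij = M_ij for all observed entries (i, j), where ||.||_* denotes the nuclear norm (the sum of the singular values). As a rough, non-authoritative illustration (not the authors' code), the numpy sketch below implements the singular value thresholding (SVT) iteration of Cai, Candès, and Shen, a standard first-order heuristic for approximately solving this program; the function name svt_complete and the parameter choices tau and step are illustrative assumptions, not values prescribed by the paper.

```python
import numpy as np

def svt_complete(M_obs, mask, tau, step=1.0, iters=1000):
    # SVT heuristic for nuclear norm minimization:
    #   minimize ||X||_*  subject to  X[i, j] = M[i, j] on observed entries.
    # M_obs: matrix holding the observed entries (zeros elsewhere).
    # mask:  boolean array, True where an entry was observed.
    # tau:   shrinkage threshold (larger tau better approximates the program).
    Y = np.zeros_like(M_obs)  # running dual variable
    for _ in range(iters):
        # Proximal step: soft-threshold the singular values of Y by tau.
        U, s, Vt = np.linalg.svd(Y, full_matrices=False)
        X = U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt
        # Ascent step: push X toward M on the observed entries only.
        Y += step * mask * (M_obs - X)
    return X

# Hypothetical usage: recover a random rank-2 matrix from ~40% of entries.
rng = np.random.default_rng(0)
n, r = 60, 2
M = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))  # rank-r truth
mask = rng.random((n, n)) < 0.4
X_hat = svt_complete(M * mask, mask, tau=5 * n)
print(np.linalg.norm(X_hat - M) / np.linalg.norm(M))  # relative error
```

When the number of observed entries is comfortably above the nr polylog(n) threshold discussed in the abstract, the printed relative error should be small; in the undersampled regime the iteration returns a poor fit, consistent with the information-theoretic limit.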
Pages: 2053 - 2080
Number of Pages: 28
Related Papers (50 total)
  • [1] Near-Optimal Weighted Matrix Completion
    Lopez, Oscar
    [J]. JOURNAL OF MACHINE LEARNING RESEARCH, 2023, 24
  • [2] Near-optimal sample complexity for convex tensor completion
    Ghadermarzy, Navid
    Plan, Yaniv
    Yilmaz, Ozgur
    [J]. INFORMATION AND INFERENCE-A JOURNAL OF THE IMA, 2019, 8 (03) : 577 - 619
  • [3] Near-Optimal Joint Object Matching via Convex Relaxation
    Chen, Yuxin
    Guibas, Leonidas J.
    Huang, Qi-Xing
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 32 (CYCLE 2), 2014, 32 : 100 - 108
  • [4] Optimal Tuning-Free Convex Relaxation for Noisy Matrix Completion
    Yang, Yuepeng
    Ma, Cong
    [J]. IEEE TRANSACTIONS ON INFORMATION THEORY, 2023, 69 (10) : 6571 - 6585
  • [5] The power of convex relaxation: the surprising stories of matrix completion and compressed sensing
    Candes, Emmanuel J.
    [J]. PROCEEDINGS OF THE TWENTY-FIRST ANNUAL ACM-SIAM SYMPOSIUM ON DISCRETE ALGORITHMS, 2010, 135 : 1321 - 1321
  • [6] Matrix Completion With Column Manipulation: Near-Optimal Sample-Robustness-Rank Tradeoffs
    Chen, Yudong
    Xu, Huan
    Caramanis, Constantine
    Sanghavi, Sujay
    [J]. IEEE TRANSACTIONS ON INFORMATION THEORY, 2016, 62 (01) : 503 - 526
  • [7] Near-optimal method for highly smooth convex optimization
    Bubeck, Sebastien
    Jiang, Qijia
    Lee, Yin Tat
    Li, Yuanzhi
    Sidford, Aaron
    [J]. CONFERENCE ON LEARNING THEORY, VOL 99, 2019, 99
  • [8] NEAR-OPTIMAL ALGORITHMS FOR ONLINE MATRIX PREDICTION
    Hazan, Elad
    Kale, Satyen
    Shalev-Shwartz, Shai
    [J]. SIAM JOURNAL ON COMPUTING, 2017, 46 (02) : 744 - 773
  • [9] Near-Optimal Asymmetric Binary Matrix Partitions
    Abed, Fidaa
    Caragiannis, Ioannis
    Voudouris, Alexandros A.
    [J]. ALGORITHMICA, 2018, 80 (01) : 48 - 72