Square Deal: Lower Bounds and Improved Relaxations for Tensor Recovery

Cited by: 0
Authors
Mu, Cun [1 ]
Huang, Bo [1 ]
Wright, John [2 ]
Goldfarb, Donald [1 ]
Affiliations
[1] Columbia Univ, Dept Ind Engn & Operat Res, New York, NY 10027 USA
[2] Columbia Univ, Dept Elect Engn, New York, NY 10027 USA
Keywords
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
Recovering a low-rank tensor from incomplete information is a recurring problem in signal processing and machine learning. The most popular convex relaxation of this problem minimizes the sum of the nuclear norms (SNN) of the unfolding matrices of the tensor. We show that this approach can be substantially suboptimal: reliably recovering a K-way $n \times n \times \cdots \times n$ tensor of Tucker rank $(r, r, \ldots, r)$ from Gaussian measurements requires $\Omega(r n^{K-1})$ observations. In contrast, a certain (intractable) nonconvex formulation needs only $O(r^K + nrK)$ observations. We introduce a simple, new convex relaxation, which partially bridges this gap. Our new formulation succeeds with $O(r^{\lfloor K/2 \rfloor} n^{\lceil K/2 \rceil})$ observations. The lower bound for the SNN model follows from our new result on recovering signals with multiple structures (e.g., sparse, low-rank), which indicates the significant suboptimality of the common approach of minimizing the sum of individual sparsity-inducing norms (e.g., $\ell_1$, nuclear norm). Our new tractable formulation for low-rank tensor recovery shows how the sample complexity can be reduced by designing convex regularizers that exploit several structures jointly.
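To make the two regularizers in the abstract concrete: the SNN approach sums the nuclear norms of the K mode unfoldings, while the new relaxation (the "square deal" of the title) instead penalizes the nuclear norm of a single, nearly square reshaping of the tensor, grouping $\lfloor K/2 \rfloor$ modes into rows and $\lceil K/2 \rceil$ modes into columns. Below is a minimal NumPy sketch of both quantities; the tensor sizes, the random Tucker-rank construction, and the helper names (`unfold`, `snn`, `square_nuclear`) are illustrative assumptions, not code from the paper.

```python
# Minimal sketch (not from the paper) of the two regularizers compared
# in the abstract: the sum of nuclear norms (SNN) of the mode unfoldings,
# and the nuclear norm of a nearly square reshaping of the tensor.
import numpy as np

def unfold(T, mode):
    """Mode-`mode` unfolding: move that mode to the front, flatten the rest."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def snn(T):
    """SNN regularizer: sum of nuclear norms of all K mode unfoldings."""
    return sum(np.linalg.norm(unfold(T, k), ord='nuc') for k in range(T.ndim))

def square_nuclear(T):
    """Nuclear norm of the square reshaping: the first floor(K/2) modes
    become rows, the remaining ceil(K/2) modes become columns."""
    rows = int(np.prod(T.shape[:T.ndim // 2]))
    return np.linalg.norm(T.reshape(rows, -1), ord='nuc')

# Example: a 4-way tensor of Tucker rank (r, r, r, r), built by multiplying
# a random core by random factor matrices (hypothetical sizes, illustration only).
n, r, K = 10, 2, 4
T = np.random.randn(*([r] * K))  # core tensor
for k in range(K):
    U = np.random.randn(n, r)
    # Mode-k product: contract U with mode k, then move that mode back.
    T = np.moveaxis(np.tensordot(U, np.moveaxis(T, k, 0), axes=(1, 0)), 0, k)

print("SNN value:          ", snn(T))
print("Square nuclear norm:", square_nuclear(T))
```

For this 4-way example the square reshaping produces an $n^2 \times n^2$ matrix of rank at most $r^2$, which is the structure behind the improved $O(r^{\lfloor K/2 \rfloor} n^{\lceil K/2 \rceil})$ sample-complexity bound stated in the abstract.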
Pages: 73-81
Number of pages: 9