Seminar

Recovery Guarantee of Weighted Low-Rank Approximation via Alternating Minimization


Wednesday, April 06, 2016, 11:00am


Many applications require recovering a ground-truth low-rank matrix from noisy observations of its entries. In practice, this is typically formulated as a weighted low-rank approximation problem and solved with non-convex optimization heuristics such as alternating minimization. Such non-convex techniques come with few guarantees; worse, weighted low-rank approximation is NP-hard even in the simplest case, when the ground truth is a rank-1 matrix.

Under moderate assumptions on the weight matrix and the ground truth, we prove that the vanilla alternating minimization algorithm, augmented with a simple and cheap "clipping" step that zeroes out rows with high ℓ2 norms in the intermediate solutions, recovers the ground truth. In particular, we bound the spectral norm of the difference between the recovered matrix and the ground truth by the spectral norm of the weighted noise, plus an additive error term that decreases exponentially with the number of rounds of alternating minimization. This provides the first recovery guarantee for weighted low-rank approximation via alternating minimization with non-binary deterministic weights. It is a significant generalization of the results for matrix completion, the special case with binary weights, since our assumptions are similar to or weaker than those made in existing works. Additionally, our proof applies under random initialization, unlike prior approaches that typically require an SVD-based initialization.
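The procedure described above can be sketched as follows. This is a minimal illustrative implementation only, not the speaker's exact algorithm: the objective ||W ∘ (M − UVᵀ)||_F², the clip_factor parameter, and the median-based clipping threshold are assumptions made for the sketch.

```python
import numpy as np

def clip_rows(X, clip_factor):
    # "Clipping": zero out rows whose l2 norm is unusually large.
    # Thresholding at clip_factor times the median row norm is an
    # assumption of this sketch, not the talk's exact rule.
    norms = np.linalg.norm(X, axis=1)
    X[norms > clip_factor * np.median(norms)] = 0.0

def weighted_alt_min(M, W, r, rounds=30, clip_factor=5.0, seed=0):
    """Alternating minimization for min_{U,V} ||W * (M - U V^T)||_F^2,
    with random initialization and row clipping after each update."""
    n, m = M.shape
    rng = np.random.default_rng(seed)
    V = rng.standard_normal((m, r))   # random initialization, no SVD needed
    U = np.zeros((n, r))
    for _ in range(rounds):
        # Fix V; each row of U solves a small weighted least-squares problem.
        for i in range(n):
            w = W[i]
            U[i] = np.linalg.lstsq(V * w[:, None], w * M[i], rcond=None)[0]
        clip_rows(U, clip_factor)
        # Fix U; each row of V (i.e., each column of M) is handled likewise.
        for j in range(m):
            w = W[:, j]
            V[j] = np.linalg.lstsq(U * w[:, None], w * M[:, j], rcond=None)[0]
        clip_rows(V, clip_factor)
    return U, V
```

With all-ones weights this reduces to ordinary alternating least squares; binary weights give the matrix-completion special case mentioned in the abstract.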

Speaker: Yingyu Liang


Location: Core A (Room 301)

Committee

Eric Allender, Pranjal Awasthi, Michael Saks and Mario Szegedy

Event Type: Seminar

Organization

Princeton University