

Implicit Regularization for Optimal Sparse Recovery



Wednesday, October 02, 2019, 09:45am



We present an implicit regularization scheme for gradient descent methods applied to unpenalized least squares regression to solve the problem of reconstructing a sparse signal from an underdetermined system of linear measurements under the restricted isometry assumption. For a given parameterization yielding a non-convex optimization problem, we show that prescribed choices of initialization, step size and stopping time yield a statistically and computationally optimal algorithm that achieves the minimax rate with the same cost required to read the data up to poly-logarithmic factors. Beyond minimax optimality, we show that our algorithm adapts to instance difficulty and yields a dimension-independent rate when the signal-to-noise ratio is high enough. We validate our findings with numerical experiments and compare our algorithm against explicit $\ell_{1}$ penalization. Going from hard instances to easy ones, our algorithm is seen to undergo a phase transition, eventually matching least squares with an oracle knowledge of the true support.
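To give a concrete feel for this kind of scheme, the following is a minimal NumPy sketch of gradient descent on an unpenalized least squares objective under a non-convex reparameterization, with small initialization, a fixed step size, and early stopping acting as the implicit regularizer. The parameterization w = u*u - v*v, the Gaussian design, and every constant below are our own illustrative choices, not details taken from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative problem sizes (assumed, not from the talk): n measurements,
# d-dimensional signal with k nonzeros, so the linear system is underdetermined.
n, d, k = 150, 300, 5
X = rng.standard_normal((n, d)) / np.sqrt(n)   # Gaussian design, RIP-style column scaling
w_star = np.zeros(d)
w_star[:k] = 1.0                               # planted sparse signal
y = X @ w_star + 0.01 * rng.standard_normal(n) # noisy linear measurements

# Non-convex reparameterization w = u*u - v*v (one common choice in this
# line of work). Small initialization alpha, step size eta, and stopping
# time T together play the role of the regularizer.
alpha, eta, T = 1e-6, 0.1, 2000  # assumed values for illustration
u = alpha * np.ones(d)
v = alpha * np.ones(d)
for _ in range(T):
    g = X.T @ (X @ (u * u - v * v) - y)  # gradient of the least squares loss in w
    # chain rule through w = u*u - v*v
    u, v = u - eta * 2 * g * u, v + eta * 2 * g * v

w_hat = u * u - v * v
err = np.linalg.norm(w_hat - w_star)
print(f"recovery error: {err:.4f}")
```

With a small enough initialization, coordinates on the true support grow quickly while off-support coordinates stay near zero until long after the stopping time, which is the mechanism by which early stopping mimics sparsity-inducing penalization.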

Based on joint work with Patrick Rebeschini and Tomas Vaskevicius.

Speaker: Varun Kanade

Location: CoRE A 301


Rutgers/DiMACS Theory of Computing Seminar

Event Type: Seminar