ESE Spring Seminar – “Solving Inverse Problems with Generative Priors: From Low-rank to Diffusion Models”
April 10, 2024 at 1:30 PM - 2:30 PM
Abstract: Generative priors are effective countermeasures against the curse of dimensionality in data science, enabling efficient learning and inversion in problems that are otherwise ill-posed. This talk begins with the classical low-rank prior and introduces scaled gradient descent (ScaledGD), a simple iterative approach that directly recovers the low-rank factors in a wide range of matrix and tensor estimation tasks. ScaledGD provably converges linearly at a constant rate independent of the condition number, at near-optimal sample complexities, while maintaining the low per-iteration cost of vanilla gradient descent, even when the rank is overspecified and the initialization is random. Going beyond low rank, the talk discusses diffusion models as an expressive data prior for inverse problems and introduces a plug-and-play posterior sampling method (Diffusion PnP) that alternately calls two samplers: a proximal consistency sampler based solely on the forward model, and a denoising diffusion sampler based solely on the score functions of the data prior. Performance guarantees and numerical examples will be presented to illustrate the promise of both approaches.
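To make the ScaledGD idea concrete, here is a minimal sketch for the simplest special case, low-rank matrix factorization under full observation (the talk covers far more general matrix and tensor estimation tasks, and this sketch is not the speaker's implementation). The key step is that each factor's gradient is preconditioned by the inverse Gram matrix of the other factor, which is what removes the dependence on the condition number; following the abstract, the sketch uses a small random initialization.

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 50, 3

# Ground-truth rank-r matrix with a poor condition number (~100)
U = rng.standard_normal((n, r))
V = rng.standard_normal((n, r))
Ystar = (U * np.array([100.0, 10.0, 1.0])) @ V.T

# Small random initialization of the factors, as in the abstract's setting
L = 0.1 * rng.standard_normal((n, r))
R = 0.1 * rng.standard_normal((n, r))

eta = 0.5  # step size; constant, not tuned to the condition number
for _ in range(1000):
    resid = L @ R.T - Ystar  # gradient of 0.5 * ||L R^T - Y||_F^2 w.r.t. L R^T
    # Scaled (preconditioned) updates: note (R^T R)^{-1} and (L^T L)^{-1},
    # which equalize convergence across singular-value scales
    L_new = L - eta * resid @ R @ np.linalg.inv(R.T @ R)
    R = R - eta * resid.T @ L @ np.linalg.inv(L.T @ L)
    L = L_new

err = np.linalg.norm(L @ R.T - Ystar) / np.linalg.norm(Ystar)
```

Dropping the two inverse Gram factors recovers vanilla gradient descent on the factors, whose rate degrades with the condition number; the preconditioners cost only r-by-r inversions per iteration, so the per-iteration complexity stays essentially that of gradient descent.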
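The alternation behind the plug-and-play sampler can be illustrated with a toy split-Gibbs sketch on a linear inverse problem y = A x + noise. Everything here is an illustrative assumption, not the speaker's method: a Gaussian N(0, I) prior stands in for the diffusion prior so both conditionals have closed forms (a real implementation would replace the second step with a score-based diffusion sampler), and the coupling strength rho is held fixed rather than annealed.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy linear inverse problem: y = A x + noise
d, m = 2, 3
A = rng.standard_normal((m, d))
x_true = np.array([1.0, -2.0])
sigma, rho = 0.5, 0.3            # observation noise level, splitting strength
y = A @ x_true + sigma * rng.standard_normal(m)

# The consistency conditional p(x | z, y) is Gaussian; precompute its covariance
Cx = np.linalg.inv(A.T @ A / sigma**2 + np.eye(d) / rho**2)
Cx_chol = np.linalg.cholesky(Cx)

z = np.zeros(d)
samples = []
for t in range(20000):
    # Step 1: proximal consistency sampler -- uses only the forward model (A, y)
    mean_x = Cx @ (A.T @ y / sigma**2 + z / rho**2)
    x = mean_x + Cx_chol @ rng.standard_normal(d)
    # Step 2: denoising sampler -- uses only the prior (here N(0, I);
    # a diffusion model would supply this step via its score functions)
    mean_z = x / (1.0 + rho**2)
    z = mean_z + np.sqrt(rho**2 / (1.0 + rho**2)) * rng.standard_normal(d)
    if t >= 500:                 # discard burn-in
        samples.append(x)

x_mean = np.mean(samples, axis=0)

# For this Gaussian toy case the chain's stationary x-marginal is the posterior
# with prior variance inflated from 1 to 1 + rho^2, so its mean has a closed form
target = np.linalg.solve(A.T @ A / sigma**2 + np.eye(d) / (1.0 + rho**2),
                         A.T @ y / sigma**2)
```

The point of the splitting is visible in the two steps: neither sampler ever needs the other's information, so a pretrained diffusion prior can be plugged into step 2 without retraining for each new forward model.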

