Inference Using Alternating Minimization

Event Speaker: Sujay Sanghavi (U. Texas, Austin)

Event Location: 32-141

Event Date/Time: Tuesday, October 8, 2013 - 4:00pm

Reception to follow.
 
Abstract:
Alternating minimization (AltMin) refers to the popular strategy of solving a non-convex problem by partitioning the variables into two sets, then alternating between holding one set fixed and optimizing over the other. In many large-scale inference settings, AltMin is the most computationally efficient method to implement; however, it typically comes with no performance guarantees.
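
To make the alternating structure concrete, below is a minimal NumPy sketch of AltMin for low-rank matrix completion, one of the problems discussed in the talk. The function name, the observation mask Omega, the random initialization, and the iteration count are illustrative assumptions, not the speaker's exact algorithm.

    import numpy as np

    def altmin_matrix_completion(M_obs, Omega, rank, n_iters=50):
        # Illustrative AltMin sketch (not the speaker's exact method): factor the
        # partially observed matrix as U V^T, alternately holding one factor fixed
        # and solving small least-squares problems for the other.
        # M_obs: (m, n) matrix of observations; Omega: boolean mask of observed entries.
        m, n = M_obs.shape
        rng = np.random.default_rng(0)
        U = rng.standard_normal((m, rank))
        V = rng.standard_normal((n, rank))
        for _ in range(n_iters):
            # Hold U fixed: each column j of M gives a least-squares problem for V[j].
            for j in range(n):
                rows = np.flatnonzero(Omega[:, j])
                if rows.size >= rank:
                    V[j] = np.linalg.lstsq(U[rows], M_obs[rows, j], rcond=None)[0]
            # Hold V fixed: each row i of M gives a least-squares problem for U[i].
            for i in range(m):
                cols = np.flatnonzero(Omega[i, :])
                if cols.size >= rank:
                    U[i] = np.linalg.lstsq(V[cols], M_obs[i, cols], rcond=None)[0]
        return U, V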

In this talk we provide the first rigorous performance guarantees for AltMin in three non-convex inference problems where it is widely used in practice: phase retrieval, mixed linear regression, and low-rank matrix completion. Our analysis rests on two crucial modifications, a smart initialization and re-sampling, which yield global linear convergence of the AltMin estimate using a number of samples at most logarithmic factors away from lower bounds.
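
For intuition about the initialization step, the sketch below shows one common form of spectral initialization, written here for phase retrieval; the precise initialization and the re-sampling schedule used in the analysis are detailed in the papers, so this is only an assumed illustration.

    import numpy as np

    def spectral_init(A, y):
        # Spectral initialization sketch (assumed, for illustration): take the leading
        # eigenvector of the measurement-weighted matrix (1/m) * sum_i y_i^2 a_i a_i^T
        # as the starting point before the alternating steps.
        # A: (m, n) sensing matrix; y: (m,) observed magnitudes |<a_i, x*>|.
        m, n = A.shape
        S = (A * (y ** 2)[:, None]).T @ A / m
        eigvals, eigvecs = np.linalg.eigh(S)
        return eigvecs[:, -1]   # unit-norm estimate of the signal direction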

Related papers appeared in STOC 2013 and NIPS 2013 and are also available on arXiv.

Biography:
Sujay Sanghavi is an Assistant Professor in Electrical and Computer Engineering at UT Austin, which he joined in 2009. He holds a B.Tech in EE from IIT Bombay, an MS and PhD in ECE, and an MS in Math, all from UIUC, and was a postdoc in LIDS at MIT. His research lies at the intersection of networks, machine learning, and signal processing. He is a recipient of the NSF CAREER award and of a Young Investigator award from DTRA.