Doctoral Thesis: Latent Variable Model Estimation via Collaborative Filtering


Event Speaker: 

Christina E. Lee

Event Location: 

32-G449 KIVA

Event Date/Time: 

Monday, July 31, 2017 - 2:00pm


Similarity-based collaborative filtering for matrix completion is a popular heuristic that has been widely used across industry in recent decades to build recommendation systems, owing to its simplicity and scalability. Despite this popularity, there has been little theoretical foundation explaining its widespread success. In this thesis, we prove theoretical guarantees for collaborative filtering under a nonparametric latent variable model, which arises from the natural property of "exchangeability", i.e. invariance under relabeling of the dataset. The analysis suggests that similarity-based collaborative filtering can be viewed as kernel regression for latent variable models, where the features are not directly observed and the kernel must be estimated from the data. In addition, while classical collaborative filtering typically requires a dense dataset, this thesis proposes a new collaborative filtering algorithm that compares larger-radius neighborhoods of the data to compute similarities, and shows that the estimate converges even for very sparse datasets, which has implications for sparse graphon estimation. The algorithms can be applied in a variety of settings, such as recommendations for online markets, analysis of social networks, and denoising of crowdsourced labels.
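To make the kernel-regression view concrete, here is a minimal sketch of classical similarity-based collaborative filtering: the similarity between two users acts as a kernel weight, and a missing entry is estimated by a similarity-weighted average of the observed ratings. The function names, the choice of cosine similarity, and the toy data are illustrative assumptions, not the specific algorithm analyzed in the thesis.

```python
from math import sqrt

def similarity(u, v):
    """Cosine similarity computed over commonly observed entries.
    This plays the role of the kernel, estimated from the data itself."""
    common = [i for i in range(len(u)) if u[i] is not None and v[i] is not None]
    if not common:
        return 0.0
    dot = sum(u[i] * v[i] for i in common)
    nu = sqrt(sum(u[i] ** 2 for i in common))
    nv = sqrt(sum(v[i] ** 2 for i in common))
    return dot / (nu * nv) if nu and nv else 0.0

def predict(ratings, user, item):
    """Estimate ratings[user][item] by a similarity-weighted average over
    other users who rated the item (kernel regression with an estimated kernel)."""
    num = den = 0.0
    for other, row in enumerate(ratings):
        if other == user or row[item] is None:
            continue
        w = similarity(ratings[user], row)
        num += w * row[item]
        den += abs(w)
    return num / den if den else None

# Toy ratings matrix: rows are users, columns are items, None = unobserved.
ratings = [
    [5, 4, None, 1],
    [4, 5, 2, None],
    [1, None, 5, 4],
]

print(predict(ratings, 0, 2))
```

Note that this classical scheme needs enough commonly observed entries per user pair to estimate the kernel, which is why it favors dense datasets; the thesis's larger-radius neighborhood comparison is what relaxes that requirement in the sparse regime.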

Thesis Supervisors: Devavrat Shah and Asuman Ozdaglar