Doctoral Thesis: Representation and Transfer Learning Using Information-Theoretic Approximations

Event Speaker: David Da Qiu

Event Location: via Zoom (details below)

Event Date/Time: Thursday, April 23, 2020 - 2:00pm

Abstract: 
 
Learning informative and transferable feature representations is a key aspect of machine learning systems. Mutual information and Kullback-Leibler divergence are principled and widely used quantities for measuring feature relevance and matching distributions, respectively. However, clean formulations of machine learning algorithms based on these information-theoretic quantities typically require density estimation, which can be difficult in high-dimensional problems. A central theme of this thesis is to translate these formulations into simpler forms that are more amenable to limited data. In particular, we modify local approximations and variational approximations of information-theoretic quantities to propose algorithms for unsupervised and transfer learning. Experiments show that the representations learned by our algorithms perform competitively with popular methods of higher complexity.
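The abstract mentions variational approximations of information-theoretic quantities that sidestep density estimation. As one illustration only (a standard construction, the Donsker-Varadhan lower bound on KL divergence, and not necessarily the formulation developed in the thesis), the sketch below evaluates the bound purely from samples and checks it against a Gaussian pair whose true divergence is known; the distributions and critics are illustrative assumptions.

# Minimal sketch (assumed example, not the thesis's method) of the
# Donsker-Varadhan variational lower bound on KL divergence:
#   KL(P || Q) >= E_P[T(x)] - log E_Q[exp(T(x))]   for any critic T,
# estimated from samples alone, with no density estimation required.
# Here P = N(1, 1) and Q = N(0, 1), so the true KL is 1/2 and the
# optimal critic is the log-density ratio T*(x) = x - 1/2.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
x_p = rng.normal(1.0, 1.0, n)  # samples from P
x_q = rng.normal(0.0, 1.0, n)  # samples from Q

def dv_bound(T, xp, xq):
    """Donsker-Varadhan bound from samples of P and Q under critic T."""
    return T(xp).mean() - np.log(np.exp(T(xq)).mean())

T_star = lambda x: x - 0.5          # optimal critic: bound is tight
T_weak = lambda x: 0.5 * (x - 0.5)  # suboptimal critic: looser bound

print(f"true KL          : {0.5:.4f}")
print(f"DV bound with T* : {dv_bound(T_star, x_p, x_q):.4f}")  # ~0.50
print(f"DV bound, weak T : {dv_bound(T_weak, x_p, x_q):.4f}")  # ~0.375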
 
Committee: 
Lizhong Zheng, Thesis Supervisor
Gregory Wornell
Guy Bresler
 
For details on attending this thesis defense, please contact the doctoral candidate:
davidq at mit dot edu