Yasaman Bahri – Tackling the Complexity of Modern Machine Learning

Monday, March 28
1:00 pm - 2:00 pm

Zoom

Abstract:
Deep neural networks are a rich class of function approximators, now ubiquitous in modern machine learning systems, yet our understanding of them is not fully developed. Can we quantitatively characterize how they function? In this talk, I will describe my research on building foundations for deep learning, some of which draws inspiration from theoretical physics. I will present three threads from my recent work. First, I will describe exact connections between deep neural networks, in the limit of infinitely wide hidden layers, and Gaussian processes with their associated kernels. Second, I will discuss an equivalence between deep neural networks and linear models and characterize a nonlinear regime where the equivalence breaks down. Third, I will discuss scaling trends in the performance of supervised deep learning in practice. Along the way, I will highlight some of the perspectives gained and the remaining challenges. Finally, I will conclude by discussing my broader research interests and opportunities at the intersection of machine learning and physical science.
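
As a concrete illustration of the first thread, the following minimal sketch (illustrative only, not code from the talk) compares the output covariance of randomly initialized one-hidden-layer ReLU networks against the analytic kernel of the corresponding Gaussian process, the order-1 arc-cosine kernel (Cho & Saul, 2009). In the infinite-width limit, the network's outputs at initialization form exactly a Gaussian process with this covariance. All dimensions, widths, and sample counts below are arbitrary choices for the demonstration.

    import numpy as np

    rng = np.random.default_rng(0)
    d, H, M = 10, 2048, 10000  # input dim, hidden width, number of sampled networks

    x1 = rng.normal(size=d)
    x2 = rng.normal(size=d)

    # Sample M randomly initialized networks f(x) = v . relu(W x), with
    # W_ij ~ N(0, 1/d) and v_h ~ N(0, 1/H), recording outputs on x1 and x2.
    outs = np.empty((M, 2))
    for m in range(M):
        W = rng.normal(scale=1.0 / np.sqrt(d), size=(H, d))
        v = rng.normal(scale=1.0 / np.sqrt(H), size=H)
        outs[m, 0] = v @ np.maximum(W @ x1, 0.0)
        outs[m, 1] = v @ np.maximum(W @ x2, 0.0)

    # Monte Carlo estimate of Cov[f(x1), f(x2)] over random initializations
    # (the mean of f is zero by symmetry of the weight distribution).
    empirical = np.mean(outs[:, 0] * outs[:, 1])

    # Analytic NNGP kernel for ReLU (arc-cosine kernel of order 1):
    #   K(x1, x2) = s1 * s2 / (2*pi) * (sin(t) + (pi - t) * cos(t)),
    # where t is the angle between x1 and x2 and s_i = ||x_i|| / sqrt(d).
    s1 = np.linalg.norm(x1) / np.sqrt(d)
    s2 = np.linalg.norm(x2) / np.sqrt(d)
    cos_t = x1 @ x2 / (np.linalg.norm(x1) * np.linalg.norm(x2))
    t = np.arccos(np.clip(cos_t, -1.0, 1.0))
    analytic = s1 * s2 / (2 * np.pi) * (np.sin(t) + (np.pi - t) * np.cos(t))

    print(f"empirical covariance: {empirical:.4f}   analytic kernel: {analytic:.4f}")

Increasing the number of sampled networks M tightens the Monte Carlo estimate; increasing the width H makes the joint distribution of outputs increasingly Gaussian, which is the content of the infinite-width correspondence.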


Bio:
Yasaman Bahri is a Research Scientist at Google Brain. Her research interests lie at the intersection of machine learning and physical science. Prior to joining Google Brain, she completed her Ph.D. in Physics (2017) at UC Berkeley, specializing in quantum condensed matter theory; her doctoral thesis proposed and investigated new phases of quantum matter. Her undergraduate studies were also at Berkeley, where she received B.A. degrees in Physics and Mathematics with highest honors and won the Departmental Citation Award. She is a recipient of the NSF Graduate Research Fellowship and the Rising Stars Award in EECS.
