Deep physical neural networks: training physical systems like neural networks

Thursday, March 10
11:00 am - 12:00 pm

Grier A (34-401A)

Logan Wright

Deep learning has proven to be a remarkably versatile and scalable technique for learning algorithms that process and interact with noisy, high-dimensional real-world data and systems. In deep learning, the backpropagation algorithm adjusts the parameters of a multi-layer (deep) neural network so that the network “learns” to perform desired mathematical functions. Here, I will discuss my work to adapt this procedure to train networks of controllable physical systems, called physical neural networks (PNNs), which directly learn physical functions, such as performing machine learning inference calculations [1]. I will present proof-of-concept PNNs we have constructed to perform image and audio classification, based on ultrafast nonlinear photonics, bulk analog electronics, and mechanics. Because PNNs learn physical transformations directly at the level of the hardware physics, without relying on predefined mathematical isomorphisms, they may harness noisy, analog physical processes for computation more efficiently and opportunistically than traditional approaches. More broadly, PNNs form the basis for a learning-based approach to the design and programming of physical devices, which may perform complex physical functions in non-digital domains, e.g., for sensing.
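To make the training procedure in the abstract concrete, here is a minimal NumPy sketch of backpropagation adjusting the parameters of a single nonlinear layer. Everything here is an illustrative assumption (the `tanh` forward map, the toy data, the learning rate); in an actual PNN the forward pass would be carried out by physical hardware rather than simulated in software.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for a controllable physical transformation:
# a parameterized nonlinear map y = tanh(W x + b). In a real PNN this
# forward pass would be executed by the hardware; here it is simulated.
def forward(W, b, X):
    return np.tanh(W @ X + b[:, None])

# Toy regression target the trainable layer should learn to reproduce.
W_true = rng.normal(size=(2, 3))
b_true = rng.normal(size=2)
X = rng.normal(size=(3, 200))
Y = forward(W_true, b_true, X)

# Trainable parameters, updated by gradient descent via backpropagation.
W = rng.normal(size=(2, 3)) * 0.1
b = np.zeros(2)
lr = 0.1
for step in range(2000):
    out = forward(W, b, X)
    err = out - Y                   # dL/d(out) for mean-squared error
    grad_pre = err * (1 - out**2)   # backpropagate through tanh
    W -= lr * grad_pre @ X.T / X.shape[1]
    b -= lr * grad_pre.mean(axis=1)

mse = float(np.mean((forward(W, b, X) - Y) ** 2))
print(f"final MSE: {mse:.6f}")
```

The loop illustrates the generic principle the talk builds on: a differentiable model of the system provides gradients, and the parameters are nudged until the learned transformation matches the desired one.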

Logan G. Wright is a postdoctoral research scientist at Cornell University and NTT Research. His research focuses on physical information processing, primarily with nonlinear and quantum optical systems. He received his PhD in 2018 from Cornell University, where he studied ultrafast laser systems and multimode nonlinear optical waves. He is the recipient of several awards, including the Tingye Li Innovation Prize (2018) and Cornell's William Nichols Findley Award (2015, 2018).

Host: Marc Baldo

