Doctoral Thesis: Understanding the Challenges of Learning Pole Locations and Motivating the Design of Echo State Networks
Haus Room (36-428)
Presenter: Alexander Morgan
Presenter’s Affiliation: RLE
Thesis Supervisor(s): Lizhong Zheng
Abstract:
Reservoir computing is a framework for learning from time-series data that uses a fixed dynamical system to form a state representation of the input history. A widely adopted architecture for reservoir computing is the echo state network, which specifies this dynamical system through a mathematical update rule. In particular, echo state networks are recurrent neural networks whose input and internal weights are randomly initialized and held fixed during training, making learning convex and efficient. To better understand and motivate this approach, we analyze the problem of learning system parameters in echo state networks. Our analysis reveals that, in principle, such parameters can be learned from finitely many samples of a single example input-output pair, but that learning the internal reservoir weights is generally non-convex and often ill-conditioned, which causes difficulties in practice. From this, we motivate ways of leveraging domain knowledge to design reservoirs and select internal parameters, leading to the principled application of echo state networks.
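The mechanism the abstract describes can be illustrated with a minimal sketch. The code below is not from the thesis; the reservoir size, input signal, and scaling scheme are illustrative assumptions. It builds a random reservoir with the standard update rule x' = tanh(W x + w_in u), holds W and w_in fixed, and checks the "echo state" behavior: two different initial states driven by the same input converge, so the state depends on input history rather than initialization.

```python
import math
import random

random.seed(0)
N = 20  # reservoir size (arbitrary illustrative choice)

# Random, fixed internal weights. Echo state networks typically scale W so its
# spectral radius is below 1; here we use a cruder infinity-norm (row-sum)
# bound, which also guarantees the update map is a contraction.
W = [[random.uniform(-1.0, 1.0) for _ in range(N)] for _ in range(N)]
scale = 0.9 / max(sum(abs(w) for w in row) for row in W)
W = [[w * scale for w in row] for row in W]
w_in = [random.uniform(-1.0, 1.0) for _ in range(N)]  # fixed input weights

def step(x, u):
    """One reservoir update: x' = tanh(W x + w_in * u)."""
    return [math.tanh(sum(W[i][j] * x[j] for j in range(N)) + w_in[i] * u)
            for i in range(N)]

# Drive two different initial states with the same input sequence.
inputs = [math.sin(0.3 * t) for t in range(200)]
xa = [1.0] * N
xb = [-1.0] * N
for u in inputs:
    xa, xb = step(xa, u), step(xb, u)

# The trajectories forget their initial conditions: the gap shrinks
# geometrically because tanh is 1-Lipschitz and ||W||_inf = 0.9.
gap = max(abs(a - b) for a, b in zip(xa, xb))
```

Only the readout on top of such states is trained in an echo state network, which is why learning reduces to a convex (linear regression) problem; the analysis in the talk concerns what happens when the internal weights W themselves must be learned.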
Details
- Date: Wednesday, April 22
- Time: 8:30 am - 10:00 am
- Location: Haus Room (36-428)