Echo State Networks

Echo State Networks (ESNs) are a type of recurrent neural network whose training consists of solving a linear system, with no need for gradient descent [1,2]. ESNs have been shown to time-accurately predict dynamical systems with accuracy comparable to that of other recurrent neural networks, such as LSTMs and GRUs [3].

Training ESNs is straightforward, but their performance is highly sensitive to the choice of hyperparameters. In this tutorial, we provide a self-contained implementation of echo state networks.
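To make the "training is a linear solve" point concrete, here is a minimal sketch of an ESN in NumPy. The reservoir size, spectral radius, input scaling, and Tikhonov regularization values are illustrative assumptions, not the tutorial's actual hyperparameters, and the toy task (one-step-ahead prediction of a sine wave) stands in for the Lorenz system used later.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative hyperparameters (not tuned): reservoir size, spectral radius,
# input scaling, and Tikhonov regularization.
N_res, N_in, spec_rad, sigma_in, tikh = 200, 1, 0.9, 0.1, 1e-6

# Random input and reservoir matrices; W is rescaled to the desired spectral radius.
W_in = sigma_in * rng.uniform(-1, 1, (N_res, N_in))
W = rng.uniform(-1, 1, (N_res, N_res))
W *= spec_rad / np.max(np.abs(np.linalg.eigvals(W)))

def run_reservoir(U):
    """Collect reservoir states for an input sequence U of shape (T, N_in)."""
    X = np.zeros((len(U), N_res))
    x = np.zeros(N_res)
    for t, u in enumerate(U):
        x = np.tanh(W_in @ u + W @ x)
        X[t] = x
    return X

# Toy task: one-step-ahead prediction of a sine wave.
ts = np.arange(2000) * 0.05
U, Y = np.sin(ts)[:-1, None], np.sin(ts)[1:, None]
X = run_reservoir(U)

# Training is a single ridge-regularized linear solve for the output matrix.
W_out = np.linalg.solve(X.T @ X + tikh * np.eye(N_res), X.T @ Y)
print(np.mean((X @ W_out - Y) ** 2))  # training mean-squared error
```

Only `W_out` is trained; `W_in` and `W` stay fixed after their random initialization, which is why no gradient descent is needed.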

We test the networks on the chaotic Lorenz system and show that a network trained in 30 seconds can predict the system for more than 10 Lyapunov times (the Lyapunov time is the inverse of the largest Lyapunov exponent).
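For reference, a sketch of the test problem: the Lorenz system at the standard parameter values, integrated here with a hand-written RK4 step (the tutorial's own integrator may differ). With the largest Lyapunov exponent of roughly 0.906, one Lyapunov time is about 1.1 time units, so a 10-Lyapunov-time horizon corresponds to about 11 time units.

```python
import numpy as np

def lorenz(q, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Right-hand side of the Lorenz system at the standard parameters."""
    x, y, z = q
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def rk4_step(f, q, dt):
    """One fourth-order Runge-Kutta step."""
    k1 = f(q)
    k2 = f(q + 0.5 * dt * k1)
    k3 = f(q + 0.5 * dt * k2)
    k4 = f(q + dt * k3)
    return q + dt / 6.0 * (k1 + 2.0 * k2 + 2.0 * k3 + k4)

dt, q = 0.01, np.array([1.0, 1.0, 1.0])
# Discard a transient so the trajectory settles onto the attractor.
for _ in range(1000):
    q = rk4_step(lorenz, q, dt)

# Largest Lyapunov exponent ~ 0.906, so 10 Lyapunov times ~ 11 time units.
lyap_time = 1.0 / 0.906
n_steps = int(10 * lyap_time / dt)
traj = np.empty((n_steps, 3))
for i in range(n_steps):
    q = rk4_step(lorenz, q, dt)
    traj[i] = q
```

`traj` is the kind of data a prediction horizon of "10 Lyapunov times" refers to: the trained network's forecast is compared against it until the error saturates.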

Lyapunov spectrum and Kaplan-Yorke dimension

In chaotic systems, the dynamics are predictable only for finite times. This is because infinitesimal errors in the prediction of the system, $\epsilon\mathbf{y}$, grow in time at an average exponential rate given by the (positive) largest Lyapunov exponent of the system.

The dynamics of infinitesimal errors are characterized by the full set of Lyapunov exponents, the Lyapunov spectrum. From the Lyapunov spectrum, the dimension of the solution, which is typically significantly smaller than the number of degrees of freedom of the system, can be estimated through the Kaplan-Yorke conjecture [5].
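Given a Lyapunov spectrum $\lambda_1 \geq \lambda_2 \geq \dots$, the Kaplan-Yorke dimension is $D_{KY} = j + \sum_{i=1}^{j} \lambda_i / |\lambda_{j+1}|$, where $j$ is the largest index for which the cumulative sum of exponents is non-negative. A small sketch of this formula (the function name is ours, for illustration):

```python
import numpy as np

def kaplan_yorke(exponents):
    """Kaplan-Yorke dimension from a Lyapunov spectrum:
    D_KY = j + (sum of the first j exponents) / |lambda_{j+1}|,
    where j is the largest index with a non-negative cumulative sum."""
    lam = np.sort(np.asarray(exponents))[::-1]  # descending order
    cum = np.cumsum(lam)
    if cum[0] < 0:
        return 0.0                 # even the largest exponent is negative
    j = np.max(np.nonzero(cum >= 0)[0]) + 1
    if j == len(lam):
        return float(len(lam))     # no negative remainder to interpolate into
    return j + cum[j - 1] / abs(lam[j])

# The Lorenz spectrum is approximately (0.906, 0, -14.57), giving D_KY ~ 2.06:
# much smaller than it could be, reflecting the low-dimensional attractor.
print(kaplan_yorke([0.906, 0.0, -14.57]))
```

For the Lorenz system this yields a dimension slightly above 2, consistent with the solution living on a thin fractal attractor inside the three-dimensional phase space.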

In this tutorial, we provide a Jacobian-free algorithm to compute the $n$ largest Lyapunov exponents and the Kaplan-Yorke dimension of dynamical systems.
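One way such a Jacobian-free computation can be done, sketched here as an assumption rather than as the tutorial's exact algorithm, is a Benettin-style method in which each tangent vector is approximated by the finite difference of a perturbed trajectory, and the set of vectors is periodically re-orthonormalized by QR decompositions; the exponents are the time-averaged logarithms of the diagonal of $R$. All step counts and tolerances below are illustrative.

```python
import numpy as np

def lorenz(q, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Right-hand side of the Lorenz system at the standard parameters."""
    x, y, z = q
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def rk4_step(f, q, dt):
    """One fourth-order Runge-Kutta step."""
    k1 = f(q)
    k2 = f(q + 0.5 * dt * k1)
    k3 = f(q + 0.5 * dt * k2)
    k4 = f(q + dt * k3)
    return q + dt / 6.0 * (k1 + 2.0 * k2 + 2.0 * k3 + k4)

def lyapunov_spectrum(f, q0, n_exp, dt=0.01, n_steps=20000, eps=1e-7, qr_every=10):
    """Jacobian-free estimate of the n_exp largest Lyapunov exponents:
    tangent vectors are approximated by finite differences of perturbed
    trajectories and re-orthonormalized by QR at fixed intervals."""
    q = q0.copy()
    for _ in range(1000):              # washout: settle onto the attractor
        q = rk4_step(f, q, dt)
    rng = np.random.default_rng(1)
    Q = np.linalg.qr(rng.standard_normal((len(q), n_exp)))[0]
    log_growth = np.zeros(n_exp)
    for step in range(n_steps):
        q_pert = q[:, None] + eps * Q  # perturbed copies (one per column)
        q = rk4_step(f, q, dt)         # advance the reference trajectory
        for k in range(n_exp):         # advance each perturbed trajectory
            q_pert[:, k] = rk4_step(f, q_pert[:, k], dt)
        Q = (q_pert - q[:, None]) / eps  # finite-difference tangent vectors
        if (step + 1) % qr_every == 0:
            Q, R = np.linalg.qr(Q)
            log_growth += np.log(np.abs(np.diag(R)))
    return log_growth / (n_steps * dt)

# For the Lorenz system the spectrum is approximately (0.91, 0, -14.6).
lams = lyapunov_spectrum(lorenz, np.array([1.0, 1.0, 1.0]), 3)
print(lams)
```

Because only evaluations of the right-hand side are needed, the same routine applies unchanged to a trained ESN run in closed loop, where an analytical Jacobian would be tedious to derive.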

References