Edge of Chaos v2

Category: Machine Learning

Summary: Mapping how spectral radius controls Lyapunov instability and memory capacity in random recurrent neural networks.


Random recurrent networks are predicted to perform best near the boundary between ordered and chaotic dynamics, often called the edge of chaos. This experiment asks where the largest Lyapunov exponent crosses zero and whether memory capacity peaks near that same transition.
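
To fix notation, here is a minimal sketch of the setup in numpy; the Gaussian initialization, the rescaling to a target spectral radius, and the helper names (make_network, step) are illustrative assumptions, not the project's actual code.

    import numpy as np

    def make_network(n, rho, rng):
        # Draw dense Gaussian weights, then rescale so the spectral radius is rho.
        W = rng.standard_normal((n, n)) / np.sqrt(n)
        W *= rho / np.max(np.abs(np.linalg.eigvals(W)))
        return W

    def step(W, x, u, w_in):
        # One update of the driven tanh dynamics: x_{t+1} = tanh(W x_t + w_in u_t).
        return np.tanh(W @ x + w_in * u)

The spectral radius rho is the control parameter: well below 1 the autonomous dynamics contract toward a fixed point, and well above 1 they are typically chaotic.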

The script builds tanh recurrent networks over a sweep of spectral radii, estimates Lyapunov exponents with a QR-based method, and evaluates memory capacity with trained linear readouts. Repeating the calculation across independently drawn networks turns the result into an ensemble map over spectral radius rather than a single anecdotal run.
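
A sketch of what such a QR-based estimate can look like for the undriven case (u_t = 0): propagate a block of tangent vectors with the Jacobian of the tanh map, re-orthonormalize at every step, and accumulate the logs of the diagonal of R. Transient and run lengths here are arbitrary assumptions.

    import numpy as np

    def lyapunov_spectrum(W, n_exp=1, t_trans=500, t_steps=5000, rng=None):
        rng = np.random.default_rng(0) if rng is None else rng
        n = W.shape[0]
        x = 0.1 * rng.standard_normal(n)
        Q = np.linalg.qr(rng.standard_normal((n, n_exp)))[0]
        lyap = np.zeros(n_exp)
        for t in range(t_trans + t_steps):
            x = np.tanh(W @ x)
            J = (1.0 - x ** 2)[:, None] * W    # Jacobian of tanh(W x) at the new state
            Q, R = np.linalg.qr(J @ Q)         # re-orthonormalize the tangent block
            if t >= t_trans:                   # discard the transient
                lyap += np.log(np.abs(np.diag(R)))
        return lyap / t_steps                  # exponents in units of 1/step

The first entry of the returned vector estimates the largest exponent; its zero crossing as a function of spectral radius marks the order-chaos transition.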

The project matters because the edge-of-chaos idea is widely cited, yet its predictions are sensitive to model details. The experiment tests the proposed link between dynamical instability and useful computation in a controlled setting.

Method: Random recurrent-network simulations sweeping spectral radius, with Lyapunov-exponent estimation and memory-capacity regression across repeated trials.
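
For the memory-capacity regression, a plausible sketch following Jaeger-style memory capacity: drive the network with white noise, fit a linear readout per delay by least squares, and sum the squared correlations between readout and delayed input. The drive distribution, delay range, and run lengths are assumptions.

    import numpy as np

    def memory_capacity(W, w_in, k_max=50, t_trans=500, t_steps=5000, rng=None):
        rng = np.random.default_rng(1) if rng is None else rng
        n = W.shape[0]
        u = rng.uniform(-1.0, 1.0, t_trans + t_steps)      # i.i.d. white-noise input
        X = np.zeros((t_steps, n))
        x = np.zeros(n)
        for t in range(t_trans + t_steps):
            x = np.tanh(W @ x + w_in * u[t])
            if t >= t_trans:
                X[t - t_trans] = x                         # keep post-transient states
        mc = 0.0
        for k in range(1, k_max + 1):
            target = u[t_trans - k : t_trans - k + t_steps]    # input delayed by k
            beta = np.linalg.lstsq(X, target, rcond=None)[0]   # linear readout weights
            mc += np.corrcoef(X @ beta, target)[0, 1] ** 2     # squared correlation
        return mc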

What is measured: Largest Lyapunov exponent, memory capacity, dependence on spectral radius, and variability across random network draws.
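
Putting the pieces together, a hypothetical driver for the ensemble map; it reuses make_network, lyapunov_spectrum, and memory_capacity from the sketches above, and the sweep range, ensemble size, and network size are assumptions.

    import numpy as np

    rhos = np.linspace(0.5, 2.0, 16)      # assumed spectral-radius sweep
    n_nets, n = 20, 200                   # assumed ensemble and network sizes
    results = []
    for rho in rhos:
        for seed in range(n_nets):
            rng = np.random.default_rng(seed)       # one independent draw per seed
            W = make_network(n, rho, rng)
            w_in = 0.1 * rng.standard_normal(n)
            lam = lyapunov_spectrum(W, n_exp=1, rng=rng)[0]   # largest exponent
            mc = memory_capacity(W, w_in, rng=rng)
            results.append((rho, seed, lam, mc))

Averaging lam and mc over seeds at each rho gives the two curves the summary describes, with the spread across seeds quantifying variability between random draws.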

