Experiment: Eigenspectrum Dynamics


Category: Machine Learning

Summary: Tracking how the singular-value spectrum of neural-network weight matrices changes during training and whether outlier modes align with learning and generalization transitions.


Random matrices have a characteristic bulk spectrum, while learned structure can appear as outlying directions that separate from that bulk. This experiment asks whether neural-network weight matrices begin near a Marchenko-Pastur-like random spectrum and then develop distinctive outliers, spectral gaps, and rank changes as training progresses.
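As a rough illustration of the bulk-versus-outlier distinction, the sketch below (NumPy only; the helper names are ours, not the script's) computes the Marchenko-Pastur upper edge for a matrix with i.i.d. entries of variance 1/n_rows and counts singular values that escape the bulk. A planted rank-1 direction of sufficient strength produces an outlier, while the purely random matrix stays at or below the edge:

```python
import numpy as np

def mp_bulk_edge(n_rows, n_cols, sigma2=1.0):
    """Upper edge of the Marchenko-Pastur bulk, on the singular-value
    scale, for an n_rows x n_cols matrix with i.i.d. entries of
    variance sigma2 / n_rows."""
    q = n_cols / n_rows
    return np.sqrt(sigma2) * (1.0 + np.sqrt(q))

def count_outliers(W, sigma2=1.0):
    """Count singular values of W lying above the MP bulk edge."""
    n, m = W.shape
    s = np.linalg.svd(W, compute_uv=False)
    return int(np.sum(s > mp_bulk_edge(n, m, sigma2)))

rng = np.random.default_rng(0)
n, m = 200, 100
W = rng.normal(scale=1.0 / np.sqrt(n), size=(n, m))  # pure bulk

# Plant a strong rank-1 "learned direction" on top of the noise.
u = rng.normal(size=(n, 1)); u /= np.linalg.norm(u)
v = rng.normal(size=(1, m)); v /= np.linalg.norm(v)
W_spiked = W + 3.0 * u @ v

print(count_outliers(W), count_outliers(W_spiked))
```

The spike strength matters: below the BBP-type detection threshold a planted direction is swallowed by the bulk, which is exactly the regime distinction this experiment tracks over training.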

The script trains a multilayer perceptron on Gaussian-cluster data and repeatedly computes singular-value and eigenvalue diagnostics for its weight matrices. It records the first emergence of outlier modes, shifts in effective rank and stable rank, and correlations between spectral-gap growth and generalization behavior.
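The outlier-emergence bookkeeping can be sketched as follows. This is a toy stand-in, not the actual trainer: instead of MLP weight snapshots, it uses a random square matrix that gradually acquires a rank-1 component, mimicking a learned direction strengthening over epochs:

```python
import numpy as np

def first_outlier_epoch(snapshots, edge):
    """Index of the first weight snapshot whose top singular value
    exceeds the random-matrix bulk edge, or None if none does."""
    for epoch, W in enumerate(snapshots):
        s_max = np.linalg.svd(W, compute_uv=False)[0]
        if s_max > edge:
            return epoch
    return None

rng = np.random.default_rng(1)
n = 150
X = rng.normal(scale=1.0 / np.sqrt(n), size=(n, n))
u = rng.normal(size=(n, 1)); u /= np.linalg.norm(u)
v = rng.normal(size=(1, n)); v /= np.linalg.norm(v)

# "Training": the rank-1 signal grows linearly across 10 snapshots.
snapshots = [X + 0.3 * t * u @ v for t in range(10)]
edge = 2.0  # MP upper edge for a square matrix at this scaling

print(first_outlier_epoch(snapshots, edge))
```

In the real script the snapshots would be per-epoch copies of each layer's weight matrix, and the recorded epoch is the outlier-emergence diagnostic described above.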

This framing treats training as a spectral phase-transition study. The scientific interest is whether abrupt changes in the weight spectrum provide a cleaner signature of representation learning than accuracy curves alone.

Method: Repeated SVD and spectrum analysis of neural-network weight matrices during MLP training, with comparisons to random-matrix bulk predictions.

What is measured: Outlier-emergence epoch, effective-rank and stable-rank trajectories, spectral gaps, outlier counts, outlier variance ratio, spectral-gap versus generalization correlation, and final test accuracy.
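Two of the listed diagnostics have simple closed forms. A minimal NumPy sketch (function names are ours, not necessarily the script's): stable rank is the squared Frobenius norm over the squared spectral norm, and effective rank is the exponential of the entropy of the normalized singular-value distribution:

```python
import numpy as np

def stable_rank(W):
    """Stable rank: ||W||_F^2 / sigma_max^2."""
    s = np.linalg.svd(W, compute_uv=False)
    return float(np.sum(s ** 2) / s[0] ** 2)

def effective_rank(W, eps=1e-12):
    """Effective rank: exp of the entropy of the normalized
    singular-value distribution (Roy & Vetterli, 2007)."""
    s = np.linalg.svd(W, compute_uv=False)
    p = s / np.sum(s)
    p = p[p > eps]  # drop numerically-zero modes
    return float(np.exp(-np.sum(p * np.log(p))))

I = np.eye(5)                                            # full rank
R1 = np.outer(np.arange(1, 6), np.arange(1, 6)).astype(float)  # rank 1

print(stable_rank(I), effective_rank(I))    # ~5.0, ~5.0
print(stable_rank(R1), effective_rank(R1))  # ~1.0, ~1.0
```

Both measures are differentiable-in-spirit relaxations of matrix rank, so their trajectories can move smoothly even when the integer rank does not change.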


Powered by BOINC
© 2026 Axiom Project