Category: Machine Learning
Summary: Sweeping learning rates to test whether neural network training shows thermodynamic-style phase changes.
This experiment treats neural network optimization as a statistical-physics system with analogs of temperature, specific heat, order parameters, and phase transitions. The open question is whether these thermodynamic summaries reveal genuine regime changes in training dynamics, or are merely colorful metaphors.
Version 2 improves the original setup with a more structured classification task, a learning-rate sweep, and cross-learning-rate analysis. It measures gradient-noise scale as an effective temperature, weight-matrix effective rank as an order parameter, Hessian sharpness as a curvature statistic, and a specific-heat analog via finite differences across learning rates. Binder-cumulant-style checks are then used to search for sharper evidence of transitions.
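The two simplest observables can be sketched in NumPy. The helper names and the exact definitions below are assumptions for illustration (temperature as the trace of the per-sample gradient covariance, effective rank as the exponential of the spectral entropy), not the experiment's exact code:

```python
import numpy as np

def gradient_noise_temperature(per_sample_grads):
    """Effective temperature: trace of the per-sample gradient
    covariance, i.e. mean squared deviation of per-sample gradients
    from the batch-mean gradient. Shape: (n_samples, n_params)."""
    g = np.asarray(per_sample_grads, dtype=float)
    mean_g = g.mean(axis=0)
    return float(((g - mean_g) ** 2).sum(axis=1).mean())

def effective_rank(W):
    """Order parameter: exp of the Shannon entropy of the normalized
    singular-value spectrum of a weight matrix. Equals the matrix
    dimension when all singular values are equal."""
    s = np.linalg.svd(W, compute_uv=False)
    p = s / s.sum()
    p = p[p > 0]
    return float(np.exp(-(p * np.log(p)).sum()))
```

For example, `effective_rank(np.eye(4))` is 4, and identical per-sample gradients give a temperature of exactly zero.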
The value of the experiment is not merely to log training curves, but to test whether multiple physics-inspired observables line up on the same transition region. That makes it a more serious probe of the thermodynamics analogy than a single-network anecdote.
Method: NumPy neural-network training with a learning-rate sweep, Hessian power iteration, rank measurements, and cross-LR transition analysis.
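Hessian power iteration can be done without ever materializing the Hessian, by approximating Hessian-vector products with finite differences of the gradient. The sketch below assumes a `grad_fn(w)` callable; the function name and finite-difference scheme are illustrative assumptions, not the experiment's exact implementation:

```python
import numpy as np

def hessian_top_eigenvalue(grad_fn, w, n_iters=50, eps=1e-4, seed=0):
    """Estimate the largest-magnitude Hessian eigenvalue at w by
    power iteration, using the finite-difference product
    Hv ~ (grad(w + eps*v) - grad(w - eps*v)) / (2*eps)."""
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(w.shape)
    v /= np.linalg.norm(v)
    lam = 0.0
    for _ in range(n_iters):
        hv = (grad_fn(w + eps * v) - grad_fn(w - eps * v)) / (2 * eps)
        lam = float(v @ hv)          # Rayleigh quotient with unit v
        norm = np.linalg.norm(hv)
        if norm == 0.0:
            break
        v = hv / norm
    return lam
```

On a quadratic loss with gradient `A @ w`, the estimate converges to the largest eigenvalue of `A`, which makes for an easy sanity check before using it on a real network.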
What is measured: Gradient-noise temperature, specific heat, effective rank, Hessian top eigenvalue, Binder-style crossing signals, learning-rate sweep summaries, and final accuracy.
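The cross-LR statistics above admit compact definitions. A minimal sketch, assuming the Binder cumulant is taken in its standard fourth-order form and the specific-heat analog is a centered finite difference of mean loss ("energy") with respect to learning rate; the function names are hypothetical:

```python
import numpy as np

def binder_cumulant(samples):
    """Binder-style fourth-order cumulant of an order parameter m:
    U = 1 - <m^4> / (3 <m^2>^2). Curves of U versus learning rate
    crossing at a common point would signal a transition."""
    m = np.asarray(samples, dtype=float)
    m2 = (m ** 2).mean()
    m4 = (m ** 4).mean()
    return float(1.0 - m4 / (3.0 * m2 ** 2))

def specific_heat(lrs, energies):
    """Specific-heat analog: dE/d(lr) by centered finite differences
    across the learning-rate sweep (one-sided at the endpoints)."""
    lrs = np.asarray(lrs, dtype=float)
    E = np.asarray(energies, dtype=float)
    return np.gradient(E, lrs)
```

A constant order parameter gives U = 2/3 and a pure Gaussian gives U near 0, which brackets the usual interpretation of Binder crossings.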
