Experiment: Neural Thermodynamics


Category: Machine Learning

Summary: Testing whether neural-network training trajectories show thermodynamic-style cooling, ordering, and phase transitions.


This experiment asks whether neural network training can be interpreted as a thermodynamic process in a quantitatively useful way. Instead of treating optimization as a purely algorithmic routine, it measures analogs of temperature, entropy, free energy, specific heat, and an order parameter during learning.

The script trains a small neural network in NumPy and records the full trajectory of these observables across epochs. It then checks for abrupt changes in specific heat, entropy, and ordering, and compares early- and late-training regimes to test whether the system appears to cool and organize as optimization progresses.
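A minimal sketch of this kind of instrumented training loop is shown below. The proxy definitions here are illustrative assumptions, not the experiment's actual formulas: gradient variance stands in for temperature, the histogram entropy of the weights for entropy, and the mean hidden-layer weight magnitude for an order parameter.

```python
import numpy as np

# Hypothetical sketch: train a tiny one-hidden-layer network on a toy
# task and record thermodynamic-style proxies each epoch. The proxy
# choices (gradient variance as temperature, histogram entropy of the
# weights, mean |weight| as order parameter) are assumptions.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))
y = (X.sum(axis=1) > 0).astype(float).reshape(-1, 1)

W1 = rng.normal(scale=0.5, size=(4, 8))
W2 = rng.normal(scale=0.5, size=(8, 1))
lr, history = 0.1, []

def weight_entropy(w, bins=30):
    # Shannon entropy of the empirical weight histogram.
    counts, _ = np.histogram(w, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

for epoch in range(50):
    h = np.tanh(X @ W1)
    out = 1.0 / (1.0 + np.exp(-(h @ W2)))
    loss = np.mean((out - y) ** 2)

    # Backpropagation through the two layers.
    d_out = 2 * (out - y) * out * (1 - out) / len(X)
    gW2 = h.T @ d_out
    gW1 = X.T @ ((d_out @ W2.T) * (1 - h ** 2))

    grads = np.concatenate([gW1.ravel(), gW2.ravel()])
    weights = np.concatenate([W1.ravel(), W2.ravel()])
    history.append({
        "epoch": epoch,
        "loss": loss,
        "T": np.var(grads),            # temperature proxy
        "S": weight_entropy(weights),  # weight entropy
        "m": abs(np.mean(W1)),         # crude order parameter
    })

    W1 -= lr * gW1
    W2 -= lr * gW2

print(f"final loss: {history[-1]['loss']:.4f}")
```

With the trajectory in `history`, early- and late-training regimes can be compared directly, for example by averaging the temperature proxy over the first and last ten epochs.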

The result is a compact test of a broader hypothesis from statistical-mechanics-inspired deep learning: that training may pass through distinct phases rather than changing smoothly in every respect. This version focuses on within-training dynamics at a single learning setup, complementing the later v2 learning-rate sweep.

Method: NumPy neural-network training with epoch-by-epoch thermodynamic proxies, transition detection, and summary analysis over the training trajectory.

What is measured: Temperature proxy, weight entropy, free-energy proxy, order parameter, specific-heat peaks, detected transition epochs, kurtosis, and final test accuracy.
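One way the specific-heat peaks and transition epochs above could be detected is sketched below, under assumed definitions: the epoch-wise loss is treated as an energy proxy U, a monotone cooling schedule as T, and the specific heat is approximated by the finite-difference derivative C = dU/dT, with local maxima above a threshold flagged as transitions. The schedule and peak rule are illustrative, not the experiment's actual procedure.

```python
import numpy as np

def detect_transitions(U, T, min_height=0.1):
    """Specific-heat proxy C = dU/dT and its above-threshold local maxima.

    U: energy proxy per epoch (e.g. loss); T: temperature proxy per epoch
    (must be strictly monotone so the finite differences are well-defined).
    """
    C = np.gradient(U, T)  # finite-difference dU/dT
    peaks = [i for i in range(1, len(C) - 1)
             if C[i] > C[i - 1] and C[i] > C[i + 1] and C[i] > min_height]
    return C, peaks

# Synthetic trajectory: a smooth drop in energy around epoch 50 under a
# hypothetical cooling schedule, producing a specific-heat peak there.
epochs = np.arange(100)
T = 1.0 / (1.0 + 0.05 * epochs)                       # cooling schedule
U = 1.0 - 0.8 / (1.0 + np.exp(-(epochs - 50) / 3.0))  # energy step

C, peaks = detect_transitions(U, T)
print("detected transition epochs:", peaks)
```

On real trajectories the loss is noisy, so in practice U would typically be smoothed before differentiating, and a prominence-based peak criterion used instead of a fixed height threshold.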


Powered by BOINC
© 2026 Axiom Project