S-Entropy Connection
Hypothesis
The convergence score \(S\) is related to, but distinct from, Shannon entropy \(H\). Three specific sub-hypotheses: (1) \(S\) and \(H\) are negatively correlated (convergence reduces uncertainty); (2) \(S\) is proportional to \(-dH/dt\), the rate of entropy decrease; (3) the entropy of the interaction matrix itself reveals whether convergence corresponds to specialisation (low entropy) or uniformity (high entropy).
Method
- Run the logistic map \(x_{n+1} = r x_n (1 - x_n)\) at multiple \(r\) values from 2.5 to 4.0. At each value, compute the time series of \(S\) and the Shannon entropy \(H\) of the empirical distribution of iterates (binned into 50 bins over \([0,1]\)).
- Compute Pearson correlation between \(S\) and \(H\) across all \(r\) values.
- Compute \(-dH/dt\) (finite differences) and correlate with \(S\) to test the "S = negative entropy rate" hypothesis.
- Compute the matrix entropy \(H_M = -\sum_{ij} p_{ij} \log p_{ij}\) of the normalised interaction matrix at each step. Correlate \(H_M\) with \(S\).
Results
S vs Shannon Entropy
| \(r\) | Regime | Mean \(S\) | Mean \(H\) |
|---|---|---|---|
| 2.5 | Fixed point | +0.98 | 0.12 |
| 3.2 | Period-2 | +0.61 | 0.69 |
| 3.5 | Period-4 | +0.33 | 1.39 |
| 3.57 | Onset of chaos | +0.04 | 2.81 |
| 3.7 | Chaotic | -0.18 | 3.42 |
| 4.0 | Full chaos | -0.52 | 3.91 |
Pearson correlation: \(r(S, H) = -0.457\). This is a moderate negative relationship: higher-entropy regimes tend to have lower \(S\), but the relationship is not tight.
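As a quick cross-check on the table alone (an inference from the reported means, not a result of the experiment): correlating the six per-regime means gives a far stronger value than the pooled per-step \(-0.457\), suggesting that \(S\) and \(H\) track each other tightly across regimes but fluctuate largely independently within one.

```python
import numpy as np

# Per-regime means of S and H, copied from the table above.
S_mean = np.array([0.98, 0.61, 0.33, 0.04, -0.18, -0.52])
H_mean = np.array([0.12, 0.69, 1.39, 2.81, 3.42, 3.91])

r_means = np.corrcoef(S_mean, H_mean)[0, 1]
print(round(r_means, 2))  # ≈ -0.98 at the regime level, vs -0.457 per step
```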
S vs \(-dH/dt\)
| Comparison | Pearson \(r\) | Interpretation |
|---|---|---|
| \(S\) vs \(H\) | -0.457 | Moderate negative |
| \(S\) vs \(-dH/dt\) | +0.153 | Weak positive |
\(S\) is not simply \(-dH/dt\). The correlation is weak (\(r = +0.153\)), which rejects the hypothesis that \(S\) measures the rate of entropy decrease: \(S\) captures something other than entropy dynamics.
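The finite-difference test can be sketched as follows; the window length and step are illustrative choices, not taken from the experiment:

```python
import numpy as np

def windowed_entropy(xs, window=500, step=100, bins=50):
    """Shannon entropy (nats) of [0, 1]-binned iterates in sliding windows."""
    out = []
    for start in range(0, len(xs) - window + 1, step):
        counts, _ = np.histogram(xs[start:start + window],
                                 bins=bins, range=(0.0, 1.0))
        p = counts / counts.sum()
        p = p[p > 0]
        out.append(-np.sum(p * np.log(p)))
    return np.array(out)

# Logistic iterates at r = 3.7 (chaotic regime).
r, x, xs = 3.7, 0.5, []
for _ in range(20000):
    x = r * x * (1.0 - x)
    xs.append(x)

H_t = windowed_entropy(np.array(xs))
neg_dH_dt = -np.diff(H_t)  # finite-difference estimate of -dH/dt
# With a matched per-window S series: np.corrcoef(S_w[1:], neg_dH_dt)[0, 1]
```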
Matrix Entropy vs S
| \(r\) | Mean \(S\) | Mean \(H_M\) |
|---|---|---|
| 2.5 | +0.98 | 1.38 |
| 3.2 | +0.61 | 1.21 |
| 3.5 | +0.33 | 1.05 |
| 3.57 | +0.04 | 0.82 |
| 3.7 | -0.18 | 0.64 |
| 4.0 | -0.52 | 0.41 |
Pearson correlation: \(r(S, H_M) = +0.611\). The correlation is positive: when \(S\) is high (convergence), the interaction matrix is more uniform (higher entropy); when \(S\) is low (divergence), the matrix is more concentrated (lower entropy, specialised).
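The direction of this correlation is easy to illustrate with toy matrices (these are illustrative, not the experiment's interaction matrices):

```python
import numpy as np

def matrix_entropy(M):
    """H_M = -sum_ij p_ij log p_ij for the normalised magnitude matrix."""
    p = np.abs(M) / np.abs(M).sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

uniform = np.ones((2, 2))               # force spread evenly over channels
concentrated = np.array([[10.0, 0.1],
                         [0.1, 0.1]])   # force piled into one channel

print(matrix_entropy(uniform))        # ln 4 ≈ 1.39 nats, the 2x2 maximum
print(matrix_entropy(concentrated))   # ≈ 0.16 nats, far below the maximum
```

Incidentally, if the experiment's interaction matrix is \(2 \times 2\) (this note does not say), the uniform maximum \(\ln 4 \approx 1.39\) would match the \(H_M\) reported at \(r = 2.5\), where \(S\) is highest.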
Analysis
- S and H are related but distinct. The moderate negative correlation (\(-0.457\)) confirms that \(S\) and Shannon entropy share information about regime type, but they are far from redundant. \(S\) captures force-balance structure that entropy does not.
- S is not entropy rate. The weak correlation with \(-dH/dt\) (\(r = 0.153\)) rules out the tempting identification of \(S\) with negative entropy production. \(S\) is a convergence diagnostic, not a thermodynamic quantity.
- Convergence = uniform interactions. The positive \(S\)-\(H_M\) correlation (\(r = +0.611\)) reveals that convergent dynamics spread energy uniformly across the interaction matrix. Divergent dynamics concentrate energy into a few channels. This provides a structural explanation for why \(S > 0\) signals stability: uniform force distribution is harder to destabilise than concentrated force distribution.
Conclusion
The convergence score \(S\) has a moderate negative correlation with Shannon entropy (\(r = -0.457\)) but is emphatically not the negative entropy rate (\(r = +0.153\)). The most informative relationship is between \(S\) and the interaction-matrix entropy (\(r = +0.611\)): convergence corresponds to a uniform energy distribution across interaction channels, while divergence corresponds to concentrated, specialised interactions. This overturns the naive "convergence = order" intuition and provides a structural explanation for the diagnostic power of \(S\).
Reproducibility
```sh
../simplex/build/sxc exp_s_entropy.sx -o build/exp_s_entropy.ll
OPENSSL_PREFIX=$(brew --prefix openssl)
clang -O2 build/exp_s_entropy.ll \
  ../simplex/runtime/standalone_runtime.c \
  -o build/exp_s_entropy \
  -lm -lssl -lcrypto -L${OPENSSL_PREFIX}/lib
./build/exp_s_entropy
```
Related
- Predictive S — information content of S in chaotic regimes
- S as Adaptive Control Signal — practical use of S dynamics
- Interaction Matrix — structure of the force-balance matrix
- I-Ratio Theorem — \(I = -0.5\) iff equilibrium
- Chaos Boundary — S across the logistic map bifurcation
- Shannon, C. E. (1948) — A mathematical theory of communication