Setup and the Three Conditions
Fix state space \(\mathcal{X} = [-1,1]\), sampling interval \(\Delta > 0\), and the binary coarse-graining \(\Pi(x) = R\) if \(x \ge 0\), \(L\) otherwise. For a trajectory \(x(t)\), define \(A_n = \Pi(x(n\Delta))\). Three conditions on \((x, \Pi, \Delta)\) are necessary for the symbolic sequence to support emergent computation.
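As a concrete sketch (the helper names `coarse_grain` and `symbolic_sequence` are illustrative, not from the text), the sampling-and-labeling pipeline is a few lines:

```python
import math

def coarse_grain(x):
    """The binary partition Pi: symbol R if x >= 0, L otherwise."""
    return "R" if x >= 0 else "L"

def symbolic_sequence(traj, delta, n_steps, t0=0.0):
    """Sample a trajectory at interval delta and coarse-grain each sample:
    A_n = Pi(x(t0 + n*delta))."""
    return [coarse_grain(traj(t0 + n * delta)) for n in range(n_steps)]

# Sampling x(t) = sin(t) every delta = 0.5 gives long runs of each symbol,
# since the sign only changes at multiples of pi.
print("".join(symbolic_sequence(math.sin, 0.5, 12, t0=0.1)))  # → RRRRRRRLLLLL
```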
(i) Decodability. \(H(A_n \mid R_n) \le \varepsilon\). The current symbol must be recoverable from the observer's record. A label that cannot be read is not a computational state.
(ii) Stability. \(\Pr(A_{n+1} \neq A_n) \le \eta\). The symbol must persist across one time step. An alphabet that churns faster than it can be used carries no causal content.
(iii) Lumpability. For every symbol \(i\), all micro-states within \(\Pi^{-1}(i)\) must have nearly the same distribution over the next symbol: \[ \lambda \;:=\; \sup_{\Pi(x)=\Pi(y)=i} \bigl|\mathcal{L}(A_{n+1} \mid X_n{=}x) - \mathcal{L}(A_{n+1} \mid X_n{=}y)\bigr|_{\mathrm{TV}} \;\le\; \lambda_{\max}. \] Knowing the macro-symbol must be enough to predict the next one. If micro-states in the same cell have different futures, the label is not carrying the information that matters.
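The lumpability error can be probed numerically. The sketch below uses illustrative helpers, and the small Gaussian phase noise is an added assumption standing in for micro-level uncertainty, since the bare flow is deterministic; it estimates the total variation distance between the next-symbol distributions of two micro-states in the same cell:

```python
import math, random

def next_symbol_dist(phase, delta, noise=0.05, n_samples=2000,
                     rng=random.Random(0)):
    """Empirical next-symbol distribution from the micro-state sin(phase),
    with small Gaussian phase noise (an assumption added for illustration)."""
    r = 0
    for _ in range(n_samples):
        if math.sin(phase + delta + rng.gauss(0.0, noise)) >= 0:
            r += 1
    p_r = r / n_samples
    return {"R": p_r, "L": 1 - p_r}

def tv(p, q):
    """Total variation distance between two next-symbol distributions."""
    return 0.5 * sum(abs(p[s] - q[s]) for s in p)

# Two micro-states in cell R: one mid-cell, one just short of the boundary at pi.
mid  = next_symbol_dist(math.pi / 2, delta=0.3)
edge = next_symbol_dist(math.pi - 0.1, delta=0.3)
print(tv(mid, edge))  # near 1: same cell, very different futures at this delta
```

The supremum in the definition of \(\lambda\) is over exactly such pairs; a cell is lumpable only when this distance is small for every pair it contains.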
These three conditions are not redundant. Stability says the present symbol is durable; lumpability says it is predictive; decodability says it is legible. When all three hold with parameters \((\varepsilon, \eta, \lambda)\), the symbolic trajectory lies within total variation distance \(N(\lambda + 2p_\varepsilon + \eta)\) of a genuine Markov chain over \(N\) steps — so failure of any condition directly collapses the computation epoch \(\Tc\).
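To make the collapse quantitative: if the observer tolerates a total Markov-approximation error \(\delta\), the bound above caps the number of usable steps and hence the epoch (a direct rearrangement, assuming the stated bound):
\[
N(\lambda + 2p_\varepsilon + \eta) \le \delta
\quad\Longrightarrow\quad
\Tc = N\Delta \le \frac{\delta\,\Delta}{\lambda + 2p_\varepsilon + \eta},
\]
so \(\Tc\) is driven toward zero whenever any one of the three error parameters is of order one.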
The Non-Amenable Case: \(x(t) = \sin(1/t)\)
The argument \(1/t\) compresses the time axis near the origin. Zero crossings occur at \(t = 1/(n\pi)\), spacing \(\pi t_0^2\) apart near \(t_0\) — a gap that shrinks to zero as \(t_0 \to 0\). The number of crossings in any interval of length \(\Delta\) near \(t_0\) is approximately \(\Delta / (\pi t_0^2)\), which diverges.
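The divergence is easy to verify by counting. A minimal sketch (function name illustrative), using the fact that the crossings sit exactly at \(t = 1/(n\pi)\):

```python
import math

def crossings_in_interval(t_lo, t_hi):
    """Count zero crossings of sin(1/t) in (t_lo, t_hi).
    Crossings occur at t = 1/(n*pi), i.e. for integers n with
    1/(pi*t_hi) < n < 1/(pi*t_lo)."""
    n_lo = math.floor(1.0 / (math.pi * t_hi))
    n_hi = math.ceil(1.0 / (math.pi * t_lo)) - 1
    return max(0, n_hi - n_lo)

# Fixed window length delta = 0.01, slid toward the origin:
# the count grows roughly like delta / (pi * t0^2).
delta = 0.01
for t0 in (0.5, 0.1, 0.05, 0.02):
    print(t0, crossings_in_interval(t0, t0 + delta))
```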
Stability ✗
For any fixed \(\Delta > 0\), the flip probability \(\Pr(A_{n+1} \neq A_n) \to 1/2\) as \(t_n \to 0\): once an interval contains many crossings, the parity of the crossing count equidistributes and consecutive symbols become a fair coin. No choice of \(\Delta\) prevents this; the crossing density grows without bound, outpacing any fixed sampling rate.
Lumpability ✗
Near \(t_0 \approx 0\), two micro-states \(x = \sin(1/t_1)\) and \(y = \sin(1/t_2)\) can both sit in cell \(R\) yet have completely different next-symbol distributions — one is about to cross, the other just did. The lumpability error \(\lambda \approx 1\) for any partition boundary, since time-to-next-crossing varies enormously across micro-states within the same cell.
Decodability ✗
With stability and lumpability both failing, consecutive symbols become independent near the origin. The mutual information \(I(R_n; A_{n+1}) \to 0\): the observer's record is faithfully kept but conveys nothing about the future. Decodability collapses as a consequence, not independently.
Critically, this failure is thermodynamically irreparable. Entropy production can in general reduce \(\lambda\) via the dissipation-lumpability inequality \(\lambda \le \lambda_{\mathrm{eq}} - c\sqrt{\sigma}\) — but this requires a finite starting point. Here \(\lambda_{\mathrm{eq}} = 1\) and the root cause is not a blurry boundary but the absence of any dwell time within a cell. No non-equilibrium current can manufacture time the trajectory does not spend in a region. The computation epoch \(\Tc \approx 0\) regardless of power budget.
The Amenable Case: \(x(t) = \sin(t)\)
Replace \(1/t\) with \(t\). Zero crossings now occur at \(t = n\pi\), separated by exactly \(\pi\) — a uniform gap, independent of \(t\). This single change repairs all three conditions.
Stability ✓
For \(\Delta < \pi\), each sampling interval contains at most one crossing. The flip probability is bounded uniformly: \[ \Pr(A_{n+1} \neq A_n) \;\le\; \frac{\Delta}{\pi} \;=:\; \eta(\Delta), \] and \(\eta(\Delta) \to 0\) as \(\Delta \to 0\). Stability is controllable by the observer's choice of sampling interval.
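A quick Monte Carlo check of this bound (illustrative helper; averaging over a uniformly random sampling phase is an added modeling assumption):

```python
import math, random

def flip_probability(delta, n_trials=100_000, rng=random.Random(1)):
    """Empirically estimate Pr(A_{n+1} != A_n) for x(t) = sin(t),
    averaging over a uniformly random sampling phase in one period."""
    flips = 0
    for _ in range(n_trials):
        t = rng.uniform(0.0, 2.0 * math.pi)
        flips += (math.sin(t) >= 0) != (math.sin(t + delta) >= 0)
    return flips / n_trials

# The empirical flip rate tracks delta/pi for delta < pi.
for delta in (0.1, 0.5, 1.0):
    print(delta, flip_probability(delta), delta / math.pi)
```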
Lumpability ✓
The speed \(|\dot{x}| = |\cos t|\) is bounded, so no micro-state within a cell can be dramatically closer to the boundary than another. The spread in escape probabilities across micro-states in \(R\) is at most \(2\Delta/\pi\): \[ \lambda \;\approx\; \frac{2\Delta}{\pi}, \] which vanishes as \(\Delta \to 0\). The macro-symbol becomes increasingly predictive at finer sampling. Compare with \(\sin(1/t)\), where \(|\dot{x}| = |\cos(1/t)|/t^2\) grows without bound near \(t=0\) and \(\lambda \approx 1\) at all resolutions.
Decodability ✓
With stability and lumpability holding, the mutual information satisfies \(I(R_n; A_{n+1}) \ge \log 2 - H(\Delta/\pi)\), which approaches \(\log 2\) for small \(\Delta\). Knowing \(A_n = R\) reliably predicts \(A_{n+1} = R\). The observer's record is informative.
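For a symmetric binary chain with flip probability \(\eta = \Delta/\pi\) and uniform marginals, the bound is an equality, \(I = \log 2 - H(\eta)\), which is easy to tabulate (a sketch under that idealization, with entropies in nats):

```python
import math

def binary_entropy(p):
    """H(p) in nats, with H(0) = H(1) = 0 by continuity."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log(p) - (1 - p) * math.log(1 - p)

def mutual_information(delta):
    """I(A_n; A_{n+1}) for a symmetric binary chain with flip probability
    eta = delta/pi and uniform marginals: log 2 - H(eta)."""
    return math.log(2) - binary_entropy(delta / math.pi)

# The information per symbol approaches log 2 as delta shrinks.
for delta in (1.0, 0.5, 0.1, 0.01):
    print(delta, mutual_information(delta))
```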
The computation epoch is now positive and grows as \(\Delta\) decreases. A small non-equilibrium drive \(\dot{W}\) reduces \(\lambda\) further via the dissipation-lumpability inequality \(\lambda \le \lambda_{\mathrm{eq}} - c\sqrt{\sigma}\), extending \(\Tc\) toward the critical point where \(\lambda = 0\) and the epoch diverges.
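Taking the inequality at face value (with \(c\) a system-dependent constant, not specified here), the critical entropy-production rate at which \(\lambda\) reaches zero follows by rearrangement:
\[
\lambda_{\mathrm{eq}} - c\sqrt{\sigma^{*}} = 0
\quad\Longrightarrow\quad
\sigma^{*} = \Bigl(\frac{\lambda_{\mathrm{eq}}}{c}\Bigr)^{2},
\]
beyond which the Markov-approximation error is governed by \(2p_\varepsilon + \eta\) alone.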
Summary
The two examples are identical in range, smoothness, and coarse-graining. Their difference is purely geometric: whether the zero-crossing density is bounded or accumulates without limit.
| Condition | \(\sin(1/t)\) | \(\sin(t)\) |
|---|---|---|
| Stability | Fails for all \(\Delta\); flip rate → 1/2 (fair coin) as \(t \to 0\) | Holds for \(\Delta < \pi\); \(\eta = \Delta/\pi\) |
| Lumpability | \(\lambda \approx 1\) at all resolutions | \(\lambda \approx 2\Delta/\pi \to 0\) as \(\Delta \to 0\) |
| Decodability | \(I(R_n; A_{n+1}) \to 0\) near origin | \(I(R_n; A_{n+1}) \approx \log 2\) for small \(\Delta\) |
| Repaired by \(\Delta\)? | No — fails at all resolutions near origin | Yes — any \(\Delta < \pi\) suffices |
| Repaired by dissipation? | No — no dwell time to sharpen | Yes — small \(\sigma\) reduces \(\lambda\) further |
Amenability to coarse-graining is not a smoothness property. It is a dwell-time property. Dissipation can sharpen a boundary; it cannot create time where none exists.
References
Shannon 1948. Claude E. Shannon. A Mathematical Theory of Communication. Bell System Technical Journal 27(3), 379-423; 27(4), 623-656. The information-theoretic background for decodability and mutual information.
Lind & Marcus 1995. Douglas Lind and Brian Marcus. An Introduction to Symbolic Dynamics and Coding. Cambridge University Press. Standard background on symbolic alphabets, coding, and symbolic trajectories.
Kemeny & Snell 1960. John G. Kemeny and J. Laurie Snell. Finite Markov Chains. Van Nostrand. Classical source for state aggregation and emergent Markov descriptions.
Buchholz 1994. Peter Buchholz. Exact and ordinary lumpability in finite Markov chains. Journal of Applied Probability 31(1), 59-75. Canonical structural reference for lumpability.
Freidlin & Wentzell 1984. Mark I. Freidlin and Alexander D. Wentzell. Random Perturbations of Dynamical Systems. Springer. Standard reference on metastability, escape, and dwell-time phenomena in noisy dynamics.