The question
Everyone Talks About Oscillations. But What Are They?
The word oscillation is used constantly across neuroscience, physics, engineering, and signal processing—yet each community defines it differently. Classical physicists think of repetitive, periodic motion around equilibrium. Mathematicians insist on limit cycles with stable Floquet multipliers. Neuroscientists see "rhythmic or repetitive neural activity" at all scales. Signal processing people detect narrow-band spectral peaks above the $1/f$ background.
These are all valid perspectives, but they don't talk to each other very well. In our Rosetta Stone paper, we wanted to find a deeper, unifying principle beneath all these definitions. What we arrived at, in Appendix I, surprised us with its simplicity: an oscillation is present when the data are described more compactly by a model built on circular motion than by any model that lacks it.
This is an idea rooted in topology, Koopman operator theory, and algorithmic information theory (AIT). Let us walk you through it.
The landscape
How Different Fields Define Oscillations
The Same Phenomenon, Four Languages
| Field | Typical Definition | Key Nuance |
|---|---|---|
| Classical Physics | "Repetitive or periodic variation about a central value" | Emphasizes small deviations from equilibrium, strictly periodic |
| Nonlinear Dynamics | "A periodic orbit (limit cycle) satisfying $x(t+T) = x(t)$" | Formal, coordinate-free; includes self-sustained oscillators |
| Neuroscience | "Rhythmic or repetitive neural activity at all levels of the CNS" | Multi-scale; amplitude indexes population synchrony |
| Signal Processing | "A narrow-band peak above the aperiodic $1/f$ background" | No explicit periodicity required; robust for noisy data |
Each of these captures something real, but none alone is fully satisfying. The physics definition is too narrow (it excludes noisy brain rhythms). The math definition is too abstract (it says nothing about observation or measurement). The neuroscience definition is too vague. And the spectral definition is purely operational—it detects oscillations without saying what makes them oscillatory in the first place.
We wanted a definition that bridges all four and gets at the root cause: why does an oscillation show up in data at all?
The geometry
It All Comes Down to a Circle
Here is the key geometric insight: topologically, every limit cycle is a circle, $S^1$. Whether you're looking at a Van der Pol oscillator, a FitzHugh–Nagumo relaxation oscillator, or a cortical alpha rhythm, the attractor is a closed loop in phase space—and a closed loop is topologically equivalent to the circle $S^1$, which carries the group structure of $U(1)$.
This is not just an analogy. The circle group $U(1)$ is the group of rotations in the plane, characterized by the single symmetry of phase-shift invariance: if $z(t)$ is a solution, so is $e^{i\phi}\,z(t)$ for any constant phase offset $\phi$. This symmetry is the defining fingerprint of oscillatory dynamics.
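This invariance is easy to check numerically. Below is a minimal sketch (our own illustration, not code from the paper): two forward-Euler trajectories of $\dot z = i\omega z$, one started at $z_0$ and one at $e^{i\phi} z_0$, remain related by the constant factor $e^{i\phi}$ for the entire integration.

```python
import numpy as np

# Minimal numerical check of U(1) phase-shift invariance for the
# linear oscillator dz/dt = i*omega*z, using forward Euler.
omega, dt, steps = 2.0, 1e-3, 5000
phi = 0.7  # arbitrary constant phase offset

def integrate(z0):
    z = z0
    traj = [z]
    for _ in range(steps):
        z = z + dt * (1j * omega * z)
        traj.append(z)
    return np.array(traj)

z_a = integrate(1.0 + 0.0j)        # solution z(t)
z_b = integrate(np.exp(1j * phi))  # solution started at e^{i*phi} * z(0)

# The shifted trajectory is the original rotated by the constant phase.
err = np.max(np.abs(z_b - np.exp(1j * phi) * z_a))
print(err)  # machine-precision small: same solution up to phase
```

Because the dynamics are linear, the two trajectories agree to machine precision after rotating one by $e^{i\phi}$; for a nonlinear limit cycle the same invariance holds along the cycle itself.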
Real oscillations are of course more complicated than pure circular motion: they have amplitude modulation, noise, burst gaps, nonlinear waveform distortions. But all of these can be viewed as corrections to the underlying circular template. And this is exactly what brings us to compression.
The Koopman view
Oscillation as Compression: The Koopman Eigenfunction
The Koopman operator provides a beautiful bridge between nonlinear dynamics and the circle. Given a dynamical system $\dot{\mathbf{x}} = \mathbf{f}(\mathbf{x})$ with flow map $\boldsymbol{\Phi}^t$, the Koopman operator $\mathcal{K}^t$ acts not on states but on observables—functions $g(\mathbf{x})$ of the state. Even though $\mathbf{f}$ may be highly nonlinear, $\mathcal{K}^t$ is always a linear operator:

$$(\mathcal{K}^t g)(\mathbf{x}) = g\big(\boldsymbol{\Phi}^t(\mathbf{x})\big).$$
For a limit cycle, the Koopman operator possesses a special eigenpair $(\psi, \lambda = i\omega)$ where $\psi$ is the Koopman eigenfunction. This eigenfunction is a "magic" coordinate transformation: while the full state $\mathbf{x}(t)$ traces a complicated orbit in $n$-dimensional phase space, the scalar observable $\psi(\mathbf{x}(t))$ simply rotates in the complex plane at constant frequency:

$$\psi(\mathbf{x}(t)) = e^{i\omega t}\,\psi(\mathbf{x}(0)).$$
This is the crucial conceptual leap: finding the Koopman eigenfunction is itself a compression. Once you know $\psi$, you can encode the full trajectory with just an initial phase $\theta_0$, a frequency $\omega$, and a small update map for the amplitude. The geometry of the circle turns nonlinear dynamics into a one-line program: output = r(t) · cos(θ(t)).
The Koopman eigenfunction acts as a "magic" coordinate transformation that replaces complex, nonlinear dynamics with simple rotation on the circle $S^1$. Finding this transformation is itself an act of data compression—and this is what connects dynamical-systems theory to algorithmic information theory.
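As a concrete, idealized illustration of phase plus amplitude as a compressed code (our own example, not from the paper), the analytic signal from the Hilbert transform provides a practical stand-in for the circular coordinate; the 10 Hz amplitude-modulated test signal is an arbitrary choice.

```python
import numpy as np
from scipy.signal import hilbert

# Toy decomposition of an amplitude-modulated rhythm into the circular
# coordinate theta(t) and slow amplitude r(t) via the analytic signal.
# Test signal: 10 Hz carrier with 0.5 Hz amplitude modulation.
fs, f0 = 1000.0, 10.0
t = np.arange(0, 2.0, 1.0 / fs)
x = (1.0 + 0.3 * np.sin(2 * np.pi * 0.5 * t)) * np.cos(2 * np.pi * f0 * t)

z = hilbert(x)              # analytic signal x + i*H[x]
r = np.abs(z)               # instantaneous amplitude r(t)
theta = np.angle(z)         # instantaneous phase theta(t) on S^1
x_hat = r * np.cos(theta)   # the one-line program: r(t) * cos(theta(t))

# Mean rotation rate recovered from the unwrapped phase (Hz).
mean_f = fs * np.mean(np.diff(np.unwrap(theta))) / (2 * np.pi)
print(np.max(np.abs(x - x_hat)), mean_f)  # exact reconstruction; ~10 Hz
```

The reconstruction is exact by construction (the real part of the analytic signal is the data), which is the point: once the circular coordinate is found, the signal is fully encoded by a phase, a rotation rate, and a slowly varying amplitude.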
The definition
An Algorithmic-Information Definition of Oscillation
Algorithmic Information Theory (AIT) measures the information content of individual objects via computation. The Kolmogorov complexity $K(x)$ of a binary string $x$ is the length of the shortest program that outputs $x$ on a universal Turing machine:

$$K(x) = \min\{\, |p| : U(p) = x \,\},$$

where $|p|$ is the length in bits of program $p$ and $U$ is the universal machine.
Now consider a measured signal $x_{1:N}$. We can always encode it with a generic lossless code (LZ77, Huffman) of length $L_{\text{raw}}$. But suppose we can also encode it as: (i) a periodic template $u_{1:T}$ repeated at some fundamental frequency (a $U(1)$ limit-cycle model, costing $K_{\text{LC}}$ bits), plus (ii) a residual code $e$ for the modulation, burst gaps, and noise (costing $K_{\text{noise}}$ bits). If

$$K_{\text{LC}} + K_{\text{noise}} \ll L_{\text{raw}},$$

then the circle-based model is the shorter description, and we say the signal oscillates.
This gives us a precise, model-independent criterion: a signal oscillates when a periodic-template model compresses the data significantly better than any model that lacks periodic structure. The periodic template corresponds to one traversal of the Koopman phase, and the compression gain comes from exploiting the $S^1$ symmetry.
A dataset is said to represent an oscillation when it can be most succinctly Lie-generated from a representation of $U(1)$ (plus noise).
In other words, oscillation = compressibility via the circle. An algorithmic agent would declare: "An oscillation is a detected pattern: a signal that approximately repeats." The generative model is $\text{data} = M(\theta) + \text{noise}$, where $\theta \in S^1$ is the latent circular coordinate.
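Kolmogorov complexity is uncomputable, but an off-the-shelf compressor gives a usable upper bound. The following toy sketch (our construction, with zlib standing in for $K$ and an arbitrary quantization scheme) compares the two code lengths on a noisy periodic signal.

```python
import zlib
import numpy as np

# Toy MDL-style oscillation test: does "periodic template + residual"
# compress a signal better than coding the raw samples directly?
# zlib stands in for the (uncomputable) Kolmogorov complexity K.
rng = np.random.default_rng(0)
N, T = 4000, 100                           # samples, template period
template = np.sin(2 * np.pi * np.arange(T) / T)
x = np.tile(template, N // T) + 0.05 * rng.standard_normal(N)

def nbits(a):
    """Code length in bits of a coarsely quantized array."""
    q = np.clip(np.round(a * 100), -127, 127).astype(np.int8)
    return 8 * len(zlib.compress(q.tobytes(), 9))

L_raw = nbits(x)                           # generic code for the raw signal
u = x.reshape(-1, T).mean(axis=0)          # estimated template u_{1:T}
K_LC = nbits(u)                            # cost of the U(1) template
K_noise = nbits(x - np.tile(u, N // T))    # cost of the residual
print(K_LC + K_noise, "<", L_raw, "->", K_LC + K_noise < L_raw)
```

The template-plus-residual code wins because the residual has much lower entropy than the raw samples; on pure noise the same comparison fails and no oscillation is declared.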
The emergence
Why the Stuart–Landau Equation Is Inevitable
One of the most satisfying results in the appendix is showing that the Stuart–Landau equation isn't just a convenient model—it's topologically inevitable for any system near a Hopf bifurcation.
We start with the harmonic oscillator $\dot{z} = i\omega z$ and add a generic smooth perturbation:

$$\dot{z} = i\omega z + \epsilon\, g(z, \bar{z}),$$

with $g$ an arbitrary smooth function of $z$ and $\bar{z}$.
Using normal-form reduction (center manifold reduction followed by near-identity coordinate transformations), we attempt to eliminate as many nonlinear terms as possible. The key tool is the Lie derivative associated with the circular rotation:

$$\mathcal{L}_\omega = i\omega\left( z\,\partial_z - \bar{z}\,\partial_{\bar{z}} \right).$$

Acting on a monomial term $z^m \bar{z}^n$ in the equation for $\dot{z}$, the associated homological operator has eigenvalue $i\omega(m - n - 1)$; a term can be removed by a coordinate change exactly when this eigenvalue is nonzero.
The terms that resist elimination—the resonant terms—are precisely those whose angular frequency matches the natural rotation frequency $\omega$. Geometrically, they correspond to closed but non-exact 1-forms on $S^1$, the hallmark of nontrivial topology. At third order, the only such term is $|z|^2 z$, which is $U(1)$-covariant. The result is the Stuart–Landau equation:

$$\dot{z} = (\mu + i\omega)\, z - (1 + i\beta)\, |z|^2 z,$$

with the cubic coefficient normalized by rescaling.
This is a deep result: we didn't assume the system was $U(1)$-symmetric. We simply started with a generic perturbation of a linear oscillator and found that the circle's topology forces the Stuart–Landau form. The cohomological structure of $S^1$ determines which nonlinear terms can and cannot be eliminated.
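A quick numerical sanity check of the normal form (our own sketch, with arbitrarily chosen parameters $\mu$, $\omega$, $\beta$): any generic initial condition spirals onto the circle of radius $\sqrt{\mu}$.

```python
import numpy as np

# Forward-Euler integration of the Stuart-Landau normal form
#   dz/dt = (mu + i*omega) z - (1 + i*beta) |z|^2 z
# with arbitrary parameters; the amplitude settles onto the limit
# cycle of radius sqrt(mu), i.e. a circle S^1 in the plane.
mu, omega, beta = 1.0, 3.0, 0.5
dt, steps = 1e-3, 20000
z = 0.1 + 0.05j  # generic initial condition, far from the cycle
for _ in range(steps):
    z = z + dt * ((mu + 1j * omega) * z - (1 + 1j * beta) * abs(z) ** 2 * z)
print(abs(z))  # close to sqrt(mu) = 1.0
```

The cubic term is the only nonlinearity needed: it saturates the linear growth at $|z| = \sqrt{\mu}$, exactly the self-sustained oscillation the topology forces.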
Synthesis
Four Lenses, One Circle
Dynamical Systems Lens
A limit cycle is a closed orbit in phase space—topologically a circle $S^1$. The Koopman eigenfunction maps the full state to uniform rotation on this circle, compressing nonlinear dynamics to phase + amplitude.
Spectral / Signal Lens
An oscillation is a narrow-band peak above the $1/f$ background. This peak is the Fourier signature of the $S^1$ symmetry: a periodic template that compresses the data relative to broadband noise.
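A minimal version of this detector, as our own sketch: it assumes a white-noise background for simplicity rather than fitting the full $1/f$ component, estimates the power spectrum with Welch's method, and compares the peak to the background level.

```python
import numpy as np
from scipy.signal import welch

# Narrow-band peak detection: Welch power spectrum of a 10 Hz sinusoid
# buried in noise. White noise stands in for the aperiodic background
# here; a real pipeline would fit and subtract the 1/f component.
rng = np.random.default_rng(1)
fs, f0 = 500.0, 10.0
t = np.arange(0, 20.0, 1.0 / fs)
x = np.sin(2 * np.pi * f0 * t) + rng.standard_normal(t.size)

f, pxx = welch(x, fs=fs, nperseg=1024)
peak_f = f[np.argmax(pxx)]           # frequency of the spectral peak
ratio = pxx.max() / np.median(pxx)   # peak height over background level
print(peak_f, ratio)  # peak near 10 Hz, ratio well above 1
```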
Information-Theoretic Lens
A signal oscillates when a $U(1)$-based generative model (circle + corrections) achieves shorter description length than any model without periodic structure. Oscillation = compressibility via $S^1$.
Topological Lens
The nontrivial first cohomology $H^1(S^1) \cong \mathbb{R}$ ensures that the Stuart–Landau normal form is universal: the resonant terms are topological invariants that no coordinate change can erase.
These four perspectives are not competing definitions—they are four windows onto the same mathematical object: the circle $S^1$ and its symmetry group $U(1)$. Every brain oscillation, from the fastest gamma ripple to the slowest infra-slow fluctuation, derives its oscillatory character from this same topological core.
Implications
Why This Matters for Neuroscience
A Principled Detection Criterion
The compression definition gives us a model-independent way to ask "does this signal oscillate?" that works equally well for clean sinusoids, noisy cortical rhythms, and bursty intermittent oscillations. If a circle-based model compresses the data, oscillation is present. If it doesn't, it isn't—regardless of what a spectral peak might suggest.
Why All Neural Mass Models Share the Same Core
The topological inevitability of the Stuart–Landau form explains a central theme of the Rosetta Stone paper: all neural mass models, from Kuramoto to NMM2, share a push–pull structure because they are all organized around the same circle topology. The push–pull motif is the $U(1)$ symmetry expressed in excitatory–inhibitory coordinates.
Connections to Information Theory and Consciousness
The compression perspective connects neural oscillations to broader questions in computational neuroscience: oscillatory dynamics create compressible structure in neural data, which may be exploited by the brain itself for efficient coding, prediction, and communication between regions.
An oscillation is not just "a thing that repeats." It is a signal whose structure is best explained by circular motion on $S^1$—the simplest nontrivial topology. This single geometric fact unifies dynamical-systems theory, spectral analysis, Koopman operator theory, and algorithmic information theory into one coherent picture. The circle is not just a convenience; it is the topological reason brain rhythms exist at all.
Reference
Castaldo, F., de Palma Aristides, R., Clusella, P., Garcia-Ojalvo, J., & Ruffini, G. (2025). Rosetta Stone of Neural Mass Models — Appendix I: Oscillations, Topology and Simplicity. arXiv:2512.10982. https://arxiv.org/abs/2512.10982