Foundations & Theory

What Is an Oscillation, Really? Topology, Compression, and the Circle at the Heart of Brain Rhythms

We propose an algorithmic-information definition of oscillation: a signal oscillates when a circle-based model compresses it better than any simpler alternative. Here's the deep reason why.

From Appendix I of Rosetta Stone of Neural Mass Models  |  arXiv:2512.10982
Giulio Ruffini & Francesca Castaldo
BCOM & Neuroelectrics

Everyone Talks About Oscillations. But What Are They?

The word oscillation is used constantly across neuroscience, physics, engineering, and signal processing—yet each community defines it differently. Classical physicists think of repetitive, periodic motion around equilibrium. Mathematicians insist on limit cycles with stable Floquet multipliers. Neuroscientists see "rhythmic or repetitive neural activity" at all scales. Signal processing people detect narrow-band spectral peaks above the $1/f$ background.

These are all valid perspectives, but they don't talk to each other very well. In our Rosetta Stone paper, we wanted to find a deeper, unifying principle beneath all these definitions. What we arrived at, in Appendix I, surprised us with its simplicity: an oscillation is present when the data are described more compactly by a model built on circular motion than by any simpler alternative.

This is an idea rooted in topology, Koopman operator theory, and algorithmic information theory (AIT). Let us walk you through it.

How Different Fields Define Oscillations

The Same Phenomenon, Four Languages

| Field | Typical Definition | Key Nuance |
| --- | --- | --- |
| Classical physics | "Repetitive or periodic variation about a central value" | Emphasizes small deviations from equilibrium; strictly periodic |
| Nonlinear dynamics | "A periodic orbit (limit cycle) satisfying $x(t+T) = x(t)$" | Formal, coordinate-free; includes self-sustained oscillators |
| Neuroscience | "Rhythmic or repetitive neural activity at all levels of the CNS" | Multi-scale; amplitude indexes population synchrony |
| Signal processing | "A narrow-band peak above the aperiodic $1/f$ background" | No explicit periodicity required; robust for noisy data |

Each of these captures something real, but none alone is fully satisfying. The physics definition is too narrow (it excludes noisy brain rhythms). The math definition is too abstract (it says nothing about observation or measurement). The neuroscience definition is too vague. And the spectral definition is purely operational—it detects oscillations without saying what makes them oscillatory in the first place.

We wanted a definition that bridges all four and gets at the root cause: why does an oscillation show up in data at all?

It All Comes Down to a Circle

Here is the key geometric insight: topologically, every limit cycle is a circle, $S^1$. Whether you're looking at a Van der Pol oscillator, a FitzHugh–Nagumo relaxation oscillator, or a cortical alpha rhythm, the attractor is a closed loop in phase space—and a closed loop is topologically equivalent to the circle group $U(1)$.

This is not just an analogy. The circle group $U(1)$ is the group of rotations in the plane, characterized by the single symmetry of phase-shift invariance: if $z(t)$ is a solution, so is $e^{i\phi}\,z(t)$ for any constant phase offset $\phi$. This symmetry is the defining fingerprint of oscillatory dynamics.

The Simplest Oscillator — Pure Rotation on $S^1$
$$\dot{\theta} = \omega \qquad \Longleftrightarrow \qquad \dot{z} = i\omega\, z$$
Phase advances uniformly; the trajectory lives on a circle. In the complex plane, the solution $z(t) = z_0\,e^{i\omega t}$ is just uniform rotation. This is the "Platonic" oscillation—the irreducible core of every brain rhythm.
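This pure rotation is easy to verify numerically. The sketch below (illustrative frequency and initial condition, not from the paper) evaluates the closed-form solution and checks the two defining properties: constant radius and uniformly advancing phase.

```python
import numpy as np

# Closed-form solution of dz/dt = i*omega*z: uniform rotation on S^1.
# omega and z0 are illustrative choices.
omega = 2 * np.pi * 10.0            # e.g. a 10 Hz rhythm
z0 = 1.0 + 0.0j
t = np.linspace(0.0, 1.0, 1000)

z = z0 * np.exp(1j * omega * t)     # z(t) = z0 * e^{i omega t}

# Constant radius: the trajectory never leaves the circle |z| = |z0| ...
assert np.allclose(np.abs(z), np.abs(z0))

# ... and the phase advances uniformly: theta(t) = theta0 + omega * t.
theta = np.unwrap(np.angle(z))
assert np.allclose(np.diff(theta), omega * (t[1] - t[0]))
```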

Real oscillations are of course more complicated than pure circular motion: they have amplitude modulation, noise, burst gaps, and nonlinear waveform distortions. But all of these can be viewed as corrections to the underlying circular template. And this is exactly what brings us to compression.

Oscillation as Compression: The Koopman Eigenfunction

The Koopman operator provides a beautiful bridge between nonlinear dynamics and the circle. Given a dynamical system $\dot{\mathbf{x}} = \mathbf{f}(\mathbf{x})$, the Koopman operator $\mathcal{K}^t$ acts not on states but on observables—functions $g(\mathbf{x})$ of the state. Even though $\mathbf{f}$ may be highly nonlinear, $\mathcal{K}^t$ is always a linear operator:

Koopman Operator
$$(\mathcal{K}^t g)(\mathbf{x}_0) = g(\mathbf{x}(t)), \qquad \text{where } \mathbf{x}(t) = \Phi^t(\mathbf{x}_0)$$
$\mathcal{K}^t$ advances any observable $g$ along the system's trajectories. It is linear and infinite-dimensional, even when the underlying dynamics are nonlinear and finite-dimensional.

For a limit cycle, the Koopman operator possesses a special eigenpair $(\psi, \lambda = i\omega)$ where $\psi$ is the Koopman eigenfunction. This eigenfunction is a "magic" coordinate transformation: while the full state $\mathbf{x}(t)$ traces a complicated orbit in $n$-dimensional phase space, the scalar observable $\psi(\mathbf{x}(t))$ simply rotates in the complex plane at constant frequency:

The Koopman Eigenfunction — Oscillation as Simple Rotation
$$\psi(\mathbf{x}(t)) = e^{i\omega t}\,\psi(\mathbf{x}_0)$$
The phase is $\theta(t) = \arg\psi(\mathbf{x}(t)) = \theta_0 + \omega\,t$ and the amplitude is $r(t) = |\psi(\mathbf{x}(t))|$. Finding $\psi$ compresses the entire $n$-dimensional nonlinear trajectory into a single complex number rotating on $S^1$. The level sets of $\theta = \arg\psi(\mathbf{x})$ are the system's isochrons.

This is the crucial conceptual leap: finding the Koopman eigenfunction is itself a compression. Once you know $\psi$, you can encode the full trajectory with just an initial phase $\theta_0$, a frequency $\omega$, and a small update map for the amplitude. The geometry of the circle turns nonlinear dynamics into a one-line program: output = r(t) · cos(θ(t)).
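For a scalar signal, a standard practical stand-in for the Koopman eigenfunction is the analytic signal $z(t) = x(t) + i\,\mathcal{H}[x](t)$, whose modulus and argument give $r(t)$ and $\theta(t)$. A minimal sketch (the amplitude-modulated test signal and its parameters are hypothetical, not from the paper):

```python
import numpy as np

def analytic_signal(x):
    """FFT-based analytic signal: zero out negative frequencies
    and double the positive ones (what scipy.signal.hilbert computes)."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(X * h)

# Hypothetical amplitude-modulated alpha-like signal (8 Hz carrier).
fs = 1000.0
t = np.arange(0, 2.0, 1.0 / fs)
x = (1.0 + 0.2 * np.sin(2 * np.pi * 0.5 * t)) * np.cos(2 * np.pi * 8.0 * t)

z = analytic_signal(x)
r = np.abs(z)                    # instantaneous amplitude r(t)
theta = np.unwrap(np.angle(z))   # instantaneous phase theta(t)

# The one-line program: r(t) * cos(theta(t)) recovers the signal exactly,
# since the real part of the analytic signal is x itself.
assert np.allclose(x, r * np.cos(theta), atol=1e-9)
```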

Key Insight

The Koopman eigenfunction acts as a "magic" coordinate transformation that replaces complex, nonlinear dynamics with simple rotation on the circle $S^1$. Finding this transformation is itself an act of data compression—and this is what connects dynamical-systems theory to algorithmic information theory.

An Algorithmic-Information Definition of Oscillation

Algorithmic Information Theory (AIT) measures the information content of individual objects via computation. The Kolmogorov complexity $K(x)$ of a binary string $x$ is the length of the shortest program that outputs $x$ on a universal Turing machine:

Kolmogorov Complexity
$$K_U(x) = \min_{p \in \{0,1\}^*} \big\{\,|p|\;:\;U(p) = x\,\big\}$$
The shortest program that makes a fixed universal machine $U$ output $x$ and halt. By the invariance theorem, $K(x)$ depends on the choice of $U$ only up to an additive constant.

Now consider a measured signal $x_{1:N}$. We can always encode it with a generic lossless code (e.g., LZ77 or Huffman coding) of length $L_{\text{raw}}$. But suppose we can also encode it as: (i) a periodic template $u_{1:T}$ with some frequency (a $U(1)$ limit-cycle model, costing $K_{\text{LC}}$ bits), plus (ii) a residual code $e$ for the modulation, burst gaps, and noise (costing $K_{\text{noise}}$ bits). If

The Compression Criterion for Oscillation
$$K_{\text{LC}} + K_{\text{noise}} \;\ll\; L_{\text{raw}}$$
If the circle-based model plus residual is substantially shorter than the unstructured description, the signal contains an oscillation. No reference model is required—the gain is measured against the raw data itself.

This gives us a precise, model-independent criterion: a signal oscillates when a periodic-template model compresses the data significantly better than any model that lacks periodic structure. The periodic template corresponds to one traversal of the Koopman phase, and the compression gain comes from exploiting the $S^1$ symmetry.
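A toy version of this criterion can be built with an off-the-shelf compressor: fit a one-cycle template, compress the residual, and compare against compressing the raw samples directly. This is an illustrative sketch with assumed signal parameters and quantization, not the paper's implementation:

```python
import numpy as np
import zlib

# Noisy 10 Hz sinusoid: rhythmic structure plus broadband noise.
rng = np.random.default_rng(0)
fs, f0, n = 1000, 10, 4000              # sample rate, frequency, length
t = np.arange(n) / fs
x = np.sin(2 * np.pi * f0 * t) + 0.1 * rng.standard_normal(n)

def nbytes(a):
    """Compressed size (bytes) of an int8-quantized array."""
    q = np.clip(np.round(a * 40), -127, 127).astype(np.int8)
    return len(zlib.compress(q.tobytes(), 9))

period = fs // f0                        # 100 samples per cycle
template = x.reshape(-1, period).mean(axis=0)     # averaged cycle u_{1:T}
residual = x - np.tile(template, n // period)     # modulation + noise

L_raw = nbytes(x)           # generic lossless encoding of the raw data
K_LC = nbytes(template)     # cost of the U(1) limit-cycle template
K_noise = nbytes(residual)  # cost of the residual code

# Circle-based code beats the unstructured one: an oscillation is present.
assert K_LC + K_noise < L_raw
```

The residual has much lower entropy per sample than the raw signal, so the template-plus-residual code wins; for white noise alone, the template buys nothing and the inequality fails.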

Formal Definition

A dataset is said to represent an oscillation when it can be most succinctly Lie-generated from a representation of $U(1)$ (plus noise).

In other words, oscillation = compressibility via the circle. An algorithmic agent would declare: "An oscillation is a detected pattern: a signal that approximately repeats." The generative model is $\text{data} = M(\theta) + \text{noise}$, where $\theta \in S^1$ is the latent circular coordinate.

Why the Stuart–Landau Equation Is Inevitable

One of the most satisfying results in the appendix is showing that the Stuart–Landau equation isn't just a convenient model—it's topologically inevitable for any system near a Hopf bifurcation.

We start with the harmonic oscillator $\dot{z} = i\omega z$ and add a generic smooth perturbation:

Perturbed Oscillator
$$\dot{z} = (\mu + i\omega)\,z + F(z, \bar{z})$$
$F$ contains all higher-order nonlinear terms. The question: which terms survive when we systematically simplify via coordinate changes?

Using normal-form reduction (center manifold reduction followed by near-identity coordinate transformations), we attempt to eliminate as many nonlinear terms as possible. The key tool is the Lie derivative associated with the circular rotation:

The Lie Derivative — Generator of Rotations
$$\mathcal{L}_0 = \omega\,\frac{\partial}{\partial\theta}$$
This operator encodes how functions change as we rotate around the limit cycle. To eliminate a nonlinear term $F$, we need to solve $\mathcal{L}_0 H = F - N$ for the coordinate change $H$. Terms in the kernel of $\mathcal{L}_0$ cannot be eliminated.

The terms that resist elimination—the resonant terms—are precisely those whose angular frequency matches the natural rotation frequency $\omega$. Geometrically, they correspond to closed but non-exact 1-forms on $S^1$, the hallmark of nontrivial topology. At third order, the only such term is $|z|^2 z$, which is $U(1)$-covariant. The result is the Stuart–Landau equation:

Stuart–Landau — The Topologically Inevitable Normal Form
$$\dot{z} = (\mu + i\omega)\,z - (g + i\beta)\,|z|^2\,z$$
$U(1)$ symmetry was never assumed—it emerges from the reduction. The circle topology of the limit cycle forces this form. At every odd order, only terms of the form $|z|^{2m}z$ survive, giving the general normal form $\dot{z} = (\mu + i\omega)z + z\,g(|z|^2)$.

This is a deep result: we didn't assume the system was $U(1)$-symmetric. We simply started with a generic perturbation of a linear oscillator and found that the circle's topology forces the Stuart–Landau form. The cohomological structure of $S^1$ determines which nonlinear terms can and cannot be eliminated.
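The normal form is also easy to simulate. A minimal Euler integration (illustrative parameters, not from the paper) shows the amplitude settling onto the limit cycle $r^* = \sqrt{\mu/g}$ regardless of the initial condition:

```python
import numpy as np

# Euler integration of the Stuart-Landau normal form
#   dz/dt = (mu + i*omega) z - (g + i*beta) |z|^2 z.
# For mu, g > 0 the amplitude relaxes to r* = sqrt(mu/g).
mu, omega, g, beta = 1.0, 2 * np.pi, 1.0, 0.0
dt, steps = 1e-4, 100_000           # integrate for 10 time units

z = 0.1 + 0.0j                      # start near the unstable focus z = 0
for _ in range(steps):
    z += dt * ((mu + 1j * omega) * z - (g + 1j * beta) * abs(z) ** 2 * z)

r_star = np.sqrt(mu / g)            # predicted limit-cycle radius
assert abs(abs(z) - r_star) < 1e-2
```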

Cohomological Origin of Resonant Terms
$$H^1(\mathcal{L}_0) = \frac{\ker \mathcal{L}_0}{\operatorname{im} \mathcal{L}_0} \;\cong\; H^1_{\text{dR}}(S^1) \;\cong\; \mathbb{R}$$
The first cohomology group of the circle is one-dimensional. This is why exactly one family of resonant terms survives at each order—the $|z|^{2m}z$ terms. They are genuine topological obstructions that no smooth coordinate change can remove.
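The transformation rule behind this can be checked numerically (a toy illustration, not from the paper): under $z \mapsto e^{i\phi}z$, a monomial $z^a\bar{z}^b$ picks up a factor $e^{i(a-b)\phi}$, so only terms with $a - b = 1$, i.e. $|z|^{2m}z$, transform like $z$ itself and resist elimination.

```python
import numpy as np

# Under z -> e^{i phi} z, the monomial z^a * conj(z)^b acquires the
# factor e^{i(a-b)phi}. Only a - b = 1 terms are U(1)-equivariant.
z, phi = 0.7 + 0.3j, 1.234
w = np.exp(1j * phi) * z

# Resonant cubic term |z|^2 z = z^2 conj(z): a - b = 1, rotates like z.
assert np.isclose(w ** 2 * np.conj(w),
                  np.exp(1j * phi) * (z ** 2 * np.conj(z)))

# Non-resonant term z^2: a - b = 2, rotates twice as fast, so a
# near-identity coordinate change can remove it.
assert np.isclose(w ** 2, np.exp(2j * phi) * z ** 2)
```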

Four Lenses, One Circle

Dynamical Systems Lens

A limit cycle is a closed orbit in phase space—topologically a circle $S^1$. The Koopman eigenfunction maps the full state to uniform rotation on this circle, compressing nonlinear dynamics to phase + amplitude.

Spectral / Signal Lens

An oscillation is a narrow-band peak above the $1/f$ background. This peak is the Fourier signature of the $S^1$ symmetry: a periodic template that compresses the data relative to broadband noise.

Information-Theoretic Lens

A signal oscillates when a $U(1)$-based generative model (circle + corrections) achieves shorter description length than any model without periodic structure. Oscillation = compressibility via $S^1$.

Topological Lens

The nontrivial first cohomology $H^1(S^1) \cong \mathbb{R}$ ensures that the Stuart–Landau normal form is universal: the resonant terms are topological invariants that no coordinate change can erase.

These four perspectives are not competing definitions—they are four windows onto the same mathematical object: the circle $S^1$ and its symmetry group $U(1)$. Every brain oscillation, from the fastest gamma ripple to the slowest infra-slow fluctuation, derives its oscillatory character from this same topological core.

Why This Matters for Neuroscience

A Principled Detection Criterion

The compression definition gives us a model-independent way to ask "does this signal oscillate?" that works equally well for clean sinusoids, noisy cortical rhythms, and bursty intermittent oscillations. If a circle-based model compresses the data, oscillation is present. If it doesn't, it isn't—regardless of what a spectral peak might suggest.

Why All Neural Mass Models Share the Same Core

The topological inevitability of the Stuart–Landau form explains a central theme of the Rosetta Stone paper: all neural mass models, from Kuramoto to NMM2, share a push–pull structure because they are all organized around the same circle topology. The push–pull motif is the $U(1)$ symmetry expressed in excitatory–inhibitory coordinates.

Connections to Information Theory and Consciousness

The compression perspective connects neural oscillations to broader questions in computational neuroscience: oscillatory dynamics create compressible structure in neural data, which may be exploited by the brain itself for efficient coding, prediction, and communication between regions.

The Bottom Line

An oscillation is not just "a thing that repeats." It is a signal whose structure is best explained by circular motion on $S^1$—the simplest nontrivial topology. This single geometric fact unifies dynamical-systems theory, spectral analysis, Koopman operator theory, and algorithmic information theory into one coherent picture. The circle is not just a convenience; it is the topological reason brain rhythms exist at all.

Castaldo, F., de Palma Aristides, R., Clusella, P., Garcia-Ojalvo, J., & Ruffini, G. (2025). Rosetta Stone of Neural Mass Models — Appendix I: Oscillations, Topology and Simplicity. arXiv:2512.10982. https://arxiv.org/abs/2512.10982