Peter Godfrey-Smith, writing in the Institute of Art and Ideas, argues that studies on animal minds suggest consciousness is not computation. His core claim: "the physical make-up of a system" matters to consciousness, so duplicating the abstract computational organization of a brain in silicon wouldn't be enough. Consciousness depends on biology, not on mere computation.
The argument is sympathetically motivated. Anyone who has watched an octopus navigate a maze or a bee communicate a flower's location feels the pull: there is something going on in these biological systems that seems to resist being captured in a flow chart. But the argument contains a hidden assumption that, once exposed, dissolves the entire debate. It assumes we know what "computation" means—and it assumes it means something narrow.
What does "computation" actually mean?
In popular discourse, "computation" conjures images of silicon chips shuffling symbols—an abstract, substrate-independent process that has nothing to do with the wet, warm, physical world. This is the picture Godfrey-Smith is arguing against, and if this were all computation meant, he might have a point.
But mathematically, computation is something much broader and more fundamental. In recent work (Coarse-Grained Computation Between Dynamical Systems, Ruffini 2025), building on the dynamical-systems framework of Wolpert & Korbel (2025), I proposed a precise definition:
A dynamical system A is said to compute another dynamical system B if there exist coarse-grainings of both systems such that their induced (simplified) dynamics are isomorphic—that is, mathematically identical as dynamical systems:
C(A) ≅ C(B)
Informally: a coarse version of A mimics a coarse version of B.
That's it. No silicon required. No symbols. No programming language. Computation is a relation between dynamical systems at a given scale. When a computer simulates the ocean by solving discretized Navier–Stokes equations, it computes the ocean in exactly this sense: the coarse-grained dynamics of the machine's registers are isomorphic to the coarse-grained dynamics of the ocean's temperature, salinity, and velocity fields.
And here is the key insight: this definition is symmetric. If the computer computes the ocean, the ocean equally computes the computer (at the relevant coarse scale). Computation is not something that only silicon does. It is something that all dynamical systems participate in, whenever their coarse-grained behaviors match.
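The relation C(A) ≅ C(B) can be made concrete in a few lines. The sketch below is purely illustrative (the two toy systems, the partition, and the function names are invented for this example, not taken from the cited papers): a four-state system A, lumped into two macrostates, induces dynamics identical to a two-state system B.

```python
# Illustrative sketch of coarse-grained computation between dynamical systems.
# All systems and names here are hypothetical toy examples.

# System A: a deterministic map on 4 microstates.
step_A = {0: 2, 1: 3, 2: 0, 3: 1}

# System B: a deterministic map on 2 states (a simple oscillator).
step_B = {"x": "y", "y": "x"}

# A coarse-graining of A: lump microstates {0,1} and {2,3} into macrostates.
C_A = {0: "p", 1: "p", 2: "q", 3: "q"}

def induced_dynamics(step, coarse):
    """Return the macro-level map induced by `step` under `coarse`, or None
    if the lumping is inconsistent (two microstates in one macrostate
    mapping to different macrostates)."""
    macro = {}
    for s, s_next in step.items():
        m, m_next = coarse[s], coarse[s_next]
        if m in macro and macro[m] != m_next:
            return None  # no well-defined induced dynamics at this scale
        macro[m] = m_next
    return macro

CA = induced_dynamics(step_A, C_A)   # the coarse-grained system C(A)
iso = {"p": "x", "q": "y"}           # candidate isomorphism C(A) -> B

# C(A) is isomorphic to B: iso commutes with the two dynamics.
assert all(iso[CA[m]] == step_B[iso[m]] for m in CA)
print("A computes B (and, by symmetry of isomorphism, B computes A)")
```

Because isomorphism is a symmetric relation, the same check certifies both directions: nothing in the code privileges which system is "the computer."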
The deeper layer: dynamics is computation
The equivalence between dynamics and computation goes even deeper than the coarse-graining definition above. The Church–Turing thesis—the foundational conjecture at the heart of computer science—asserts that any effective procedure can be carried out by a Turing machine. Its physical counterpart, the physical Church–Turing thesis (sometimes framed more strongly as the digital physics hypothesis), goes further: the dynamics of the physical world are themselves computable. If this is correct (and no known physical process has ever produced a counterexample), then the relationship between computation and dynamics is not merely an analogy—it is an identity.
Every physical dynamical system, from a neuron to a galaxy, evolves according to laws that a Turing machine can simulate. Conversely, every Turing machine computation unfolds as the physical dynamics of whatever hardware runs it. Dynamics and computation are two descriptions of the same thing, related by the very isomorphism that the coarse-graining definition makes precise. To say "consciousness is not computation" in this context is to say "consciousness is not physical dynamics"—a statement that no scientist studying brains would endorse.
This is not a philosophical curiosity. It means that the biology-versus-computation framing is not merely a false dichotomy—it is a category error. Biology is a particular kind of computation. Computation is the abstract structure of dynamics. They are not rival explanations. They live at different levels of description of the same underlying reality.
The false dichotomy
Once you see this, the claim "consciousness is not computation" collapses into "consciousness is not dynamics"—which no neuroscientist believes. Every brain is a physical dynamical system. Its neurons fire, its ion channels open and close, its oscillatory rhythms modulate one another. All of this is computation, in the precise mathematical sense: coarse-grained dynamical isomorphism with other systems that model those same processes.
The biology-versus-computation framing thus offers a false choice. Biology is a form of computation. The question was never "is consciousness computational?"—trivially, yes, because everything that evolves in time computes something. The question is: what kind of computation?
Saying "consciousness is not computation" is like saying "the weather is not dynamics." The sentence is grammatically well-formed, but it reveals a misunderstanding of the terms, not a deep truth about nature.
The real question: what kind of computation?
In the Kolmogorov Theory (KT) program, we replace the sterile "is it computation?" debate with a precise, graded question. An algorithmic agent—a system that we might meaningfully associate with experience—is a dynamical system that does three specific things:
First, it runs a compressive, informative world model: an internal process that shares non-trivial mutual algorithmic information with the world. Not a lookup table, not a passive recording—a model that captures the generating structure of its environment, so that it can generalize to new situations. Second, it evaluates states through a non-trivial objective function: it cares about some states more than others. Third, it selects actions via counterfactual planning: it doesn't just react, it simulates what would happen under alternative actions and chooses accordingly.
This is what separates a thermostat from a rock, an E. coli from a raindrop, and a bee from a weathervane. Not "computation versus biology," but the type of computation: compressive modeling, evaluation, and planning.
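The three ingredients above can be sketched in miniature. The following toy agent is an assumption-laden illustration (the world, the class names, and the drift rule are all invented here, not KT code): it holds a drifting one-dimensional state at zero by (1) running a compact model of the world's generating rule, (2) scoring states with an objective, and (3) choosing actions by simulating counterfactuals.

```python
# Minimal illustrative sketch of an algorithmic agent; all names hypothetical.
# World: a 1-D state that drifts by +1 each step, shifted by the agent's action.

def world_step(state, action):
    """True environment dynamics."""
    return state + 1 + action

class Agent:
    def __init__(self):
        # (1) Compressive world model: a copy of the generating rule,
        # far smaller than a lookup table of all trajectories.
        self.model = lambda s, a: s + 1 + a

    def objective(self, state):
        # (2) Non-trivial objective function: prefer states near 0.
        return -abs(state)

    def plan(self, state, actions=(-1, 0, 1)):
        # (3) Counterfactual planning: simulate each action on the model,
        # then pick the action with the best predicted value.
        return max(actions, key=lambda a: self.objective(self.model(state, a)))

agent, s = Agent(), 0
for _ in range(5):
    s = world_step(s, agent.plan(s))
print(s)  # prints 0: the agent cancels the drift it predicts
```

A rock, a raindrop, or a weathervane runs none of these three loops; the point of the sketch is that the difference is a structural property of the dynamics, not the substrate.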
A bee navigating a flower patch runs compressive models of spatial geometry. A cuttlefish camouflaging itself runs compressive models of its visual environment. An octopus unscrewing a jar runs a planning engine. These are not metaphors. They are precise claims about the coarse-grained dynamical structure of these systems, claims that are in principle empirically testable through what we call the compressibility gap: the difference in algorithmic complexity of a system's output with and without the agent coupled to it.
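The compressibility gap can be illustrated with an off-the-shelf compressor standing in for algorithmic complexity (which is uncomputable, so any practical estimate is a proxy). The setup below is a deliberately crude toy, not the KT measurement procedure: an unstructured output stream versus an agent-structured one, compared by compressed size.

```python
# Illustrative proxy for the compressibility gap; zlib stands in for
# algorithmic complexity, and both "outputs" are invented for this sketch.
import random
import zlib

random.seed(0)
n = 4000

# Environment alone: unstructured output, barely compressible.
env_alone = bytes(random.getrandbits(8) for _ in range(n))

# Environment coupled to an agent: the agent imposes regular structure.
env_with_agent = bytes(i % 7 for i in range(n))

gap = len(zlib.compress(env_alone)) - len(zlib.compress(env_with_agent))
print(gap > 0)  # prints True: the agent-coupled output is far more compressible
```

In the real framework the comparison is between the same physical system with and without the agent coupled to it; the toy only shows why compressed size is a workable operational handle on that difference.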
Where Godfrey-Smith goes right
There is something correct in the intuition that biology matters. In the KT framework, the substrate constrains which computations are physically realizable, how fast they run, and how they couple to the environment. A brain implemented in carbon chemistry has different noise profiles, different energy constraints, and different evolutionary affordances than one implemented in silicon. These physical details shape the character of the agent's models, objectives, and plans—and therefore the character of any structured experience that may arise.
But this is a point about the specific type of computation, not about whether computation is involved at all. It is like saying that the weather over Barcelona differs from the weather over Reykjavik because geography matters. True—but no one concludes that Reykjavik's weather is "not dynamics."
The source of the confusion
The confusion is, at root, linguistic. The word "computation" has at least three common meanings in circulation: (1) symbol manipulation by digital machines, the popular picture of silicon chips executing programs; (2) the formal notion of computability, whatever a Turing machine can carry out, as in the Church–Turing thesis; and (3) the relational notion defined above, coarse-grained dynamical isomorphism between systems.
When a philosopher says "consciousness is not computation," they usually mean (1). When a mathematician hears "computation," they mean (2) or (3). The resulting debate generates enormous heat and essentially no light, because the two sides attach different meanings to the same word and end up talking past each other.
The fix is simple: define your terms. Once you do, the debate reframes itself. The question is not whether brains compute—they do, by definition, because they are dynamical systems whose dynamics are computable. The question is what they compute, how compressively, and to what end.
Those are questions worth asking. "Is consciousness computation?" is not.
References
Godfrey-Smith, P. (2026). "Studies on animal minds suggest consciousness is not computation." Institute of Art and Ideas.
Ruffini, G. (2025). "Coarse-Grained Computation Between Dynamical Systems." Working Paper WP0049, Barcelona Computational Foundation.
Ruffini, G. (2025). "What is Computation? II." Working Paper WP0054, Barcelona Computational Foundation.
Ruffini, G. (2025). "Mathematical Foundations of the Algorithmic Agent." Working Paper WP0018, Barcelona Computational Foundation.
Ruffini, G. (2026). "Mathematical Foundations of the Algorithmic Agent (v2)." Working Paper WP0062, Barcelona Computational Foundation.
Wolpert, D. & Korbel, J. (2025). Dynamical-systems framework for physical computation.