IN THIS LESSON
A Machine That Wonders.
To say that the brain computes is no longer revolutionary. What is revolutionary is how it does so, with wet matter, ionic gradients, and the unceasing whispers of electromagnetic pulses. The brain, that curious organ of thought, memory, and the occasional poor decision, is arguably the most sophisticated computational device the universe has ever assembled, so far.
In contrast to silicon-based machines built in climate-controlled laboratories, the biological computer in your skull runs on saltwater, proteins, and entropy. Its processors are cells. Its logic gates are ion channels. Its architecture is messy, asymmetric, and, somehow, remarkably resilient. The brain’s computations do not merely solve mathematical problems. They infer intentions, rewrite memories, plan futures, and, on occasion, forget birthdays.
Yet to understand this computational prowess, we must first wrestle with a subtler, deeper concept: information. Not the sort printed in newspapers or stored on USB drives, but the raw, probabilistic kind, measured, quantified, and prized by physicists and theorists alike. Information, in this sense, is not about meaning. It is about surprise.
The father of this insight was Claude Shannon, whose 1948 paper A Mathematical Theory of Communication established the field of information theory. His genius was to strip meaning from the message and measure instead its unexpectedness. A predictable message, after all, carries little information, while one that upends your expectations is brimming with it. Shannon gave us the “bit”, the binary digit, and showed that information can be measured, compressed, and transmitted only within strict mathematical limits. Later physicists would add a further twist: information also costs energy to process.
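To make “surprise” concrete, here is a minimal sketch in Python (the probabilities are invented for illustration): the surprisal of a single outcome is minus the base-2 logarithm of its probability, and Shannon’s entropy is simply the average surprisal across a whole distribution.

    import math

    def surprisal(p):
        # Information carried by an outcome of probability p, in bits.
        return -math.log2(p)

    def entropy(probs):
        # Average surprisal of a distribution: H = -sum(p * log2(p)).
        return sum(-p * math.log2(p) for p in probs if p > 0)

    print(surprisal(0.5))    # 1.0 bit: the flip of a fair coin
    print(surprisal(0.001))  # ~9.97 bits: a rare event is big news

    print(entropy([0.5, 0.5]))    # 1.0 bit on average: maximally unpredictable
    print(entropy([0.99, 0.01]))  # ~0.08 bits: a nearly foregone conclusion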
That energy cost, curiously, brings us to the Second Law of Thermodynamics.
Enter, stage left: Maxwell’s demon. This hypothetical imp sits astride two chambers of gas and sorts molecules by speed, reducing disorder and thereby, apparently, violating the sacred second law. Or does it? On closer scrutiny, physicists realised the demon must process information to achieve its sorting: it has to measure each molecule, remember the result, and sooner or later erase that record. The erasure increases entropy elsewhere, by at least as much as the sorting removes. The universe wins again.
Here lies the exquisite link: information is not free. Processing it, storing it, forgetting it, these actions are physical, tangible, and energy-dependent. Information, like mass or charge, has a cost.
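How large is that cost? Landauer’s principle (listed in the readings below) sets the floor: erasing a single bit must dissipate at least k_B·T·ln 2 of heat. A back-of-the-envelope sketch in Python, assuming a body temperature of roughly 310 K:

    import math

    k_B = 1.380649e-23   # Boltzmann constant, joules per kelvin
    T_body = 310.0       # approximate body temperature, kelvin

    # Landauer's bound: minimum heat released when one bit is erased.
    landauer_cost = k_B * T_body * math.log(2)
    print(landauer_cost)  # ~3.0e-21 joules per bit

    # Real neural hardware operates many orders of magnitude above this
    # thermodynamic floor, but the floor itself is never zero.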
Now let us return to the brain.
The brain does not just process information. It transforms it. Patterns of ion movement across neuronal membranes, governed by probabilistic channel gating and membrane potentials, are assembled into perceptions, predictions, and philosophies. This transformation of entropy into order is not magic; it is computation. And every time you make a decision, recall a face, or contemplate the future of artificial intelligence, your brain performs a cascade of logical operations that would humble any supercomputer.
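As a deliberately crude illustration of such a cascade, here is a toy sketch in Python in the spirit of a McCulloch–Pitts unit (the weights and thresholds are invented, and real neurons are stochastic, analogue, and far richer): a single “neuron” sums its weighted inputs and fires only when the total crosses a threshold, which is already enough to implement simple logic gates.

    def fires(inputs, weights, threshold):
        # Fire (1) if the weighted sum of inputs reaches the threshold,
        # stay silent (0) otherwise.
        total = sum(x * w for x, w in zip(inputs, weights))
        return 1 if total >= threshold else 0

    # With a high threshold the unit behaves like an AND gate...
    for a in (0, 1):
        for b in (0, 1):
            print(a, b, "->", fires([a, b], [1.0, 1.0], threshold=2.0))

    # ...and with a lower threshold, like an OR gate.
    for a in (0, 1):
        for b in (0, 1):
            print(a, b, "->", fires([a, b], [1.0, 1.0], threshold=1.0))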
It is no coincidence, then, that modern computing owes its conceptual roots to biological inspiration. Alan Turing, who formalised the very notion of computation in the 1930s, asked not just whether machines could compute, but whether they could think. His famous test, the Turing Test, sought not hardware perfection, but behavioural indistinguishability.
But even Turing, brilliant as he was, stood on older shoulders. Muhammad ibn Musa al-Khwarizmi, the polymath of 9th-century Baghdad, laid the foundations of algorithmic thinking centuries earlier. His treatise on al-jabr is the linguistic ancestor of “algebra”, his Latinised name gave us the word “algorithm”, and his systematic, step-by-step procedures are intellectual forerunners of the programming logic we apply today.
From neurons to algorithms, from entropy to intelligence, a thread runs through it all: the idea that thinking, whether human, animal, or artificial, is computational. It is not the cold arithmetic of calculators, but the structured, dynamic modulation of information. It is the fine-tuned balancing act between predictability and surprise.
And so, when we call the brain a “computational machine”, we are not demoting it. We are elevating computation. We are acknowledging that behind every flash of intuition, every burst of creativity, lies a mathematically coherent, physically grounded process of information flow.
But let us be clear: to compute is not to feel. To calculate is not to understand. The mind, emergent from the brain, is more than the sum of its firings and feedback loops. It is where physics meets philosophy, where entropy meets ethics.
That we are capable of building computers in our own image, and now teaching them to mimic us, is a testament not to our machines, but to our biology. The brain is a computer, yes. But it is also a storyteller, a builder of meaning, and, most ironically, the only machine known to ask whether it is a machine at all.
Listen up.
Your brain does not just think, it computes. In this chapter, we uncover how information, entropy, and electromagnetic signalling converge in neurons to process the world. From Shannon’s bits to Maxwell’s paradox and Turing’s logic, discover how biology became computation, and why every thought you have might just be an algorithm in action.
FURTHER READING
Shannon, C.E. (1948)
A Mathematical Theory of Communication – Bell System Technical Journal
The foundational work of modern information theory, introducing the bit, entropy in communication, and the mathematical framing of information.
Koch, C.
Biophysics of Computation: Information Processing in Single Neurons
A rigorous examination of how neurons perform computations, linking electrophysiology with information theory.
Turing, A.M. (1950)
Computing Machinery and Intelligence – Mind
A seminal paper posing the question “Can machines think?” and laying the groundwork for artificial intelligence.
MacKay, D.J.C.
Information Theory, Inference, and Learning Algorithms
An accessible yet mathematically robust text linking information theory with computational learning.
Friston, K. (2010)
The free-energy principle: a unified brain theory? – Nature Reviews Neuroscience
Describes the brain as an inference machine that minimises surprise, an elegant merger of information theory and brain function.
Tononi, G. (2004)
An information integration theory of consciousness – BMC Neuroscience
Proposes that the brain’s computational power is rooted in its capacity to integrate information.
Landauer, R. (1961)
Irreversibility and heat generation in the computing process – IBM Journal of Research and Development
Shows that erasing information generates heat, thereby tying computation to thermodynamics.
Barrett, L.F. & Satpute, A.B. (2013)
Large-scale brain networks in affective and social neuroscience: towards an integrative functional architecture of the brain – Current Opinion in Neurobiology
Explores how distributed neural systems perform integrative computational tasks.
Dennett, D.C.
From Bacteria to Bach and Back: The Evolution of Minds
An accessible philosophical exploration of how computation might underpin consciousness and intelligence.
Dehaene, S.
Consciousness and the Brain: Deciphering How the Brain Codes Our Thoughts
Offers empirical and computational insights into how the brain encodes and processes information.