Quick: what’s 4 + 5? Nine, right? Slightly less quick: what’s five plus four? Still nine, right?
Okay, let’s wait a few seconds. Bear with me. Feel free to have a quick stretch.
Now, without looking, what was the answer to the first question?
It’s still nine, isn’t it?
You’ve just performed a series of advanced brain functions. You did math based on prompts designed to appeal to entirely different parts of your brain, and you recalled that information when queried later. Great job!
This might seem like old hat to most of us, but it’s actually quite an amazing feat of brain power.
And, based on some recent research by a pair of teams from the University of Bonn and the University of Tübingen, these simple processes could indicate that you’re a quantum computer.
Let’s do the math
Your brain probably isn’t wired for numbers. It’s great at math, but numbers are a relatively new concept for humans.
Numbers showed up in human history approximately 6,000 years ago with the Mesopotamians, but our species has been around for about 300,000 years.
Prehistoric humans still had things to count. They didn’t randomly forget how many children they had just because there wasn’t a bespoke language for numerals yet.
Instead, they found other methods for expressing quantities or tracking objects such as holding up their fingers or using representative models.
If you had to keep track of dozens of cave-mates, for example, you might carry a pebble to represent each one. As people trickled in from a hard day of hunting, gathering, and whatnot, you could shift the pebbles from one container to another as an accounting method.
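The pebble scheme is just one-to-one correspondence, and it’s simple enough to sketch. Here’s a minimal, purely illustrative Python version (the container setup and function names are mine, not anything from the research):

```python
# Toy sketch of "pebble accounting": one pebble per cave-mate, moved
# between containers as people leave and return. No numerals needed,
# only a one-to-one match between pebbles and people.

def depart(source, destination, n=1):
    """Move n pebbles from one container to the other."""
    for _ in range(n):
        destination.append(source.pop())

def everyone_home(home, away):
    """An empty 'away' container means everyone is accounted for."""
    return len(away) == 0

home = ["pebble"] * 5   # five cave-mates, five pebbles
away = []

depart(home, away, 3)             # three people head out to hunt
print(everyone_home(home, away))  # False: three pebbles sit in 'away'
depart(away, home, 3)             # they trickle back in
print(everyone_home(home, away))  # True
```

The counter never needs the word "three"; matching pebbles to people does all the work.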
It might seem sub-optimal, but the human brain really doesn’t care whether you use numbers, words, or concepts when it comes to math.
Let’s do the research
The aforementioned research teams recently published a fascinating paper titled “Neuronal codes for arithmetic rule processing in the human brain.”
As the title intimates, the researchers identified an abstract code for processing addition and subtraction inside the human brain. This is significant because we really don’t know how the brain handles math.
You can’t just slap some electrodes on someone’s scalp or stick them in a CAT scan machine to suss out the nature of human calculation.
Math happens at the level of individual neurons inside the human brain. EEG readings and CAT scans can only provide a general picture of all the noise our neurons produce.
And, as there are some 86 billion neurons making noise inside our heads, those kinds of readings aren’t what you’d call an “exact science.”
The Bonn and Tübingen teams got around this problem by conducting their research on volunteers who already had intracranial electrode implants for the treatment of epilepsy.
Nine volunteers met the study’s criteria and, because of the nature of their implants, they were able to provide what might be the world’s first glimpse into how the brain actually handles math.
Per the research paper:
We found abstract and notation-independent codes for addition and subtraction in neuronal populations.
Decoders applied to time-resolved recordings demonstrate a static code in hippocampus based on persistently rule-selective neurons, in contrast to a dynamic code in parahippocampal cortex originating from neurons carrying rapidly changing rule information.
Basically, the researchers saw that different parts of the brain light up when we do addition than when we do subtraction. They also discovered that different parts of the brain approach these tasks with different timing.
It’s a bit complex, but the gist of it is that one part of our brain tries to figure out the problem while another works on a solution.
As the researchers put it:
Neuron recordings in human and nonhuman primates, as well as computational modeling, suggest different cognitive functions for these two codes for working memory: although a dynamic code seems to suffice for short maintenance of more implicit information in memory, the intense mental manipulation of the attended working memory contents may require a static code.
Following this logic, parahippocampal cortex may represent a short-term memory of the arithmetic rule, whereas downstream hippocampus may “do the math” and process numbers according to the arithmetic rule at hand.
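The static/dynamic distinction is easier to see with a toy example. The sketch below is mine, not the paper’s actual decoding analysis: in a static code the same neurons distinguish "add" from "sub" at every moment, so a decoder trained on one time bin still works on a later one; in a dynamic code the informative neurons change from bin to bin, so cross-time decoding falls to chance.

```python
# Toy contrast between a static and a dynamic population code.
# All numbers and population sizes are invented for illustration.
import random
random.seed(0)

N_NEURONS, N_BINS = 8, 4

def static_activity(rule):
    # Neuron 0 is persistently rule-selective; the rest are noise.
    sel = 1.0 if rule == "add" else -1.0
    return [[sel if n == 0 else random.gauss(0, 0.1) for n in range(N_NEURONS)]
            for _ in range(N_BINS)]

def dynamic_activity(rule):
    # A different neuron carries the rule in each time bin.
    sel = 1.0 if rule == "add" else -1.0
    return [[sel if n == t else random.gauss(0, 0.1) for n in range(N_NEURONS)]
            for t in range(N_BINS)]

def decode(population, neuron=0):
    # A "decoder" trained on time bin 0 reads out the sign of neuron 0.
    return "add" if population[neuron] > 0 else "sub"

for make in (static_activity, dynamic_activity):
    trial = make("add")
    print(make.__name__,
          "same-time:", decode(trial[0]),    # training bin: works for both codes
          "cross-time:", decode(trial[3]))   # later bin: only the static code survives
```

For the static population, cross-time decoding still reads "add"; for the dynamic one, neuron 0 carries only noise by the last bin, so the decoder’s answer is a coin flip.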
Let’s take inventory
So far we’ve learned that mental arithmetic appears to involve two kinds of neural code: a static one, carried by persistently rule-selective neurons in the hippocampus, and a dynamic one, carried by rapidly changing rule information in the parahippocampal cortex. And both codes track which arithmetic rule (addition or subtraction) we’re applying at any given moment.
Keeping in mind that there are 86 billion neurons in the human brain, and that something as basic as simple arithmetic appears to be distributed across whole populations of them, it’s obvious there’s something more complex than simple pebble-counting going on.
Per the paper:
Mental calculation is a classic working memory task, and although working memory has traditionally been attributed to the prefrontal cortex, more recent data suggest that the MTL may also be important in working memory tasks and that it is part of a brain-wide network subserving working memory.
Either our brains are working extra-hard to do simple binary mathematics or they’re quantum computing systems doing what they do best: hallucinating answers.
The art of math
Think about an apple. No, not that one. Think about a green apple. How many calculations did it take for you to arrive at a specific apple density and relative size? Did you have to adjust the input variables in order to produce an apple that wasn’t red?
I’m going to go out on a limb and say you didn’t. You just thought about some apples and they happened inside your head. You hallucinated those apples.
Artificial intelligence systems designed to produce original content based on learned styles go through the exact same process.
These AI systems aren’t using advanced math features to psychologically exploit the human propensity for art or imagery. They’re just following some simple rules and swirling data around until they spit out something their creators will reward them for.
That’s kind of how your brain does math. At least according to this new research, anyway. It uses rules to surface the answer that makes the most sense. There’s a part that tries to get the “correct” solution based on things that never change (one plus one always equals two) and another part that tries to guess based on intuition when the answer isn’t something we have memorized.
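As a loose analogy (mine, not a model from the paper), you can picture those two routes as a store of memorized facts plus a fallback that works the answer out when recall fails:

```python
# Two routes to an answer: instant recall of facts that never change,
# and on-the-fly computation for everything else. The fact store below
# is invented for illustration.

MEMORIZED = {("+", 1, 1): 2, ("+", 2, 2): 4, ("+", 4, 5): 9}

def mental_math(op, a, b):
    fact = (op, a, b)
    if fact in MEMORIZED:                     # the "recall" route
        return MEMORIZED[fact], "recalled"
    result = a + b if op == "+" else a - b    # the "work it out" route
    MEMORIZED[fact] = result                  # next time, it's a memory
    return result, "computed"

print(mental_math("+", 4, 5))    # (9, 'recalled')
print(mental_math("+", 17, 26))  # (43, 'computed')
print(mental_math("+", 17, 26))  # (43, 'recalled')
```

The real brain obviously isn’t a dictionary lookup, but the division of labor is the same shape: cached answers when they exist, fresh computation when they don’t.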
And that’s why two humans of relative intelligence and education can perceive the same scene differently when it comes to processing math. Can you guess how many candies are in the jar below?
What does it all mean?
That remains to be seen. The simple fact that scientists were able to observe individual neurons participating in the math process inside human brains is astounding.
But it could take years of further research to understand the ramifications of these findings. First and foremost, we have to ask: is the human brain a quantum computer?
It makes sense, and this research might give us our first actual glimpses at a quantum function inside the human brain. But, as far as we can tell, they were only able to record and process hundreds of neurons at a time. That’s obviously a very tiny drop from a giant bucket of data.
To help with that, the researchers created an artificial intelligence system to interpret the data in a more robust manner. The hope is that continued research will lead to a greater understanding of math processes in the brain.
Per the paper’s conclusion:
More fine-grained analyses, ideally combined with perturbation approaches, will help to decipher the individual roles of brain areas and neuronal codes in mental arithmetic.
Yet there could be implications on a much grander scale. The researchers don’t mention the ramifications for technology in their biology paper, nor do they discuss its results in quantum computing terms.
But, if this research is accurate, Occam’s Razor tells us that the human brain is probably a quantum computer. Either that, or it’s poorly designed.
Just like our prehistoric ancestors would have carved notches on the handles of their tools to keep track of objects, a binary brain should be able to handle counting objects through localized abstraction mechanisms.
Why go through all the trouble of hallucinating an answer across myriad neuronal complexes when individual neurons could just pretend to be ones and zeros like a binary computer?
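For a sense of that trade-off, here’s a quick illustrative comparison (not from the paper) of a pebble-style tally against a positional binary code built from on/off units:

```python
# A tally needs one token per counted object; a positional binary code
# needs only one on/off unit per power of two.

def tally(n):
    return "|" * n       # one mark per object, like one pebble per person

def binary_units(n):
    return bin(n)[2:]    # each digit is one unit pretending to be a bit

for n in (5, 37, 1000):
    print(f"{n}: {len(tally(n))} tally marks vs {len(binary_units(n))} binary units")
```

A thousand objects take a thousand tally marks but only ten binary units, which is exactly why a strictly binary brain counting by "localized abstraction" would look so efficient on paper.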
The answer may lie in the quantum nature of the universe. When you perform a simple math function, such as adding two plus two, your brain may hallucinate all of the possible answers at once while simultaneously working to both remember the answer (you’ve definitely added those numbers before) and to process the data (1+1+1+1).
If the human brain were binary, you’d probably have to wait for it to go through each permutation individually instead of hallucinating them all at once.
The result is that you’re probably answering the question in your head before you can actively recognize that you’re thinking about it because both functions occur simultaneously. That’s called quantum time travel.