In the age of Google, Siri, and Alexa, memory and knowledge can feel like relics of humanity — something AI and machine learning have made obsolete. With the world’s knowledge in the palm of our hands, shouldn’t we skip learning facts in favor of applied skills and creativity?
That may be precisely the wrong lesson to learn. Instead, science is increasingly helping us to appreciate just how vital memory and prior knowledge really are. With memory, we can learn and understand faster, and reason more effectively. Without it, we risk being overwhelmed by information faster than we can comprehend it.
So what is memory?
Memory is the faculty of the brain by which information is encoded, stored, and retrieved. Tennessee Williams put it better, though: “Life is all memory, except for the one present moment that goes by you so quickly you hardly catch it going.” Memory is the fundamental link between the present and the past — without memory there is no learning or progress. Memory is the reason our experiences matter, and the means by which they can affect our future.
Of course, not all memory is equally valuable. The kind of memory that supports learning and creativity is both lasting (the knowledge you’ve learned is rapidly and automatically available when you need it) and transferable (flexible and rich enough to apply to the real world).
We all have examples of this for things we’re passionate about — perhaps it’s the personal experience that makes you so effective at your job, or maybe the knowledge of your favorite sport that brings you a deeper appreciation for the twists and turns of a game.
In contrast, memorization has a bad reputation in education circles, where it’s often associated with cramming, mnemonics, and other misguided (yet common) study habits that lead to brittle, rapidly forgotten memories. In reality, these habits form in reaction to an education system that orients around testing, and memory is caught in the middle of the ongoing debate.
Instead, think of memory as prior knowledge running in the background of your brain as it takes in, processes and stores new information throughout the day. A prior knowledge framework changes how new information is processed, improving reasoning, understanding and retention.
Fluent access to knowledge — more than intellect — is the main factor that distinguishes a talented expert from a layperson, and that helps us comprehend new information. Conversely, a lack of prior knowledge can be debilitating for even the smartest young learners. The effects of prior knowledge are compounding: the knowledge-rich get richer.
How knowledge is built (and memory lost)
Memory, then, is fundamentally important — which makes it so much more interesting that the human brain is built to forget. Without effective memorization, newly learned information disappears rapidly — as much as 86 percent of what we read or listen to is gone from our memory in a matter of days.
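That rapid decay follows a recognizable shape. The sketch below is a minimal illustration of an Ebbinghaus-style exponential forgetting curve; the `stability` parameter is an illustrative assumption chosen so that most of the material is gone within a few days, not a measured constant.

```python
import math

def retention(t_days: float, stability: float = 1.2) -> float:
    """Estimated fraction of newly learned material still recallable
    after t_days, using an exponential forgetting curve.
    The default stability (in days) is an illustrative assumption."""
    return math.exp(-t_days / stability)

# Retention drops steeply over the first few days.
for days in (0, 1, 2, 5):
    print(f"day {days}: {retention(days):.0%} retained")
```

With these assumed parameters, retention falls below 20 percent by day two, which is roughly the scale of loss the studies above describe.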
More than 100 million of us have taken an online course — but even for the 4 percent who complete one, there’s a big difference between earning a certificate and actually building a lasting set of useful knowledge.
This rapid decay isn’t necessarily a bad thing though. Our memory helps us model the world, predict the future, and guide our decision making, so our brains need some way to drop the irrelevant details and retain the most important information. Researchers have even suggested that specific neural systems exist to actively erase old memories.
In fact, forgetting is even crucial to building lasting memories — the strongest memories are built by letting the information fade from memory, and then trying to recall it right as it’s becoming difficult to do so.
And here there is both good and bad news. The good news is that our brains also have the ability to build lasting memory in the right circumstances — cognitive science gives us a roadmap for truly effective learning. The bad news is that the most effective ways to learn — such as self-testing and spacing out learning over time — run counter to our natural intuition. If seeing or reading something once isn’t a reliable way to build lasting memory, why is that the most familiar way to learn?
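The spacing idea above can be sketched as an expanding-interval review schedule: each successful recall earns a longer gap before the next review, and a failed recall resets the gap. This is a minimal sketch loosely inspired by SM-2-style schedulers; the 2.5x multiplier and the one-day reset are illustrative assumptions, not a prescribed algorithm.

```python
from datetime import date, timedelta

def next_interval(prev_interval_days: int, recalled: bool) -> int:
    """Expanding-interval review: grow the gap after each successful
    recall, reset after a failure. The 2.5x multiplier is an
    illustrative assumption (real schedulers tune it per item)."""
    if not recalled:
        return 1  # forgot: start over with a short gap
    return max(1, round(prev_interval_days * 2.5))

# Simulate five successful reviews starting from a one-day gap.
interval, review_day = 1, date.today()
for _ in range(5):
    review_day += timedelta(days=interval)
    print(f"review on {review_day} (gap: {interval} day(s))")
    interval = next_interval(interval, recalled=True)
```

Each review lands right around the point where the material is becoming hard to recall, which is exactly the moment the research says strengthens the memory most.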
This seems to be hardwired into our biology. Evidence from brain imaging studies tells us that our sense of how well we’ve learned something relies on different neural signals from those that actually predict whether we’ll be able to remember it later. This is why the feeling of fluency and understanding as you read a news article or watch a TV show can be deceptive — and why so often we can find ourselves struggling to explain it clearly to someone later.
Life is all memory
Tennessee Williams may have written this over 50 years ago, but it’s never been more true. Just three decades ago 80 percent of the market value of the S&P 500 was held in tangible assets like land or inventory. Today, the reverse is true — 80 percent of the value is held in intangible assets such as patents and software.
The internet has also changed the sheer breadth of what we can choose to learn. Rote-learning dates or multiplication tables may be outdated for many people — but it’s outdated precisely because memory is so vital that what we learn has a huge effect on our future lives.
For the college student developing analytical skills, the self-taught engineer, or the refugee children learning Turkish to help them integrate into school, the value of memory is not diminished in our information-rich age — it’s more important than ever.