Learning myths are everywhere. Take a second to think, and you can probably come up with a few of your own. Or, maybe they’re so pervasive and widely accepted that you don’t even realise they’re myths. For example, the idea that we only use 10% of our brains? Myth. Some people are left-brained, while others are right-brained? Myth. People forget 20% of what they learn after a day and even more after a week? Myth. And these examples are only the tip of the iceberg.
As counterintuitive as it sounds, some learning myths make sense. Our brains are intricate, complex things, and we understand only a fraction of how they operate. So theories and myths take hold to explain what we don't understand.
Still, we think now is the time to set the record straight. Let's dig in and debunk five learning myths.
First up: learning styles. We've already covered this topic in detail in our recent article on whether learning styles are a myth or valid, so we'll keep this one blunt. A variety of studies and other reputable sources are unequivocal: learning styles are a myth.
Anyone who’s grown up believing in learning styles might find this hard to believe. You might even think of yourself as having a particular learning style. So, what does this mean?
According to a 2017 study published by Frontiers in Psychology: “this idea [learning styles] has been repeatedly tested and there is currently no evidence to support it.”
Further, a study of more than 100 higher education providers in the UK found that 90% think learning styles are conceptually flawed. Plus, 67% believe they pigeonhole learners, 61% think they waste resources, and 56% think they create unrealistic expectations.
We hear this concept all the time in popular culture — it has even been the basis for Hollywood movies such as Limitless and Lucy. However, the idea that we only use 10% of our brains is a total myth.
How do we know? Well, brain scans show otherwise. According to Very Well Mind, functional magnetic resonance imaging (fMRI) scans “make it clear that large regions of the brain are at work during all kinds of activity…researchers have not found any region of the brain that does not serve a function. A study of medical myths noted that ‘numerous types of brain imaging studies show that no area of the brain is completely silent or inactive.’”
While it’s a fun plot for a movie to imagine ‘unlocking’ the other 90% of our brains, that’s where the validity of this idea ends — of course we use significantly more than 10% of our brains!
The idea that people are either left-brained or right-brained is exceptionally popular. In a recent survey, 77% of teachers said they believed this myth. Given the evidence below, that figure should be highly concerning.
According to this theory, left-brained people are more logical, while right-brained people are more creative. This idea is so pervasive that you might even think of yourself as being either left or right-brained. Yet, once again, this idea is little more than a myth.
In a 2013 study by the University of Utah, scientists analysed over 1,000 brains and “found no evidence that people preferentially use the left or right hemisphere.”
LearnDash adds to this, explaining, “not only is there no scientific basis for the idea that left-brained people are more analytical while right-brained people are more creative, scientists haven’t even been able to find any evidence that people use one side of their brain more than the other at all.”
Another common learning myth is that Gen Z’s attention span is shrinking. You may have heard that attention spans have shrunk from 12 seconds among Millennials to 8 seconds among Gen Z, on average.
In reality, this is a half-truth at best. According to a study by Altitude, members of Gen Z have developed sophisticated, eight-second ‘content filters’. Since they have grown up with access to so many platforms and so much information, Gen Z has evolved to quickly assess large quantities of information. As such, they excel at judging what is worthy of their attention and what isn’t.
Basically, Gen Z processes information differently due to their relationship with technology, using content filters to sift out irrelevant information and home in on the things that matter to them. Altitude’s study explains that “they’ve grown up in a world where their options are limitless but their time is not. As such, Gen Z have adapted to quickly sorting through and assessing enormous amounts of information.”
To learn more, see our blog on how technology has changed the way we learn.
You may have heard that you only remember 10% of what you read, 20% of what you hear, 30% of what you see, and so on. This theory is commonly known as Dale’s Cone. While it sounds tidy, it is a total myth; in reality, learning is rarely so neat and linear.
According to a study published in Educational Technology, “there is no scientific data — or other data — that supports the claim that people remember some percentage of what they learned.” Moreover, they add that “no body of research — scientific or otherwise — supports any variation of the remembering percentages.”
Not only that, but Dale’s Cone is actually a fundamental misunderstanding of Edgar Dale's original intent. According to PeopleMatters, “[Edgar Dale] hadn’t based his cone on any scientific research, which he explicitly mentioned in his study, besides urging his readers to 'not take it too seriously.'"
Other people later added percentages to Dale’s original cone, based on little to no credible research. PeopleMatters describes these as “randomly assigned percentages to people’s retention abilities.” Thus, what was originally an interesting but unscientific model has taken on a life of its own, leading to a gross and occasionally harmful misreading of Dale’s intent.
Ultimately, it makes sense that Dale’s Cone is a myth, as human learning and memory are so complex and variable as to preclude such neat, linear, round numbers.