We’ve all heard, “Oh, I’m no good at math. I’m right-brained!” Similarly, someone might say, “I can’t learn by listening. I need to see the information.” Or even, “Don’t bother. Old dogs can’t learn new tricks.”
These three statements have one thing in common: they’re all myths that can limit our learning potential if we believe them.
These ideas have been repeated so often they are now accepted as truth. But they’re misconceptions that can hurt our progress, give us a poor perception of our capabilities, and even keep us from learning something in the first place.
Let’s take a moment to dispel the three most popular but false beliefs about how we learn and what it takes to develop a skill.
Myth #1: Everyone Is Either a Left-Brained or Right-Brained Learner
Our first myth is the idea that each brain hemisphere is responsible for specific thinking modes. It’s a common belief that the “left brain” handles logical tasks while the “right brain” is in charge of creativity. That’s not exactly the case.
Though one hemisphere may take priority over the other in certain thinking processes, we use both sides of our brain for almost everything, including learning. That means none of us are “left-brained” or “right-brained,” and we shouldn’t buy into learning “techniques” that target our “dominant” side or block our “non-dominant” one.
Brain-imaging studies and research on brain damage, among other evidence, have shown time and again that both hemispheres are active in nearly every task, so we shouldn’t assume we’re inherently less capable of logic and reasoning, or of creative endeavors, than anyone else.
Myth #2: We Have a Primary Learning Style
Our second myth is the idea that each of us has a primary learning style, and that we learn best when material is presented in alignment with it.
Many theories have stemmed from this concept. A well-known example is the “VAK/VARK learning styles” theory, which categorizes learners as visual, auditory, reading/writing, or kinesthetic. Another is the Honey-Mumford model, which divides learners into activists, reflectors, theorists, and pragmatists.
While all these theories propose different “styles,” they share the idea that we learn best if we study based on our dominant one, a premise unsupported by research. These theories come from observation and “experience” in classrooms, not from rigorous testing. There’s no evidence that we learn better if new material is presented in what we think is our style of learning.
What’s true is that we do have preferences in how we learn, but those preferences don’t determine how well we learn. Other factors, such as the subject we are studying, how we perceive ourselves and our capabilities, our prior knowledge, and our ability to extract underlying principles from the material, play a far more important role.
Myth #3: Old Dogs Can’t Learn New Tricks
Our third myth is a classic—and just as wrong as the first two.
Years ago, it was thought that the brain was flexible during our developmental years (childhood and early teens) and mostly rigid throughout adulthood; in other words, that we were better learners early in life. However, while our brain’s flexibility does decrease with age, it keeps its ability to learn and rewire itself throughout our entire lives.
What does get worse with age is how good we are at rote learning, repeating information over and over until we commit it to memory. But rote learning is inefficient for memorizing in the first place—there are far better methods available—so we shouldn’t be too concerned with our diminished ability to rote learn later in life.
Despite these findings, it can still seem like young people learn faster. But the explanation lies in psychology and behavior rather than in biological differences tied to age. For most adults, learning comes behind work, family, finances, and other responsibilities. For many young people, by contrast, learning is their main, if not only, responsibility, and they spend most of their time learning, both at home and in school. That extra attention accounts for a great part of what looks like a superior ability to learn.
None of this is to say that age doesn’t play any part in learning. It does. Depending on how far we want to take certain skills, we’d better start early, when our bodies and minds are more adaptable. Someone wanting to be a top ballet dancer, for instance, benefits from starting as a kid. They can still learn the skill at any age, even become great at it within the limits of their physicality, but they shouldn’t expect to grace the stage of the Bolshoi Theater if they started late in life.
Don’t Let Myths Limit Your Potential
To learn throughout our lives and develop skills, whatever they may be, we shouldn’t let myths limit our potential. Believing them produces an adverse placebo effect, also known as the “nocebo effect,” that hurts our confidence and progress for as long as we hold on to them.
Instead, we should know that our ability to learn is limited mainly by the time and effort we put into it. Even people with natural advantages have to work hard to become great. In other words, masters are made, not born, and by opening our minds to our full potential, we can achieve mastery ourselves.
For more advice on maximizing your learning capacity, you can find Learn, Improve, Master on Amazon.