October 26, 2023 | 5 mins to read

5 of the most common myths about learning, debunked

Learning myths are pervasive and harmful, so in this blog we're debunking 5 of the most common misunderstandings about the way people learn.
Alex Mullen
Web Content Writer

Myths about learning are almost as commonplace as truths about learning - and seemingly even more pervasive.

Once a catchy idea takes hold of the public consciousness, it spreads like wildfire throughout online conversation, news publications and dinner parties - until almost everyone you know believes that the brain stops developing at the age of 25.

How do these myths get started in the first place? Why are we so quick to believe them? In this blog, we’ll be debunking five of the most common myths about learning.

As the world's fastest-growing LMS, we're very interested in the truth, so let's take a look at what's fact and what's fiction.

Learning styles


For whatever reason, this one seems to be a particularly sticky theory; a 2020 study from Philip M. Newton and Atharva Salvi found that 89.1% of surveyed educators believed in matching instruction to learning styles.

Growing up, you were most likely exposed to the theory that there are anywhere from three to seven distinct learning styles, and that in order to learn effectively, you need to know which one is yours.

Visual? Kinesthetic? Auditory? Reading and writing? Conventional wisdom would have you believe that you fit into one of these four categories, and that knowing yours is the key to learning - and retaining - new information.

In their 2008 paper, researchers Harold Pashler, Mark McDaniel, Doug Rohrer and Robert A. Bjork concluded:

“If classification of students’ learning styles has practical utility, it remains to be demonstrated.”


As this paper and several others have found, there is no robust scientific evidence that matching instruction to a person's supposed learning style improves learning. But why let a pesky little thing like scientific evidence get in the way of a fun theory that helps people categorise themselves? (Looking at you, Myers-Briggs.)

In all seriousness, why has this theory proved so persistent?

It might be because it’s based on the fundamental truth that there is no single “one way” to learn. Everybody takes in information differently depending on a variety of factors. Cognitive strengths and weaknesses, prior knowledge, cultural or socioeconomic background, and metacognition - the ability to monitor and regulate one's own learning - are all part of a complex web that contributes to how someone learns.

Where it falls apart is the claim that, in order to learn effectively, you need to be taught in your specific "style."

The 10,000-hour rule


Originating with Swedish researcher Dr. Anders Ericsson and later popularised by 2000s pop-psychology poster boy Malcolm Gladwell in his book Outliers, the 10,000-hour rule posits that it takes roughly 10,000 hours of practice to become an expert in something. (Gladwell uses The Beatles, Bill Gates and Robert Oppenheimer as his prime examples.)

Surprise, surprise, there’s a theme emerging here: the original research by Dr. Ericsson has been taken out of context, gradually snowballing into a totally different idea. In his 1993 paper, Ericsson studied violinists at a Berlin music academy and found that the best of them had each accumulated roughly ten years - or 10,000 hours - of practice to reach their level of expertise. He specifically cited one-on-one instruction, adjusted according to what each student was excelling in or struggling with, as a reason why they had reached expert level.

This clarification has been missing from the discourse around the "10,000-hour rule", which has come to mean that anyone can persist at anything and become good at it - as long as they put in the requisite 10,000 hours.

Since its 2008 publication, Outliers’ recounting of the "10,000-hour rule" has regularly been challenged and debunked. Dr. Ericsson himself commented that, beyond the need for dedicated one-on-one instruction, the figure of 10,000 hours isn’t a magic threshold - just an average that the best violinists in his study had reached by the age of 20.

We can see why the idea has persisted. Who wouldn’t want to live in a world where all it takes to write songs like McCartney, program computers like Gates or - erm - build atomic bombs like Oppenheimer is time and the will to use it?

Left-brain vs. right-brain learning


Most people are probably aware at this point that ‘left vs. right brain thinking’ is unfounded, but let’s go over it again: The theory goes that the human brain is split into two distinct hemispheres, each with their own set of functions, strengths and weaknesses.

While the right side is creative, emotional and thoughtful, the left is responsible for analytical and logical thinking. The theory then asserts that each individual is governed by their more dominant hemisphere - e.g. "left-brained people" are more logical and "right-brained people" are more creative. This then dictates how we learn, think and process new information.

So, how did this theory originally come about?

It seems to start with Nobel Prize-winning neuropsychologist Roger W. Sperry.

Sperry studied brain function in patients whose corpus callosum - the structure connecting the two hemispheres of the brain - had been surgically severed to treat refractory epilepsy. He found that certain cognitive functions tended to be handled more by one hemisphere than the other.

From here, his findings have been contorted into the much simpler idea that different abilities or personality traits are governed by different hemispheres of the brain.

Again, the most likely reason this theory has spread so effectively is, well... it's fun. It’s fun to imagine that our brains are neatly split into "creative" and "analytical" sections, and it gives us a sense of individuality to think that we are all governed by our dominant side. (Again - we love to classify ourselves: Myers-Briggs, learning styles, Hogwarts houses... whatever it might be, the thought that our sense of self is easily categorised is an appealing one.)

Learning stops after a certain age


You have probably heard the not-so-fun "fact" that adult brains stop developing after the age of 25.

Mercifully, the idea that learning stops after a certain age has been disproven.

It is of course true that the first part of most people’s lives - childhood, adolescence and early adulthood - is when the bulk of their formal education takes place. As we age and enter the workplace, it’s often assumed that the "learning" part of our lives is over.

However, as a Learning Platform provider for business, we’re obviously pleased to say this isn’t true.

Although adulthood generally presents fewer formal education opportunities and cognitive functions can decline with age, the capacity to learn is still very much intact. Neuroplasticity - the brain’s ability to change and adapt with time - is a key factor in adult learning.

People remember 10% of what they read, 20% of what they see, and 30% of what they hear


This one is a learning-industry Bigfoot: nobody really knows where it came from, though some believe it to be a butchering of Dale’s Cone of Experience.

The statistic goes that people only remember 10% of what they read, 20% of what they see, and 30% of what they hear.

But, as veteran workplace learning expert Will Thalheimer outlined (gulp, seventeen years ago now...), this theory is baseless. He explained in his analysis that the percentages are not based on scientific evidence, and have been continually repurposed and twisted to serve different agendas over the years. The comparison between the different learning methods doesn't hold up, and falls apart under even the slightest scrutiny. In his words:

“It's not fair to compare these different methods by using the same test, because the choice of test will bias the outcome toward the learning situation that is most like the test situation.”

We hope you’ve enjoyed our debunking of five common learning myths.

There's as much misinformation as there is information on the topic of learning, and experimentation forms the basis of finding out the truth.

If you’re looking to learn more about the value of experimentation, sign up for Helen Marshall and Ian Blackburn’s webinar Experiment. Experiment. Experiment this afternoon at 2pm.
