Why the idea of “learning styles” is basically bunk

I saw a wildly entertaining but painfully stupid movie recently, Luc Besson’s LUCY.  The premise of the film is that human beings only use 10% of their brain, and that a person who could unlock the additional 90% would be capable of astonishing, almost supernatural, feats.  Enter Scarlett Johansson’s Lucy, who, through a process I won’t bother getting into here, unlocks more and more of her brain, becoming more and more powerful, until she reaches 100% at the film’s climax.  By the end of the film she has achieved complete mastery over time and space; her consciousness transcends physical matter, allowing her a kind of omnipresence which is manifested, among other ways, through text messages on cellular phones.  (Yes, the movie is about as stupid as it sounds.)

This being a Luc Besson film, there are also lots of gun battles and car chases through the streets of Paris, and at least one grizzled French cop with a heart of gold who tries to help Lucy battle the Taiwanese gangsters who, naturally, want to hunt her down and kill her.  The film also features Morgan Freeman in his standard herald role, this time as a professor of neurology whose job is to expound endless lines of pseudo-science justifying the movie’s ludicrous premise.  (“She has now unlocked 27% of her brain’s capacity, which is why she can read minds.” etc.)

Just in case you didn’t know, let’s make it clear: the idea that we only use 10% of our brains is a myth.  A complete, total, utter, 100% myth.  If you want to learn more about the myth itself and its origins, read this recent article from WIRED magazine.

I have often thought that the 10% myth has had resonance over the years because it gives people a kind of hope for future improvement. The fact that we actually use 100% of our brains already is, frankly, a bit depressing for those of us who aren’t artistic, musical, or scientific geniuses.  But the LUCY film made me think of another myth that continues to resonate in the popular imagination, and that is the myth of learning styles.

The myth, essentially, goes something like this: Everyone has his or her own “learning style”, which is the way that he or she learns best.  Some people learn best by reading, some by writing, some by listening, some by touching, some by tasting, and some by smelling.  (Okay, I’ll admit I added tasting and smelling, but it is the logical conclusion of the premise.  And if you think it’s absurd to hear educated people talk about “smelling numbers”, you have never had the misfortune of attending a learning styles conference.)

It should be self-evident that the style of learning that is most effective will vary not with the individual but with the task.  Math is best learned by solving math problems.  Engine repair is best learned by holding tools and working on an actual engine.  Charles Dickens’ A TALE OF TWO CITIES is best learned by reading it.  Political geography is best learned by looking at a map.  Baseball is best learned by picking up a bat and swinging it (or putting on a glove and throwing the ball).  You get the idea.

The problem with the idea of learning styles is not only that it contradicts the real-life experience of anyone who has ever tried to learn anything, but that there is absolutely no evidence it’s true; the best summation of the hypothesis and its poverty of empirical support is this recent piece from Scientific American, which I urge you to read.

Of course, a complete and total lack of evidence doesn’t necessarily stop an idea from gaining prominence and influence in education circles; just ask the advocates of “multiple intelligences” theory, the close cousin of “learning styles”, and one just as poorly supported by the evidence.