Neuromyths and the disconnect between science and the public

When the movie Lucy was released in the summer of 2014, it was quickly followed by a flurry of attention surrounding the idea that we use only 10% of our brains. According to this notion, around 90% of our neurons lie dormant, teasing us with the reminder that we have achieved only a small fraction of our human potential. In the movie, Scarlett Johansson plays a woman who takes an experimental new drug that makes her capable of using 100% of her brain. Due to this sudden enhancement in brain utilization, she develops unprecedented cognitive abilities, as well as extrasensory capabilities like telepathy and telekinesis. Thus, the plot of the movie hinges on the accuracy of the 10% of the brain idea. Unfortunately, it is an idea that has at this point been thoroughly debunked. In truth, essentially all of the brain is active fairly constantly. Although some areas may be more active at times than others, no area ever goes completely dark.

Despite the scientific community's broad agreement that we use our whole brain, the 10% of the brain myth is still accepted as true by a significant proportion of the public (which might explain why Lucy was able to rake in close to $460 million worldwide). In fact, a recent survey of educators in five countries--the United Kingdom, the Netherlands, Turkey, Greece, and China--found that the percentage of teachers who believe in the 10% of the brain myth ranges from a low of 43% in Greece to a high of 59% in China.

The same survey identified a number of other inaccurate beliefs about the brain held by educators--beliefs that have come to be categorized as neuromyths. For example, 44-62% of teachers believed that consuming sugary drinks and snacks would be likely to affect their students' attention levels. The idea that sugar intake can decrease children's attention and increase their hyperactivity has been around since the 1970s. The first formal study to identify a relationship between sugar and hyperactivity was published in 1980, but it was an observational study, and thus unable to establish a causal relationship. Since then, more than a dozen placebo-controlled studies have been conducted, and none has supported a link between sugar and hyperactivity. In fact, some studies found sugar to be associated with decreased activity. Yet, if you spend an afternoon at a toddler's birthday party, your chances of overhearing a parent attributing erratic toddler behavior to cake are around 93% (okay, that's a made-up statistic, but the chances are high).

According to the survey, a large percentage (ranging from 71% in China to 91% in the U.K.) of teachers in the countries mentioned above also believe that hemispheric dominance is an important factor in determining individual differences in learning styles. In other words, they believe that some people think more with their "left brain" and others more with their "right brain," and that this lateralization of brain function is reflected in personalities and learning styles. Hemispheric dominance has been an oft-discussed concept in neuroscience since the 1860s, when Paul Broca identified a dominant role for the left hemisphere (in most individuals) in language. Since the middle of the twentieth century, however, the concept of hemispheric dominance has been extrapolated to a number of functions other than language. Many now associate the right side of the brain with creativity and intuitive thinking, and the left side with analytical and logical thought. By extension, creative people are sometimes said to be more "right-brained," while more analytical people are said to be "left-brained."

These broad characterizations of individuals as using one side of the brain more than the other, however, are not supported by research. Although certain behaviors do seem to rely more on one hemisphere than the other--language and the left hemisphere being the best example--the brain generally functions as a whole, and individuals do not preferentially use one hemisphere based on their personalities. Still, this myth has become so pervasive that the recommendation that children be identified as right- or left-brained--and teaching approaches be modified accordingly--has even made its way into educational texts.

Where do neuromyths come from?

Neuromyths are generally neither created nor spread with malicious intent. Although there are instances where inaccurate neuroscience information is used by entrepreneurs hoping to convince the public of the viability of a dubious product, neuromyths usually arise out of genuine scientific confusion. For example, the misconception that sugar causes hyperactivity in children was bolstered by a study that did detect such an effect. The scientific status of the hypothesis was then forced to remain in limbo for a decade until more studies could be conducted. Those subsequent, better-designed studies failed to find a relationship, but by that time the myth had taken on a life of its own. The ease with which parents could mistakenly attribute poor behavior to a recent sugary treat helped to sustain and propagate the inaccuracy.

So, some neuromyths are born from a published scientific finding that identifies a potential relationship, and they grow simply because verifying such a finding takes time--time during which faulty information can spread. Many myths, however, also arise from inherent biases--both personal and cultural--or from the misinterpretation of data. At times, the sheer complexity of the field of neuroscience may be a contributing factor, as it may cause people to seek out overly simplistic explanations of how the brain works. These oversimplifications can be alluring because they are easy to understand, even if they are not fully accurate. For example, the explanation of depression--a disorder with a complex and still incompletely understood etiology--as attributable to an imbalance of a single neurotransmitter (i.e. serotonin) was popular in scientific and public discourse for decades before being recognized as too simplistic.

After the scientific community has untangled any confusion that may have led to the creation of a neuromyth, it still takes quite a long time for the myth to die out. In part, this is because there is a disconnect between the information that is readily available to the public and that which is accessible to the scientific community. Most of the general public do not read academic journals. So, if several studies that serve to debunk a neuromyth are published over the course of a few years, the public may be unaware of these recent developments until the findings are communicated to them in some other form (e.g. a website or magazine that has articles on popular science topics).

Eventually this knowledge does find its way into the public discourse; it's just a question of when. The 10% of the brain myth, for example, has probably persisted for at least a century, but by 2014 enough of the non-scientific community was aware of the inaccuracies in the plot of Lucy to raise something of an uproar about them. There are other neuromyths, however, that have only recently become part of public knowledge, and we are likely to see them come up again and again over the next several years before their inaccuracy is widely appreciated.

The myth of three: a current neuromyth

One example of a (relatively) recently espoused neuromyth is sometimes referred to as the "myth of three." The underlying idea is that there is a critical period between birth and three years of age during which most of the major events of early brain development occur. This is a time of extensive synaptogenesis--the formation of new connections between neurons. According to the myth of three, if the external environment during this time isn't conducive to learning or healthy brain development, the effects can range from a missed--and potentially lost--opportunity for cognitive growth to irreversible damage. Conversely, by enriching a child's environment during these first three years, you can increase the chances he or she will grow up to be the next Doogie Howser, MD. In other words, ages 0 to 3 represent the most important years of a child's learning life and essentially determine the course of his or her future cognitive maturation. Hillary Clinton aptly summarized the myth of three while speaking to a group of educators in 1997: "It is clear that by the time most children start preschool, the architecture of the brain has essentially been constructed."

There are several problems with the myth of three. The first is the assumption that ages 0-3 constitute the most critical learning period of our lifetime. This assumption rests primarily on evidence of high levels of synaptogenesis during this time, but it is not clear that this provides convincing support for the unrivaled importance of ages 0-3 as a learning period. Synaptic density does reach a peak during this time, but it then remains high until around puberty, and some skills that we begin to learn between ages 0 and 3 continue to be improved and refined for years afterward. In fact, with certain skills (e.g. problem solving, working memory) it seems we are unable to attain true intellectual maturity until we have spent years developing them. Thus, it is questionable whether high levels of synaptogenesis are adequate evidence that ages 0-3 form the most critical window for learning in our lives.

It is also not clear that a disruption in learning during the "critical period" of ages 0-3 would have widespread effects on brain function. Critical or sensitive periods have been identified for certain brain functions, and some--but not all--of these functions rely on external stimulation for normal development. For example, there are aspects of vision that depend on stimulation from the external environment to develop adequately. Indeed, for a complex ability like vision there are thought to be different critical periods for different specific functions, such as binocular vision or visual acuity. Some of these critical periods do occur between birth and age three. However, some extend well past age 3 and do not correlate well with the period of high synaptogenesis thought to be so important by those who originally advocated the myth of three. Because these critical periods differ significantly by function, it is inaccurate to refer to ages 0-3 as a critical period for brain development in general. Deprivation of stimulation during this age range can thus affect certain functions, depending on the timing and severity of the deprivation, but the widespread cognitive limitations implied by the myth of three do not seem likely to occur. Additionally, much of the data used to support the assertion of a critical period from 0-3 years involves severe sensory deprivation rather than missed learning opportunities, even though the myth of three has more frequently been used to warn us of the ramifications of the latter.

Furthermore, a dearth of stimulation or lack of learning during a critical or sensitive period doesn't always translate into an irreparable deficit. For example, if a baby is not exposed to a language before 6 months of age, she will have a harder time distinguishing the subtle speech sounds that make up that language. However, the fact that many adults are able to acquire a second language without having been exposed to it before 6 months of age suggests this early deprivation doesn't translate into an inability to ever learn the language; it just makes learning it more difficult. So, even when environmental conditions aren't conducive to the development of a specific function, it seems the brain is often still capable of rescuing that function if learning is resumed later in life.

Additionally, while improving the environment of a child growing up in severe deprivation is clearly beneficial, it is not clear how much further enriching the environment of a child already growing up in good conditions will hasten brain development. Yet this principle forms the cornerstone of a multimillion-dollar industry--one that markets to parents ways they can supposedly raise their infants' IQs, such as Baby Mozart CDs, with the hope that exposure to the music of Mozart will in some subliminal way lay the foundation for more rapid intellectual growth. Of course, there are worse consequences of misunderstood science than parents becoming more invested in their children's intellectual development. But sometimes that investment comes with a good dose of anxiety, high expectations, and wasted resources that could have been put to better use.

The myth of three, however, was not started by companies marketing goods to produce baby geniuses; such marketing only helped to propagate it. The myth of three is a good example of how legitimate scientific confusion can engender a neuromyth. Our understanding of early brain development, sensitive periods, and the best ages for educational interventions is still evolving, and the details continue to be debated. And because early interventions do seem to benefit children raised in impoverished conditions, there was a plausible reason to expect that environmental enrichment could augment the development of already healthy children. The fact that the myth is partially based in truth, and that some of the answers are not yet clear, makes it likely that this will be an especially persistent belief.

Where do we go from here?

As can be seen from the proliferation of the myth of three, awareness of the potential for neuromyths to develop is not enough to stop them from doing so. So how can we at least reduce the impact of these inaccurate beliefs? One way is by improving the neuroscientific literacy of the public, and a step toward accomplishing this involves increasing the neuroscientific knowledge of our educators. A new field is emerging to address the disconnect between recent neuroscience research and the information possessed by educators, although it is so new that it is still awaiting a name (neuroeducation is one possibility).

Hopefully, as the field of neuroscience grows, exposure to and understanding of neuroscientific topics among the public will increase. This may make it more difficult for ideas like the 10% of the brain myth to maintain a foothold. It may be impossible to eradicate neuromyths entirely, as they are often based on legitimate scientific findings that are later shown to be specious (and we would need scientific perfection to ensure that false leads are never created). However, awareness of how easily erroneous beliefs can spread, along with an increased understanding of how the brain works, may serve to decrease the prevalence of neuromyths.

Bruer, J.T. (1998). The brain and child development: time for some critical thinking. Public Health Reports, 113(5), 388-397.

Howard-Jones, P. (2014). Neuroscience and education: myths and messages. Nature Reviews Neuroscience, 15(12), 817-824. DOI: 10.1038/nrn3817
