The other region was Wernicke’s area, identified by Carl Wernicke, in the temporal lobe region. This was credited with language comprehension. When we understand words, their meanings and numerous interpretations, this was the doing of Wernicke’s area. A two-component set-up would be a surprisingly straightforward arrangement for the brain, and indeed the brain’s language system is actually considerably more complex. But, for decades, Broca’s and Wernicke’s areas were credited with speech processing.
To understand why, consider that these areas were identified in the nineteenth century, via studies of people who had suffered damage localized to these brain regions. Without modern technology such as scanners and computers, aspiring neuroscientists were reduced to studying unfortunate individuals with just the right sort of head injury. Not the most efficient method, but at least they weren’t inflicting these injuries on people themselves (as far as we know).
Broca’s and Wernicke’s areas were identified because damage to them caused aphasias, which are profound disruptions to speech and understanding. Broca’s aphasia, aka expressive aphasia, means someone cannot “produce” language. There’s nothing wrong with their mouth or tongue, they can still understand speech, they just can’t produce any fluid, coherent communication of their own. They may be able to utter a few relevant words, but long complex sentences are practically impossible.
Interestingly, this aphasia is often evident in both speech and writing. This is important. Speech is aural and conveyed via the mouth; writing is visual and uses hands and fingers, but for both to be equally impaired means a common element is disrupted. That element can only be the language processing itself, which must therefore be handled separately by the brain.
Wernicke’s aphasia is essentially the opposite problem. Those afflicted don’t seem able to comprehend language. They can apparently recognize tone, inflection, timing and so on but the words themselves are meaningless. And they respond similarly, with long, complex-sounding sentences, but instead of “I went to the store, bought some bread,” it’s “I wendle to the do the store tore todayhayhay boughtage soughtage some read bread breed”; a combination of real and made-up words strung together with no recognizable linguistic meaning, because the brain is damaged in such a way that it cannot recognize language, so also can’t produce it.
This aphasia also often applies to written language, and the sufferers are generally unable to recognize any problem with their speech. They think they are speaking normally, which obviously leads to serious frustration.
These aphasias led to the theories about the importance of Broca’s and Wernicke’s areas for language and speech. However, brain-scanning technology has changed matters. Broca’s area, a frontal lobe region, is still important for processing syntax and other crucial structural details, which makes sense; manipulating complex information in real-time describes much frontal lobe activity. Wernicke’s area, however, has been effectively demoted due to data that shows the involvement of much wider areas of the temporal lobe around it in processing speech.2
Areas such as the superior temporal gyrus, inferior frontal gyrus, middle temporal gyrus and “deeper” areas of the brain including the putamen are all strongly implicated in speech processing, handling elements such as syntax, the semantic meaning of words, associated terms in memory, and so on. Many of these are near the auditory cortex, which processes how things sound, which makes sense (for once). Wernicke’s and Broca’s areas may not be as integral for language as first assumed, but they’re still involved. Damage to them still disrupts the many connections between language-processing regions, hence aphasias. But the fact that language-processing centers are spread so widely throughout the brain shows language to be a fundamental function of the brain, rather than something we pick up from our surroundings.
Some argue that language is even more neurologically important. The theory of linguistic relativity claims that the language a person speaks underlies their cognitive processing and ability to perceive the world.3 For instance, if people were raised to speak a language that had no words for “reliable,” then they would be unable to understand or demonstrate reliability, and thus be forced to find work as a real estate agent.
This is an obviously extreme example, and it’s hard to study because you’d need to find a culture that uses a language with some important concepts missing. (There have been numerous studies of more isolated cultures whose languages have smaller ranges of labels for colors, arguing that their members are less able to perceive familiar colors, but these findings are debatable.4) Still, there are many theories about linguistic relativity, the most famous of which is the Sapir–Whorf hypothesis.*
Some go further, claiming that changing the language someone uses can change how they think. The most prominent example of this is neuro-linguistic programming (NLP). NLP is a mishmash of psychotherapy, personal development and other behavioral approaches, and the basic premise is that language, behavior and neurological processes are all intertwined. By altering someone’s specific use and experience of language, their thinking and behavior can be changed (hopefully for the better), like someone editing the code of a computer program to remove bugs and glitches.
Despite its popularity and appeal, there’s little evidence to suggest that NLP actually works, putting it in the realms of pseudoscience and alternative medicine. This book is filled with examples of how the human brain does its own thing despite everything the modern world can throw at it, so it’s hardly going to fall in line when faced with a carefully chosen turn of phrase.
However, NLP does often state that the non-verbal component of communication is very important, which is true. And non-verbal communication manifests in many different ways.
In Oliver Sacks’s seminal 1985 book The Man Who Mistook His Wife for a Hat,5 he describes a group of aphasia patients who cannot understand spoken language, who are watching a speech by the president and finding it hilarious, which is clearly not the intent. The explanation is that the patients, robbed of their understanding of words, have become adept at recognizing non-verbal cues and signs that most people overlook, being distracted by the actual words. The president, to them, is constantly revealing that he is being dishonest via facial tics, body language, rhythm of speech, elaborate gestures and so on. These things, to an aphasia patient, are big red flags of dishonesty. When coming from the most powerful man in the world, it’s either laugh or cry.