For the record, scientists don’t have some sort of bizarre grudge against wine tasters; it’s just that there aren’t many professions that rely on a well-developed sense of taste to such an extent. And it’s not that wine tasters are lying: they are almost certainly experiencing the tastes they claim to, but these are mostly the result of expectation, experience and the brain having to get creative, not the actual taste buds. Wine tasters may still object to this constant undermining of their discipline by neuroscientists.
The fact is that tasting something is, in many cases, a multisensory experience. People with nasty colds or other nose-clogging maladies often complain about being unable to taste food. The senses that determine taste intermingle a great deal and confuse the brain, and taste, weak as it is, is constantly being influenced by the others, chief among them, you’ve guessed it, smell. Much of what we taste is derived from the smell of what we’re eating. There have been experiments where subjects, with their noses plugged and wearing blindfolds (to rule out vision’s influence, too), were unable to tell apples, potatoes and onions apart when they had to rely on taste alone.4
A 2007 paper by Malika Auvray and Charles Spence5 revealed that if something has a powerful smell while we’re eating it the brain tends to interpret that as a taste, rather than an odor, even if it’s the nose relaying the signals. The majority of the sensations are in the mouth, so the brain overgeneralizes and assumes that’s where everything is coming from and interprets signals accordingly. But the brain already has to do a lot of the work in generating taste sensations, so it would be churlish to begrudge it making inaccurate assumptions.
The take-home message from all of this is that if you’re a bad cook, you can still get away with dinner parties if your guests are suffering from terrible head colds and willing to sit in the dark.
Come on, feel the noise
(How hearing and touch are actually related)
Hearing and touch are linked at a fundamental level. This is something most people don’t know, but think about it: have you ever noticed how incredibly enjoyable it can be to clean out your ear with a cotton swab? Yes? Well, that’s nothing to do with this, I’m just establishing the principle. But the truth is, the brain may perceive touch and hearing completely differently, but the mechanisms it uses to perceive them at all have a surprising amount of overlap.
In the previous section, we looked at smell and taste, and how they often overlap. Admittedly, they do often have similar roles regarding recognizing foodstuffs, and can influence each other (smell predominantly influencing taste), but the main connection is that smell and taste are both chemical senses. The receptors for taste and smell are triggered in the presence of specific chemical substances, like fruit juice or gummy bears.
By contrast, what do touch and hearing have in common? When was the last time you thought something sounded sticky? Or “felt” high-pitched? Never, right?
Actually, wrong. Fans of the louder types of music often enjoy it at a very tactile level. Consider the sound systems you get in clubs, cars, concerts and so forth that amplify the bass element of music so much that it makes your fillings rattle. When it’s powerful enough or of a certain pitch, sound often seems to have a very “physical” presence.
Hearing and touch are both classed as mechanical senses, meaning they are activated by pressure or physical force. This might seem weird, given that hearing is clearly based on sound, but sound is actually vibrations in the air that travel to our eardrum and cause it to vibrate in turn. These vibrations are then transmitted to the cochlea, a spiral-shaped fluid-filled structure, and thus sound travels into our heads. The cochlea is quite ingenious, because it’s basically a long, curled-up, fluid-filled tube. Sound travels along it, but the exact layout of the cochlea and the physics of soundwaves mean the frequency of the sound (measured in hertz, Hz) dictates how far along the tube the vibrations travel. Lining this tube is the organ of Corti. It’s more of a layer than a separate self-contained structure, and the organ itself is covered with hair cells, which aren’t actually hairs, but receptors, because sometimes scientists don’t think things are confusing enough on their own.
These hair cells detect the vibrations in the cochlea, and fire off signals in response. But only the hair cells in certain parts of the cochlea are activated, because specific frequencies travel only so far along it. This means that there is essentially a frequency “map” of the cochlea, with the regions at the very start of the cochlea being stimulated by higher-frequency soundwaves (meaning high-pitched noises, like an excited toddler inhaling helium) whereas the very “end” of the cochlea is activated by the lowest-frequency soundwaves (very deep noises, like a whale singing Barry White songs). The areas between these extremes of the cochlea respond to the rest of the spectrum of sounds audible to humans (between 20 Hz and 20,000 Hz).
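As a rough illustration (my own toy sketch, not anything from the research described here), the frequency “map” can be thought of as a function from pitch to position along the cochlea. The simple logarithmic layout below is an assumption for illustration; real cochleae follow more complicated curves, such as the Greenwood function:

```python
import math

def cochlea_position(freq_hz, f_min=20.0, f_max=20000.0):
    """Toy model of the cochlea's tonotopic map.

    Maps a frequency in the human audible range (20 Hz to 20,000 Hz)
    to a fractional position along the cochlea:
      0.0 = the start (base), stimulated by the highest frequencies
      1.0 = the end (apex), stimulated by the lowest frequencies

    Assumes a simple logarithmic layout, which is only an approximation.
    """
    if not (f_min <= freq_hz <= f_max):
        raise ValueError("frequency outside the audible range")
    # Fraction of the way up the audible range, on a log scale (0 = lowest pitch)
    log_fraction = (math.log(freq_hz) - math.log(f_min)) / (
        math.log(f_max) - math.log(f_min)
    )
    # Invert: high pitches land near the start, low pitches near the end
    return 1.0 - log_fraction

# A helium-voiced toddler (20,000 Hz) maps to the start; a whale-deep
# rumble (20 Hz) maps to the far end; speech-range sounds fall in between.
```

The log scale is the one deliberate design choice here: pitch perception is closer to logarithmic than linear, which is why each octave (a doubling of frequency) occupies a roughly similar stretch of the map.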
The cochlea is innervated by the eighth cranial nerve, named the vestibulocochlear nerve. This relays signals from the hair cells in the cochlea to the auditory cortex, in the upper region of the temporal lobe, which is responsible for processing sound perception. And the specific part of the cochlea the signals come from tells the brain what frequency the sound is, so we end up perceiving it as such, hence the cochlea “map.” Quite clever really.
The trouble is, a system like this, involving a very delicate and precise sensory mechanism essentially being shaken constantly, is obviously going to be a bit fragile. The eardrum itself is a thin membrane, and behind it sit three tiny bones (the ossicles) arranged in a specific configuration, all of which can often be damaged or disrupted by fluid, ear wax, trauma, you name it. The ageing process also means the tissues in the ear get more rigid, restricting vibrations, and no vibrations means no auditory perception. It would be reasonable to say that the gradual age-related decline of the hearing system has as much to do with physics as biology.
Hearing also has a wide selection of errors and hiccups, such as tinnitus, that cause us to perceive sounds that aren’t there. These occurrences are known as endaural phenomena: sounds that have no external source, caused by disorders of the hearing system (for example, wax getting into important areas, or excessive hardening of important membranes). These are distinct from auditory hallucinations, which are more the result of activity in the “higher” regions of the brain where sound information is processed, rather than where it originates. The most familiar is the sensation of “hearing voices” (discussed in the later section on psychosis), but other manifestations include musical ear syndrome, where sufferers hear inexplicable music, and exploding head syndrome, where sufferers hear sudden loud bangs or booms, which is one from the category “conditions that sound far worse than they actually are.”