Basically, people rely on other people as a source of information and support for their own views/beliefs/sense of self-worth, and Chapter 7 on social psychology will go into this in more detail. But, for now, it seems the more confident a person is, the more convincing they are and the more others tend to believe the claims they make. This has been demonstrated in a number of studies, including those conducted in the 1990s by Penrod and Cutler, who focused on courtroom settings. These studies looked at the degree to which jurors were convinced by witness testimonies and found that jurors were far more likely to favor witnesses who came across as confident and assured than those who seemed nervous, hesitant or unsure of the details of their claim. This was obviously a worrying finding; if the content of a testimony matters less in determining a verdict than the manner in which it is delivered, that could have serious ramifications for the justice system. And there’s nothing to say it’s limited to a courtroom setting; who’s to say politics isn’t similarly influenced?
Modern politicians are media-trained so they can speak confidently and smoothly on any subject for prolonged periods without saying anything of value. Or worse, they say something downright stupid like, “They misunderestimated me” (George W. Bush), or, “Most of our imports come from overseas” (George W. Bush again). You’d assume that the smartest people would end up running things; the smarter a person is, the better job they’d be able to do. But, counterintuitive as it may seem, the smarter a person is, the less confident they tend to be in their views, and the less confident someone comes across, the less they’re trusted. Democracy, everyone.
Intelligent sorts may be less confident because there is often a general hostility toward those of the intellectual persuasion. I’m a neuroscientist by training, but I don’t tell people this unless directly asked, because I once got the response, “Oh, think you’re clever, do you?”
Do other people get this? If you tell someone you’re an Olympic sprinter, does anyone ever say, “Oh, think you’re fast, do you?” This seems unlikely. But, regardless, I still end up saying things like, “I’m a neuroscientist, but it’s not as impressive as it sounds.” There are countless social and cultural reasons for anti-intellectualism, but one possibility is that it’s a manifestation of the brain’s egocentric or “self-serving” bias and its tendency to fear things. People care about their social standing and well-being, and someone seeming more intelligent than them can be perceived as a threat. People who are physically bigger and stronger can certainly be intimidating, but their strength is a known quantity. A physically fit person is easy to understand; they just go to the gym more, or have been doing their chosen sport for far longer, right? That’s how muscles and such work. Anyone could end up like them if they did what they did, given the time or inclination.
But someone who is more intelligent than you presents an unknown quantity, and as such they could behave in ways that you can’t predict or understand. This means the brain cannot work out whether they present a danger or not, and in this situation the old “better safe than sorry” instinct is activated, triggering suspicion and hostility. It’s true that a person could also learn and study to become more intelligent, but this is far more complex and uncertain than physical improvement. Lifting weights gives you strong arms, but the connection between learning and intelligence is far more diffuse.
The phenomenon of less-intelligent people being more confident has an actual scientific name: the Dunning–Kruger effect. It is named for David Dunning and Justin Kruger of Cornell University, the researchers who first looked into the phenomenon, inspired by reports of a criminal who held up banks after covering his face with lemon juice. Lemon juice can be used as invisible ink, so he was convinced his face wouldn’t show up on camera.5
Just let that sink in for a moment.
Dunning and Kruger got subjects to complete a number of tests, but also asked them to estimate how well they thought they had done on those tests. This produced a remarkable pattern: those who performed badly on the tests almost always assumed they’d done much, much better, whereas those who did well invariably assumed they’d done worse. Dunning and Kruger argued that those who perform poorly not only lack the intellectual abilities, they also lack the ability to recognize that they are bad at something. The brain’s egocentric tendencies kick in again, suppressing things that might lead to a negative opinion of oneself. But also, recognizing your own limitations and the superior abilities of others is something that itself requires intelligence. Hence you get people passionately arguing with others about subjects they have no direct experience of, even if the other person has studied the subject all their life. Our brain has only our own experiences to go on, and our baseline assumption is that everyone is like us. So if we’re an idiot . . .
The argument is that an unintelligent person actually cannot “perceive” what it is to be considerably more intelligent. It’s basically like asking someone who is red–green color-blind to describe a red and green pattern.
It may be that an intelligent person has a similar take on the world, but expressed in different ways. If an intelligent person finds something easy, they may assume everyone else finds it easy too. They assume their level of competence is the norm, so they assume their intelligence is the norm (and intelligent people tend to find themselves in jobs and social situations where they’re surrounded by other similar types, so they are likely to have a lot of evidence to support this).
But if intelligent people are generally used to learning new things and acquiring new information, they’re more likely to be aware that they don’t know everything, and of just how much there is to know about any given subject, which would undercut their confidence when making claims and statements.
For example, in science, you (ideally) have to be painstakingly thorough with your data and research before making any claims as to how something works. Another consequence of surrounding yourself with similarly intelligent people is that if you do make a mistake or a grandiose claim, they’re more likely to spot it and call you on it. A logical consequence of this would be a keen awareness of the things you don’t know or aren’t sure about, which is often a handicap in a debate or an argument.