Idiot Brain - What Your Head Is Really Up To

Intrinsic motivations are where we’re driven to do things because of decisions or desires that we come up with ourselves. We decide, based on what we’ve experienced and learned, that helping sick people is a noble and rewarding thing to do, so we’re motivated to study medicine and become a doctor. This is an intrinsic motivation. If we are motivated to study medicine because people pay doctors a lot of money, this is more an extrinsic motivation.

Intrinsic and extrinsic motivations exist in a delicate balance, not only with each other, but within themselves as well. In 1985, Deci and Ryan came up with the self-determination theory, which describes what motivates people in the absence of any external influence, so is 100 percent intrinsic.16 It argues that people are motivated to achieve autonomy (to control things), competency (to be good at things) and relatedness (to be recognized for what they do). All of this explains why micromanagers are so infuriating; someone hovering over your shoulder telling you precisely how to do the simplest task robs you of all control, undermines any notion of competence, and is often impossible to relate to, given how sociopathic most micromanagers seem (if you're at the mercy of one).

In 1973, Lepper, Greene and Nisbett pointed out the over-justification effect.17 Groups of children were given colorful art supplies to play with. Some were told they’d be rewarded for using them; others were left to their own devices. A week later, the children who weren’t rewarded were far more motivated to use the art supplies again. Those who decided on their own that the creative activity was enjoyable and satisfying experienced greater motivation than those who received rewards from other people.

It seems that if we associate a positive outcome with our own actions, this carries more weight than if the positive outcome came from someone else. Who’s to say they’ll reward us next time? As a result, motivation is diminished.

The obvious conclusion is that rewarding people for a task can actually reduce their motivation for doing it, whereas giving them more control or authority increases motivation. This idea has been picked up (with great enthusiasm) by the business world, largely because it lends scientific credibility to the notion that it’s better to give employees greater autonomy and responsibility than to actually pay them more for their labor. While some researchers suggest that this is accurate, there’s ample data against it. If paying someone to work really did reduce motivation, top executives who get paid millions would never do anything at all. Nobody is saying that, though; even if billionaires aren’t motivated to do anything, they can afford lawyers who are.

The brain’s tendency towards ego can also be a factor. In 1987, Edward Tory Higgins devised the self-discrepancy theory.18 This argued that the brain has a number of “selves.” There’s the “ideal” self, which is what you want to be, derived from your goals, biases and priorities. You may be a stocky computer programmer from Cleveland, but your ideal self is a bronzed volleyball player living on a Caribbean island. This is your ultimate goal, the person you want to be.

Then there’s the “ought” self, which is how you feel you should be behaving in order to achieve the ideal self. Your “ought” self is someone who avoids fatty foods and wasting money, learns volleyball and keeps an eye on Barbados property prices. Both selves provide motivation; the ideal self provides a positive kind of motivation, encouraging us to do things that bring us closer to our ideal. “Ought” self provides more negative, avoidance motivation, to keep us from doing things that take us away from our ideal; you want to order pizza for dinner? That’s not what you ought to do. Back to the salads for you.

Personality also plays a part. When it comes to motivation, someone’s locus of control can be crucial. This is the extent to which someone feels they are in control of events. They might be an egotistical sort who feels the very planet revolves around them, because why wouldn’t it? Or they may be far more passive, feeling they’re always at the mercy of circumstance. Such things may be cultural; people raised in a Western capitalist society, constantly told they can have anything they want, will feel more in control of their own lives, whereas someone living in a totalitarian regime probably won’t.

Feeling like a passive victim of events can be damaging; it can reduce the brain to a state of learned helplessness. People don’t feel they can change their situation, so lack the motivation to try. They don’t attempt to do anything as a result, and things get worse for them due to their inaction. This lowers their optimism and motivation further, so the cycle continues and they end up an ineffectual mess, paralyzed by pessimism and zero motivation. Anyone who’s ever been through a bad break-up can probably relate to this.

Exactly where motivation originates in the brain is unclear. The reward pathway in the midbrain is implicated, along with the amygdala due to the emotional component involved in things that motivate us. Connections to the frontal cortex and other executive areas are also associated as a lot of motivation is based on planning and anticipation of reward. Some even argue that there are two separate motivation systems, the advanced cognitive kind that gives us life goals and ambitions, and the more basic reactive kind that says, “Scary thing, run!” Or, “Look! Cake! Eat it!”

But the brain also has other quirks that produce motivation. In the 1920s, Russian psychologist Bluma Zeigarnik noticed, while sitting in a restaurant, that the waitstaff seemed to be able to remember only the orders they were in the process of dealing with.19 Once an order was completed, they seemed to lose all memory of it. This phenomenon was later tested in the lab. Subjects were given simple tasks to do, but some were interrupted before they could complete them. Later assessment revealed that those who were interrupted could remember the tasks much better, and even wanted to complete them despite the session being over and there being no reward for doing so.

This all gave rise to what is now known as the Zeigarnik effect, where the brain really doesn’t like things being incomplete. This explains why TV shows use cliff-hangers so often; the unresolved storyline compels people to tune in to the conclusion, just to end the uncertainty.

It seems as if the second best way to motivate a person to do something is to leave it incomplete and restrict their options for resolving it. There is an even more effective way to motivate people, but that will be revealed in my next book.

