And that was not all the evidence at Zuckerberg’s disposal. Even the viral popularity of the anti–News Feed group was evidence of the power of News Feed. The group was able to grow so rapidly precisely because so many people had heard that their friends had joined—and they learned this through their News Feed.
In other words, while people were joining in a big public uproar over how unhappy they were about seeing all the details of their friends’ lives on Facebook, they were coming back to Facebook to see all the details of their friends’ lives. News Feed stayed. Facebook now has more than one billion daily active users.
In his book Zero to One, Peter Thiel, an early investor in Facebook, says that great businesses are built on secrets, either secrets about nature or secrets about people. Jeff Seder, as discussed in Chapter 3, found the natural secret that left ventricle size predicted horse performance. Google found the natural secret of how powerful the information in links can be.
Thiel defines “secrets about people” as “things that people don’t know about themselves or things they hide because they don’t want others to know.” These kinds of businesses, in other words, are built on people’s lies.
You could argue that all of Facebook is founded on an unpleasant secret about people that Zuckerberg learned while at Harvard. Zuckerberg, early in his sophomore year, created a website for his fellow students called Facemash. Modeled on a site called "Hot or Not," Facemash would present pictures of two Harvard students and then have other students judge who was better looking.
The sophomore's site was greeted with outrage. The Harvard Crimson, in an editorial, accused young Zuckerberg of "catering to the worst side" of people. Hispanic and African-American groups accused him of sexism and racism. Yet, before Harvard administrators shut down Zuckerberg's internet access—just a few hours after the site launched—450 people had viewed the site and voted 22,000 times on different images. Zuckerberg had learned an important secret: people can claim they're furious, they can decry something as distasteful, and yet they'll still click.
And he learned one more thing: for all their professions of seriousness, responsibility, and respect for others’ privacy, people, even Harvard students, had a great interest in evaluating people’s looks. The views and votes told him that. And later—since Facemash proved too controversial—he took this knowledge of just how interested people could be in superficial facts about others they sort of knew and harnessed it into the most successful company of his generation.
Netflix learned a similar lesson early on in its life cycle: don’t trust what people tell you; trust what they do.
Originally, the company allowed users to create a queue of movies they wanted to watch in the future but didn’t have time for at the moment. This way, when they had more time, Netflix could remind them of those movies.
However, Netflix noticed something odd in the data. Users were filling their queues with plenty of movies. But days later, when they were reminded of the movies in their queues, they rarely clicked.
What was the problem? Ask users what movies they plan to watch in a few days, and they will fill the queue with aspirational, highbrow films, such as black-and-white World War II documentaries or serious foreign films. A few days later, however, they will want to watch the same movies they usually want to watch: lowbrow comedies or romance films. People were consistently lying to themselves.
Faced with this disparity, Netflix stopped asking people to tell them what they wanted to see in the future and started building a model based on millions of clicks and views from similar customers. The company began greeting its users with suggested lists of films based not on what they claimed to like but on what the data said they were likely to view. The result: customers visited Netflix more frequently and watched more movies.
“The algorithms know you better than you know yourself,” says Xavier Amatriain, a former data scientist at Netflix.
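The shift Netflix made, from stated preferences to observed behavior, is the core idea behind collaborative filtering: score what a user has not yet seen by what similar users actually watched. The sketch below is a toy illustration with hypothetical users and titles, not Netflix's actual system.

```python
# Toy collaborative filtering: recommend from observed views,
# not self-reported queues. Users, titles, and the similarity
# measure here are all hypothetical illustrations.
from collections import Counter

# Hypothetical watch history: user -> set of titles actually viewed
watched = {
    "ana":   {"RomCom A", "Comedy B", "Comedy C"},
    "ben":   {"Comedy B", "Comedy C", "RomCom D"},
    "carla": {"Comedy C", "Documentary X"},
}

def recommend(user, watched, k=2):
    """Score titles the user hasn't seen by summing, over all other
    users, the size of their viewing overlap with this user."""
    mine = watched[user]
    scores = Counter()
    for other, theirs in watched.items():
        if other == user:
            continue
        overlap = len(mine & theirs)  # crude similarity: shared views
        for title in theirs - mine:   # only titles the user hasn't seen
            scores[title] += overlap
    return [title for title, _ in scores.most_common(k)]

print(recommend("ana", watched))  # → ['RomCom D', 'Documentary X']
```

Nothing here asks the user what they *plan* to watch; the signal is entirely what people like them already clicked, which is the lesson of the Netflix story.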
CAN WE HANDLE THE TRUTH?
You may find parts of this chapter depressing. Digital truth serum has revealed an abiding interest in judging people based on their looks; the continued existence of millions of closeted gay men; a meaningful percentage of women fantasizing about rape; widespread animus against African-Americans; a hidden child abuse and self-induced abortion crisis; and an outbreak of violent Islamophobic rage that only got worse when the president appealed for tolerance. Not exactly cheery stuff. Often, after I give a talk on my research, people come up to me and say, “Seth, it’s all very interesting. But it’s so depressing.”
I can’t pretend there isn’t a darkness in some of this data. If people consistently tell us what they think we want to hear, we will generally be told things that are more comforting than the truth. Digital truth serum, on average, will show us that the world is worse than we have thought.
Do we need to know this? Learning about Google searches, porn data, and who clicks on what might not make you think, “This is great. We can understand who we really are.” You might instead think, “This is horrible. We can understand who we really are.”
But the truth helps—and not just for Mark Zuckerberg or others looking to attract clicks or customers. There are at least three ways that this knowledge can improve our lives.
First, there can be comfort in knowing that you are not alone in your insecurities and embarrassing behavior. It can be nice to know others are insecure about their bodies. It is probably nice for many people—particularly those who aren’t having much sex—to know the whole world isn’t fornicating like rabbits. And it may be valuable for a high school boy in Mississippi with a crush on the quarterback to know that, despite the low numbers of openly gay men around him, plenty of others feel the same kinds of attraction.
There’s another area—one I haven’t yet discussed—where Google searches can help show you are not alone. When you were young, a teacher may have told you that, if you have a question, you should raise your hand and ask it because if you’re confused, others are, too. If you were anything like me, you ignored your teacher’s advice and sat there silently, afraid to open your mouth. Your questions were too dumb, you thought; everyone else’s were more profound. The anonymous, aggregate Google data can tell us once and for all how right our teachers were. Plenty of basic, sub-profound questions lurk in other minds, too.
Consider the top questions Americans had during Obama's 2014 State of the Union speech. (See the color photo at the end of the book.)
YOU’RE NOT THE ONLY ONE WONDERING: TOP GOOGLED QUESTIONS DURING THE STATE OF THE UNION
How old is Obama?
Who is sitting next to Biden?
Why is Boehner wearing a green tie?
Why is Boehner orange?