The concept of “Moneyball” has almost become trite. The Undoing Project is a follow-up that seeks to correct some misconceptions about Moneyball and to tell the story of two psychologists with interesting professional and personal stories. If the theme of Moneyball was that new-school data can beat old-school judgment, the theme of The Undoing Project is that data combined with good judgment can beat either one alone.

= = = = = Quotes = = = = =

If the market for baseball players was inefficient, what market couldn’t be?

But the enthusiasm for replacing old-school expertise with new-school data analysis was often shallow.

a review by a pair of academics, then both at the University of Chicago—an economist named Richard Thaler and a law professor named Cass Sunstein. Thaler and Sunstein’s piece, which appeared on August 31, 2003, in the New Republic, managed to be at once generous and damning. The reviewers agreed that it was interesting that any market for professional athletes might be so screwed up that a poor team like the Oakland A’s could beat most rich teams simply by exploiting the inefficiencies. But—they went on to say—the author of Moneyball did not seem to realize the deeper reason for the inefficiencies in the market for baseball players: They sprang directly from the inner workings of the human mind. The ways in which some baseball expert might misjudge baseball players—the ways in which any expert’s judgments might be warped by the expert’s own mind—had been described, years ago, by a pair of Israeli psychologists, Daniel Kahneman and Amos Tversky.

When faced with uncertainty—about investments or people or anything else—how did [the mind] arrive at its conclusions?

And how did a pair of Israeli psychologists come to have so much to say about these matters that they more or less anticipated a book about American baseball written decades in the future? What possessed two guys in the Middle East to sit down and figure out what the mind was doing when it tried to judge a baseball player, or an investment, or a presidential candidate? And how on earth does a psychologist win a Nobel Prize in economics? In the answers to those questions, it emerged, there was another story to tell. Here it is.

But Daryl Morey believed—if he believed in anything—in taking a statistically based approach to decision making. And the most important decision he made was whom to allow onto his basketball team. “Your mind needs to be in a constant state of defense against all this crap that is trying to mislead you,” he said.

Daryl Morey had been the first of his kind: the basketball nerd king. His job was to replace one form of decision making, which relied upon the intuition of basketball experts, with another, which relied mainly on the analysis of data.

He’d always been just the way he was, a person who was happier counting than feeling his way through life.

From his stint as a consultant he learned something valuable, however. It seemed to him that a big part of a consultant’s job was to feign total certainty about uncertain things. In a job interview with McKinsey, they told him that he was not certain enough in his opinions. “And I said it was because I wasn’t certain. And they said, ‘We’re billing clients five hundred grand a year, so you have to be sure of what you are saying.’”

A lot of what people did and said when they “predicted” things, Morey now realized, was phony: pretending to know things rather than actually knowing things.

He suggested a new definition of the nerd: a person who knows his own mind well enough to mistrust it.

“Knowledge is literally prediction,” said Morey. “Knowledge is anything that increases your ability to predict the outcome. Literally everything you do you’re trying to predict the right thing. Most people just do it subconsciously.”

The trick wasn’t just to build a better model. It was to listen both to it and to the scouts at the same time. “You have to figure out what the model is good and bad at, and what humans are good and bad at,” said Morey.

“I made a new rule right then,” said Morey. “I banned nicknames.”

“Confirmation bias,” he’d heard this called. The human mind was just bad at seeing things it did not expect to see, and a bit too eager to see what it expected to see.

Morey’s solution was to forbid all intraracial comparison. “We’ve said, ‘If you want to compare this player to another player, you can only do it if they are a different race.’”

Maybe the mind’s best trick of all was to lead its owner to a feeling of certainty about inherently uncertain things.

Everyone had been warned; everyone’s minds remained screwed up. Simply knowing about a bias wasn’t sufficient to overcome it.

Morey thus became aware of what behavioral economists had labeled “the endowment effect.”

The central question posed by Gestalt psychologists was the question the behaviorists had elected to ignore: How does the brain create meaning?

Later, when he was a university professor, Danny would tell students, “When someone says something, don’t ask yourself if it is true. Ask what it might be true of.” That was his intellectual instinct, his natural first step to the mental hoop: to take whatever someone had just said to him and try not to tear it down but to make sense of it.

He compelled himself to be brave until bravery became a habit.

“The nice thing about things that are urgent,” he liked to say, “is that if you wait long enough they aren’t urgent anymore.”

“It’s hard to know how people select a course in life,” Amos said. “The big choices we make are practically random. The small choices probably tell us more about who we are. Which field we go into may depend on which high school teacher we happen to meet. Who we marry may depend on who happens to be around at the right time of life. On the other hand, the small decisions are very systematic.”

By changing the context in which two things are compared, you submerge certain features and force others to the surface.

Things are grouped together for a reason, but, once they are grouped, their grouping causes them to seem more like each other than they otherwise would. That is, the mere act of classification reinforces stereotypes. If you want to weaken some stereotype, eliminate the classification.

Statistics wasn’t just boring numbers; it contained ideas that allowed you to glimpse deep truths about human life.

“Because we tend to reward others when they do well and punish them when they do badly, and because there is regression to the mean,” Danny later wrote, “it is part of the human condition that we are statistically punished for rewarding others and rewarded for punishing them.”
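The logic behind this quote is easy to check numerically. Below is a minimal sketch (my own illustration, not from the book) in which every attempt is true skill plus luck and the feedback has no causal effect at all; punishment still appears to improve the next attempt and praise appears to hurt it, purely through regression to the mean.

```python
import random

random.seed(0)
SKILL = 0.0   # everyone has the same true skill
NOISE = 1.0   # luck dominates any single attempt

def attempt():
    # observed performance = skill + luck
    return random.gauss(SKILL, NOISE)

after_praise, after_punishment = [], []
for _ in range(100_000):
    first, second = attempt(), attempt()
    if first > 1.0:        # unusually good: instructor praises
        after_praise.append(second - first)
    elif first < -1.0:     # unusually bad: instructor punishes
        after_punishment.append(second - first)

print(f"average change after praise:     {sum(after_praise) / len(after_praise):+.2f}")
print(f"average change after punishment: {sum(after_punishment) / len(after_punishment):+.2f}")
```

The first number comes out negative and the second positive even though the feedback did nothing: extreme performances are mostly luck, and luck does not repeat.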

“Reforms always create winners and losers,” Danny explained, “and the losers will always fight harder than the winners.”

How did you get the losers to accept change? The prevailing strategy on the Israeli farms—which wasn’t working very well—was to bully or argue with the people who needed to change.

“It’s a key idea,” said Danny. “Making it easy to change.”

“The Magical Number Seven, Plus or Minus Two: Some Limits on Our Capacity for Processing Information” was a paper by the Harvard psychologist George Miller showing that people can hold about seven items, plus or minus two, in short-term memory; any attempt to get them to hold more was futile. Miller half-jokingly suggested that the seven deadly sins, the seven seas, the seven days of the week, the seven primary colors, the seven wonders of the world, and several other famous sevens had their origins in this mental truth.

“Belief in the Law of Small Numbers” had raised an obvious next question: If people did not use statistical reasoning, even when faced with a problem that could be solved with statistical reasoning, what kind of reasoning did they use?

When people make judgments, they argued, they compare whatever they are judging to some model in their minds. How much do those clouds resemble my mental model of an approaching storm?

The more easily people can call some scenario to mind—the more available it is to them—the more probable they find it to be.

The unsuspecting Oregon students, having listened to a list, were then asked to judge if it contained the names of more men or more women. They almost always got it backward: If the list had more male names on it, but the women’s names were famous, they thought the list contained more female names, and vice versa.

“Images of the future are shaped by experience of the past,” they wrote, turning on its head Santayana’s famous lines about the importance of history: Those who cannot remember the past are condemned to repeat it. What people remember about the past, they suggested, is likely to warp their judgment of the future.

“We often decide that an outcome is extremely unlikely or impossible, because we are unable to imagine any chain of events that could cause it to occur. The defect, often, is in our imagination.”

Unless you are kicking yourself once a month for throwing something away, you are not throwing enough away, he said.

“It was a classic case of the representativeness heuristic,” said Redelmeier. “You need to be so careful when there is one simple diagnosis that instantly pops into your mind that beautifully explains everything all at once. That’s when you need to stop and check your thinking.”

Daniel Kahneman and Amos Tversky had pointed out that, while statistically sophisticated people might avoid the simple mistakes made by less savvy people, even the most sophisticated minds were prone to error. As they put it, “their intuitive judgments are liable to similar fallacies in more intricate and less transparent problems.”

When making judgments, people obviously could use help—say, by requiring all motorcyclists to wear helmets.

What is it with you freedom-loving Americans? he asked. Live free or die. I don’t get it. I say, “Regulate me gently. I’d rather live.” His fellow student replied, Not only do a lot of Americans not share your view; other physicians don’t share your view.

Many sentences popped out of Amos’s mouth that Redelmeier knew he would forever remember: A part of good science is to see what everyone else can see but think what no one else has ever said. The difference between being very smart and very foolish is often very small. So many problems occur when people fail to be obedient when they are supposed to be obedient, and fail to be creative when they are supposed to be creative. The secret to doing good research is always to be a little underemployed. You waste years by not being able to waste hours. It is sometimes easier to make the world a better place than to prove you have made the world a better place.

As Redelmeier put it, “Last impressions can be lasting impressions.”

“crucial decisions are made, today as thousands of years ago, in terms of the intuitive guesses and preferences of a few men in positions of authority.”

The job of the decision maker wasn’t to be right but to figure out the odds in any decision and play them well.

“We started with the idea that we should get rid of the usual intelligence report,” said Danny. “Intelligence reports are in the form of essays. And essays have the characteristic that they can be understood any way you damn well please.”

The field of decision making explored what people did after they had formed some judgment—after they knew the odds, or thought they knew the odds, or perhaps had judged the odds unknowable.

When they made decisions, people did not seek to maximize utility. They sought to minimize regret.

“Always keep one hand firmly on data,” Amos liked to say. Data was what set psychology apart from philosophy, and physics from metaphysics.

When you gave a person a choice between a gift of $500 and a 50-50 shot at winning $1,000, he picked the sure thing. Give that same person a choice between losing $500 for sure and a 50-50 risk of losing $1,000, and he took the bet.

“For most people, the happiness involved in receiving a desirable object is smaller than the unhappiness involved in losing the same object.”

“Happy species endowed with infinite appreciation of pleasures and low sensitivity to pain would probably not survive the evolutionary battle,” they wrote.

They feared a one-in-a-billion chance of loss more than they should and attached more hope to a one-in-a-billion chance of gain than they should.

Their theory explained all sorts of things that expected utility failed to explain.
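Concretely: both branches of the $500 choice above have the same expected value (0.5 × $1,000 = $500), so a single risk attitude under expected utility cannot produce both answers. Here is a minimal sketch of how prospect theory does, using the functional forms and parameter estimates from Tversky and Kahneman’s 1992 follow-up paper (alpha ≈ 0.88, lambda ≈ 2.25, gamma ≈ 0.61); the demo wiring is my own, and for simplicity I reuse one weighting curve for gains and losses where the paper estimates them separately.

```python
ALPHA, LAMBDA, GAMMA = 0.88, 2.25, 0.61

def value(x):
    # S-shaped value function: concave for gains, convex and steeper for losses
    return x ** ALPHA if x >= 0 else -LAMBDA * (-x) ** ALPHA

def weight(p):
    # inverse-S decision weights: tiny probabilities are overweighted
    return p ** GAMMA / (p ** GAMMA + (1 - p) ** GAMMA) ** (1 / GAMMA)

# The $500 question: equal expected values, opposite predicted choices.
print(value(500) > weight(0.5) * value(1000))     # True: take the sure gain
print(value(-500) < weight(0.5) * value(-1000))   # True: gamble on the loss

# A one-in-a-billion chance gets far more decision weight than it deserves.
p = 1e-9
print(weight(p) / p)   # on the order of thousands of times p
```

This reproduces the pattern in the quotes above: risk-averse for gains, risk-seeking for losses, and oversensitive to one-in-a-billion chances.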

The reference point was a state of mind. Even in straight gambles you could shift a person’s reference point and make a loss seem like a gain, and vice versa. In so doing, you could manipulate the choices people made, simply by the way they were described.

People did not choose between things. They chose between descriptions of things.

“Then I realized: They had one idea. Which was systematic bias.” If people could be systematically wrong, their mistakes couldn’t be ignored. The irrational behavior of the few would not be offset by the rational behavior of the many. People could be systematically wrong, and so markets could be systematically wrong, too.

The imagination obeyed rules: the rules of undoing. One rule was that the more items there were to undo in order to create some alternative reality, the less likely the mind was to undo them.

“He helped us to understand why pilots sometimes made bad decisions,” said Maher. “He said, ‘You’re not going to change people’s decision making under duress. You aren’t going to stop pilots from making these mental errors. You aren’t going to train the decision-making weaknesses out of the pilots.’” What Delta Air Lines should do, Amos suggested, was to change its decision-making environments.

People had trouble seeing when their minds were misleading them; on the other hand, they could sometimes see when other people’s minds were misleading them.

The way to stop the captain from landing the plane in the wrong airport, Amos insisted, was to train others in the cockpit to question his judgment.

That result was all that Danny and Amos needed to make their big point: The rules of thumb people used to evaluate probability led to misjudgments.

“Listen to evolutionary psychologists long enough,” Amos said, “and you’ll stop believing in evolution.”

“Prospect Theory,” scarcely cited in the first decade after its publication, would become, by 2010, the second most cited paper in all of economics.

In 2009, at the invitation of President Obama, Sunstein went to work at the White House. There he oversaw the Office of Information and Regulatory Affairs and made scores of small changes that had big effects on the daily lives of all Americans.

A part of good science is to see what everyone else can see but think what no one else has ever said. Amos had said that to him.

It is sometimes easier to make the world a better place than to prove you have made the world a better place. Amos had said that, too.

“He said, ‘Life is a book. The fact that it was a short book doesn’t mean it wasn’t a good book. It was a very good book.’” Amos seemed to understand that an early death was the price of being a Spartan.