Passage 2
Here’s a simple arithmetic question: A bat and ball cost a dollar and ten cents. The bat costs a dollar more than the ball. How much does the ball cost?
The vast majority of people respond quickly and confidently, insisting the ball costs ten cents. This answer is both obvious and wrong. (The correct answer is five cents for the ball and a dollar and five cents for the bat.)
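A quick check makes the trap visible: call the ball’s price x. The bat then costs x + 1.00, and together x + (x + 1.00) = 1.10, so 2x = 0.10 and x = 0.05. The intuitive ten-cent answer fails this check: a ten-cent ball puts the bat at a dollar and ten cents, for a total of a dollar and twenty cents.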
For more than five decades, Daniel Kahneman, a Nobel Laureate and professor of psychology at Princeton, has been asking questions like this and analyzing our answers. His disarmingly simple experiments have profoundly changed the way we think about thinking.
While philosophers, economists, and social scientists had assumed for centuries that human beings are rational agents (reason was our Promethean gift), Kahneman and his scientific partner, the late Amos Tversky, demonstrated that we’re not nearly as rational as we like to believe.
When people face an uncertain situation, they don’t carefully evaluate the information or look up relevant statistics. Instead, their decisions depend on a long list of mental shortcuts, which often lead them to make foolish decisions. These shortcuts aren’t a faster way of doing the math; they’re a way of skipping the math altogether. Asked about the bat and the ball, we forget our arithmetic lessons and instead default to the answer that requires the least mental effort.
A new study in the Journal of Personality and Social Psychology led by Richard West at James Madison University and Keith Stanovich at the University of Toronto suggests that, in many instances, smarter people are more vulnerable to these thinking errors. Although we assume that intelligence is a buffer against bias (that’s why those with higher S.A.T. scores think they are less prone to these universal thinking mistakes), it can actually be a subtle curse.
West and his colleagues began by giving four hundred and eighty-two undergraduates a questionnaire featuring a variety of classic bias problems. Here’s an example: In a lake, there is a patch of lily pads. Every day, the patch doubles in size. If it takes forty-eight days for the patch to cover the entire lake, how long would it take for the patch to cover half of the lake?
Your first response is probably to take a shortcut and divide the final answer in half. That leads you to twenty-four days. But that’s wrong. The correct solution is forty-seven days.
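The doubling is what defeats the shortcut. If the patch’s area on day n is A(n) = A(0) · 2^n, then A(47) = A(0) · 2^47 = A(48)/2: the lake is half covered exactly one day before it is fully covered. Halving the forty-eight days treats exponential growth as if it were linear.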
But West and colleagues weren’t simply interested in reconfirming the known biases of the human mind. Rather, they wanted to understand how these biases correlated with human intelligence. As a result, they interspersed their tests of bias with various cognitive measurements, including the S.A.T. and the Need for Cognition Scale, which measures “the tendency for an individual to engage in and enjoy thinking.”
The results were quite disturbing. For one thing, self-awareness was not particularly useful: as the scientists note, “people who were aware of their own biases were not better able to overcome them.” This finding wouldn’t surprise Kahneman, who admits in “Thinking, Fast and Slow” that his decades of groundbreaking research have failed to significantly improve his own mental performance. “My intuitive thinking is just as prone to overconfidence, extreme predictions, and the planning fallacy” (a tendency to underestimate how long it will take to complete a task) “as it was before I made a study of these issues,” he writes.