11.04.2011

Knowing Is Not Enough


To get right to the point: Daniel Kahneman (winner of the 2002 Nobel Prize in Economics) makes a striking claim in writing about his new book, Thinking, Fast and Slow. In spite of years of study and important research on the shortcomings of human reasoning, he confesses that he is still subject to them.

He is fully aware of the biases and inferential errors we all make when evaluating the information we have in an uncertain situation, and yet he continues to make such errors.

He admits he is simply unable to do anything about it.

“My intuitive thinking is just as prone to overconfidence, extreme predictions and the planning fallacy as it was before I made a study of these issues.”

In writing about his early work on predicting the future leadership ability of individual Israeli Army soldiers (New York Times, October 10, 2011), he concludes:

“I thought that what was happening to us was remarkable. The statistical evidence of our failure should have shaken our confidence in our judgment of particular candidates, but it did not. We knew as a general fact that our predictions were little better than random guesses, but we continued to feel and act as if each particular prediction was valid. I was reminded of visual illusions, which remain compelling even when you know that what you see is false.”

Can knowledge or self-awareness of our judgment biases help us avoid them? Like most everyone else, Kahneman hoped that his research findings would contribute to that end. But if even he admits they haven’t, how can the rest of us do any better?

In his post about Kahneman’s book on the New Yorker Book Bench (October 25, 2011), Jonah Lehrer concludes:

“But his greatest legacy, perhaps, is also his bleakest: By categorizing our cognitive flaws, documenting not just our errors, but also their embarrassing predictability, he has revealed the hollowness of a very ancient aspiration. Knowing thyself is not enough. Not even close.”

This is bleak, isn’t it? And yet, can it be true? Must it be true? I am more optimistic than Lehrer or even Kahneman. The challenge is clear: we need to learn how to make our knowledge of mental flaws more salient in situations where it might prove useful.

This often occurs naturally when, for example, newly acquired information is still readily available. As the information is gradually forgotten with the passage of time, however, we need a conspicuous signal or prompt to remind us of its relevance. Until we figure out how to do this more reliably, we must be careful not to overestimate the extent to which knowing about our biases influences our reasoning.