
The illusion of validity

When confidence trumps evidence

Daniel Kahneman, emeritus professor of psychology and of public affairs at Princeton University and winner of the 2002 Nobel Prize in Economics, has written a fascinating article for the New York Times.*

Kahneman describes a story from early in his career when he was part of a team evaluating candidates for officer training in the Israeli army. His team were in the fortunate position of receiving regular feedback on the actual performance of the soldiers. As a result, he came to these conclusions (in his own words):

Our ability to predict performance at the school was negligible.

Our certainty about the potential of individual candidates [was] largely useless.

Other events [not present during the assessment] — some of them random — would determine later success in training and combat.

We had made up a story from the little we knew but had no way to allow for what we did not know about the individual’s future, which was almost everything that would actually matter.

The statistical evidence of our failure should have shaken our confidence in our judgments of particular candidates, but it did not. It should also have caused us to moderate our predictions, but it did not. We knew as a general fact that our predictions were little better than random guesses, but we continued to feel and act as if each particular prediction was valid.

Kahneman called this phenomenon “the illusion of validity”. Later research confirmed his observations and revealed other “cognitive fallacies”:

People who face a difficult question often answer an easier one instead, without realizing it.

The exaggerated expectation of consistency is a common error. We are prone to think that the world is more regular and predictable than it really is, because our memory automatically and continuously maintains a story about what is going on, and because the rules of memory tend to make that story as coherent as possible and to suppress alternatives.

People come up with coherent stories and confident predictions even when they know little or nothing. Overconfidence arises because people are often blind to their own blindness.

Confidence is a feeling, one determined mostly by the coherence of the story and by the ease with which it comes to mind, even when the evidence for the story is sparse and unreliable. The bias toward coherence favors overconfidence. An individual who expresses high confidence probably has a good story, which may or may not be true.

When a compelling impression of a particular event clashes with general knowledge, the impression commonly prevails.

Facts that challenge basic assumptions are simply not absorbed. The mind does not digest them. This is particularly true of statistical studies of performance, which provide general facts that people will ignore if they conflict with their personal experience.

This last finding shows why there is often little point in wasting your breath giving advice. Either it will not challenge a basic assumption, in which case the person could probably have come up with the idea themselves (and probably did); or it will challenge a basic assumption, and it will therefore be ignored.

The research of Kahneman and others confirms Penny Tompkins’ and my observations. Our ability to self-deceive is universal and has its own built-in preservation mechanism (see our article, Self-Deception, Self-Delusion and Self-Denial). One of the drivers of this ability is the impulse to reduce the difficult feelings that arise when events and evidence are incompatible with our ideas and beliefs (see our article, Cognitive Dissonance and Creative Tension — the same or different?).

In both cases explanations come to our rescue. Explanations are both pervasive and seductive – so much so that many people cannot live without them (notice what happens, both to them and to you, when you ask someone to do something without giving any explanation). Because we get caught up in the content of explanations, we fail to notice just how much time we devote to giving, seeking or listening to them – most of which are little more than stories we are somehow able to believe (see our article, Becausation).

Finance and investment

Nowhere are these cognitive fallacies more evident than in the field of finance and investment. I was first made aware of this by Nassim Nicholas Taleb and his masterful exposure of the inherently unpredictable nature of economics. (See my summary of Taleb’s brilliant analysis Black Swan Logic: Thinking outside the Norm).

Kahneman cites research, based on the trading records of 10,000 brokerage accounts of individual investors over a seven-year period, that shows:

The results were unequivocally bad. On average, the shares investors sold did better than those they bought, by a very substantial margin: 3.3 percentage points per year, [and that’s not including trading fees].

The most active traders had the poorest results, while those who traded the least earned the highest returns.

Men act on their useless ideas significantly more often than women do, and as a result women achieve better investment results than men.

OK, so if individual trading sucks, obviously we need ‘expert’ advice. Except the fact is: “At least two out of every three mutual funds underperform the overall market in any given year” – an achievement a dart-throwing chimp could not match. And a study of wealth advisers’ performance over eight consecutive years showed zero year-on-year consistency. The analysis strongly suggests that skill was not involved in their successes; Kahneman remarks that the results were more reminiscent of a game of dice than of a game requiring skill.

You will probably not be surprised to know that when Kahneman reported back to the company that employed the wealth advisers, the “findings and their implications were quickly swept under the rug and that life in the firm went on just as before”.

True intuitive expertise

Kahneman confirms that “True intuitive expertise is learned from prolonged experience with good feedback on mistakes”. Therefore, he recommends asking two questions before you trust a particular intuitive judgment:

Is the environment in which the judgment is made sufficiently regular to enable predictions from the available evidence? The answer is yes for [medical] diagnosticians, no for stock pickers.

Do the professionals have an adequate opportunity to learn the cues and the regularities? The answer here depends on the professionals’ experience and on the quality and speed with which they discover their mistakes. Anesthesiologists have a better chance to develop intuitions than radiologists do.

In a British TV programme, three experts had to pick out, from a group of ‘normal’ people, those who had been diagnosed with a severe mental disturbance. The experts spectacularly failed to do so, even though they were able to devise a series of activities designed to reveal the participants’ frailties [I’m not making this up!]. This suggests that psychological diagnosis is about as accurate as financial forecasting.

After decades of research Kahneman concludes that, “in general, you should not take assertive and confident people at their own evaluation unless you have independent reason to believe that they know what they are talking about. Unfortunately, this advice is difficult to follow: overconfident professionals sincerely believe they have expertise, act as experts and look like experts. You will have to struggle to remind yourself that they may be in the grip of an illusion.”

And Kahneman wryly notes that we are all prone to this way of thinking; hence, even after reading his article, he predicts: “The confidence you will experience in your future judgments will not be diminished by what you just read, even if you believe every word.”

Confidence and congruence gone awry

I’ll end this blog by linking it to my research into ‘how facilitators know if what they are doing is working for the client – and, more importantly, if it is not’. (For the background to this topic see our article, Calibration.)

The research of Kahneman and others should raise some warning signals about facilitators of all persuasions – but especially those who base their behaviour on being confident and congruent. In the 1990s I attended a Design Human Engineering training with Richard Bandler, co-originator of NLP, where he suggested that NLP practitioners should imagine themselves as a ten-foot panther with the internal dialogue about their clients: “Your ass is mine”.

Kahneman’s research would suggest that people who took Bandler’s advice would be highly susceptible to “the illusion of validity” – and so would their clients. And that is the whole point. Persuasion works partly because of the confidence and congruence the practitioner projects, but rarely, if ever, are the potential downsides of the approach discussed. In a way they can’t be, because this would undermine the whole philosophy. And we have seen what happens to challenges to basic assumptions – they are often ignored.

Bandler might be an extreme case, but he is certainly not alone. Here is the posting of an experienced hypnotherapist on a public forum:

  • Hypnotherapy Works !
  • Everybody that comes to see me … goes away feeling better.
  • Doesn’t everybody get that result ?!!!!
  • Any talk of FAILURE I’m sorry, is alien to me!

Well, I’m sorry too … for this therapist’s clients. How many clients are going to tell him that what he did didn’t work for them? What are the chances of a client telling him during a session that what he is doing isn’t working? And if they did, what are the chances of him listening?

Fortunately, some trainers recognise the importance of tethering confidence to competence. In 1995 Penny Tompkins and I were asked by Cricket Kemp to be assessors on an NLP Practitioner certification for NLP North East. We were most impressed when we saw that one of Cricket’s assessment criteria given to students was: “Balance your level of confidence with your level of competence”. I don’t know whether Cricket was aware of Kahneman’s research, but she certainly understood the dangers of over- and under-confidence when compared to demonstrable performance.

Unfortunately, it is the unknowingly overconfident professionals – those who “sincerely believe they have expertise, act as experts and look like experts” – who are most likely to “be in the grip of an illusion”; and not only will they not know it, they will actively deny it, telling plausible stories and quoting believable statistics – anything rather than challenge their basic assumptions.

Of course, I’m not sure about this.

* ‘Don’t Blink! The Hazards of Confidence’ by Daniel Kahneman, New York Times, October 19, 2011. nytimes.com/2011/10/23/magazine/dont-blink-the-hazards-of-confidence.html – The article is adapted from his new book, Thinking, Fast and Slow.
