"There is no harm in being sometimes wrong - especially if one is promptly found out." - John Maynard Keynes
I belong to an organization in which almost every member (particularly the most senior members) is what I would call a "no person." These are people who, regardless of the veracity of a comment or suggestion, automatically say "no." The no may be qualified by some patently absurd rationalization, or it may be just a "no." What is constant across all dimensions is the answer, which is "no."
It took me a long time to make sense of this behavior, because there were frequently manifest signs that poor decisions had been made, or that events, people, and circumstances had been wrongly judged. In other words, there was frequent and strong evidence that the thing they were saying "no" to was correct, while their seemingly indefensible position was incorrect.
One day, I was reading a book about why expert predictions fail (a circumstance not dissimilar to the operation of the organization of which I am a member) when I read about cognitive dissonance and suddenly everything made sense.
In 2002, Montier described cognitive dissonance as "the mental conflict that people experience when they are presented with evidence that their beliefs or assumptions are wrong." To relieve this conflict, people have two options: (a) change their belief, or (b) somehow rationalize away the evidence. Most people, particularly those who are deeply invested in their beliefs, will employ option (b).
The organization in question is full of people who perceive themselves to be "expert" in their field, and their "expert" status is very important to their self-esteem and self-image. Therefore, when presented with evidence that indicates they are not expert, the ensuing cognitive dissonance drives them to rationalize away the evidence or simply to ignore it altogether.
Understanding why people do something can certainly make dealing with the behavior easier, but I admit to a rising level of frustration as, time after time, their "expert" status is called into question and, instead of learning from their mistakes and actually moving closer to expert status, they sweep all evidence to the contrary under the rug or blithely rationalize it away, and opportunities for growth and learning are lost.
One occasion that sticks in my mind as illustrative of the phenomenon occurred shortly after our organization had made what I consider a serious mistake: deciding to ski, all bunched up together (except for me, as I whipped through the area as fast as I could), through a terrain trap during a snowstorm in the middle of the night when stability was poor. After the fact, I said that we had all made a bad decision and should be careful not to make such poor decisions in the future. No one would admit that the decision was bad, and the rationalization I got as to why the decision was reasonable was that any avalanche that occurred would be "only" size two. When I pointed out that a size two avalanche could, by definition, "bury, injure, or kill a person," my comments were quickly brushed aside and forthwith ignored.
Natural selection in progress, you might think. With decisions such as the one I've related commonplace, extinction might not be far off. I'm just worried they'll take me with them.
If you really need to dig a pit to see what stability is like, you might want to rethink something.