
Friday, December 4, 2015

Book Review: Willful Blindness: Why We Ignore the Obvious at Our Peril by Margaret Heffernan

In the Hans Christian Andersen fairy tale, The Emperor's New Clothes, two con men promise an emperor a new suit of clothes that, they claim, is invisible to anyone who is foolish or stupid. When the Emperor parades before his subjects in his new "clothes," no one dares to say that he is actually naked until a child cries out, "But he isn't wearing anything at all!"

Refusal to see the obvious is a characteristic of groupthink, which is a major theme of this blog and a prime concern in my psychotherapy model of unified therapy. It is part and parcel of something that Gregg Henriques calls the justification hypothesis: that we use reason to justify ideas that cement our position within a group, rather than to arrive at the truth. This was also the main theme of Jonathan Haidt's book, The Righteous Mind, which was previously reviewed here.

A wife ignores obvious evidence that her husband is having an affair. Religious people claim to believe absolutely the most preposterous ideas in order to fit in with their fellow church members. Political ideologues seem immune to certain facts no matter how much the evidence mounts, and make convoluted arguments that they seem to believe prove their ideas correct. Children in dysfunctional families act out family rules over and over again no matter how much pain doing so causes them.

In this fascinating book, Margaret Heffernan dissects this aspect of groupthink and elaborates on its myriad manifestations. While concentrating mostly on the madness of employees in business organizations walking off a cliff together in maddening lockstep - with the result that the organization is eventually harmed or destroyed (Enron, BP, and the derivatives crisis in investment banking that led to the economic meltdown a few short years ago) - she also gives examples from many other walks of life.

Go along to get along. Be silent in order to avoid conflict because conflict might destroy a family or an organization. Rationalize your misbehavior because "everybody's doing it." Once you've laid out your position, never change it in light of new information because you might appear weak. Be a good "team player" and do not ask any hard questions. Avoid changes you might have to think hard about, because it takes a lot less energy and brain power to believe than to doubt.

Then there is something she calls the Bystander Effect: Don't intervene in a crisis if there are lots of people around - surely someone else will take care of it. The more people around to witness a crime, for example, the less likely it becomes that anyone will call 911.

Powerful people are often the most prone to conform to what the author refers to as received wisdom - information that conforms to stereotypes.

In another fascinating chapter, the author describes people whom she calls Cassandras. These are the folks who refuse to accept ideas just to fit in, often at great personal sacrifice, and are willing to look at the bigger picture to see potential problems that others blithely ignore. The author writes, "...After every institutional or organizational failure, individuals invariably surface who saw the crisis coming, warned about it, and were mocked or ignored" (p. 201). Whistleblowers are examples of such people; the government tries but usually fails to protect them.

The oddest thing about willful blindness is that people who engage in it must be aware at some level of exactly what they are doing. As the author states, "How could we know where not to look without looking first?" (p. 88). In order to lie to ourselves, we have to ignore evidence that repeatedly hits us over the head like a two-by-four. Knowledge of a fact can be inferred when someone deliberately blinds himself or herself to its existence.

As the author mentions, we all use two types of thinking. One is automatic: born of habit, fast, and intuitive. If we had to reason out every move we made, we'd be paralyzed. The second is deliberate: we weigh pros and cons, examine evidence, and so on. It consumes much more energy and takes far more time. A key point is that this second type is also used to monitor the first for errors. No one would survive very long without both. That monitoring is always lurking in the background of our minds, even when we are reacting subconsciously, automatically, and without thinking.

Two recent movies, based on actual historical events, brilliantly depict startling and dramatic examples of large numbers of people engaging in willful blindness - and the Cassandras who finally changed things. Labyrinth of Lies shows Germans in the 1950s acting as if the Holocaust never happened. Spotlight shows just how many people knew or should have known that Catholic clergy were sexually abusing children, and that the higher-ups kept moving pedophile priests from parish to parish - and for a long time no one did much of anything about it.

Facing problems is almost always better than sweeping them under the carpet for the purpose of furthering group harmony. As long as a problem is invisible, it will remain unsolved. As James Baldwin once said, "Not everything that is faced can be changed, but nothing can be changed until it is faced."
