Tuesday, July 21, 2015

Groupthink: How Even Scientists Con Themselves in Order to Fit In


[Image credit: australianclimatemadness.com]


In his brilliant book, The Righteous Mind, Jonathan Haidt argues convincingly that logic evolved in humans not to establish the truth about the world or to establish facts, but to argue for ideas that benefit the kin and ethnic groups to which we belong, as well as to maintain a good reputation within those groups. My colleague Gregg Henriques calls this the Justification Hypothesis: logic is used to justify our group norms.

Many of our beliefs are based not on facts or reason at all, and in fact seem to be impervious to them. They are instead based on our groupishness (the opposite of selfishness). For almost all of us, it is generally more important to look right than to be right.

This type of reasoning appears at the level of the individual, where it is called defense mechanisms and irrational beliefs. It appears at the level of the family or kin group, where it is called family myths. It also exists at the level of cultural groups, where it is called theology. Or if it is not your particular brand of theology, then it is called mythology.

Another name for this phenomenon in general is groupthink. We cede the right to think for ourselves for the sake of our group, and we often try to discourage our intimates from thinking for themselves for the same reason.

Even scientists are not immune. So what are some of the mechanisms by which they do this to themselves and to other people? That is the topic of this post.

First, a brief review of some previous posts. As I described in one post, I realized a long time ago that the so-called defense mechanisms discussed by the Freudians and the irrational thoughts catalogued by cognitive therapists (CBT) had a purpose that was not only intra-psychic but interpersonal as well. 

Defense mechanisms are defined as mental processes initiated, typically subconsciously, to avoid ideas or impulses that are unacceptable to our value system, and to avoid the anxiety they create. Another name for this anxiety is mortification. We may, for example, compulsively act in a way opposite to the one the unacceptable impulse would dictate (reaction formation), or displace our anger from one person onto another, safer one.

Their interpersonal purpose is to screen out beliefs and impulses that are threatening to the kin group, which is also why they are threatening to the individual within the kin group.

The irrational thoughts catalogued by CBT, which its practitioners attributed to humans being basically irrational, also function much like the defense mechanisms. If you, for example, "catastrophize" about what might happen if you indulged an impulse that your kin group does not approve of (by, say, imagining the worst possible outcome of doing so), you will indeed scare yourself away from engaging in it. (Of course, the CBT folks reject the whole concept of defense mechanisms - I recall a somewhat sarcastic reply from cognitive therapy pioneer Albert Ellis when I brought this up at one of his talks.)

On a related note, there are the logical fallacies that are enumerated by logicians and which are well known to members of college debate squads. An example is post hoc reasoning, which wrongly assumes that if event A is quickly followed by event B, then A must have caused B. I saw patients engage in many of these fallacies when confronted with the negative consequences of the behavior that seemed to be demanded of them by their families. So, I believe, the logical fallacies can also be used as defense mechanisms - specifically designed to avoid troublesome questions about cherished beliefs that on the surface are simplistic at best and preposterous at worst.

As mentioned above, scientists are not immune from groupthink and groupishness. In fact, they are nearly as likely as anyone else to employ them. I witnessed many times in scientific debates how the debaters would subtly employ various techniques and mind tricks to silence critics of their studies or ideas.

A few examples among many:

a) Black-and-white, or all-or-none, thinking. Biological psychiatrists seem to think that everything in the DSM diagnostic manual is a brain disease, whereas the anti-psychiatry folks believe that nothing is, and that the listed diagnoses are all just alternate lifestyles, different ways of looking at the world, or reactions to trauma.

b) Arguing that, because many parts of someone's thinking - Freud's, for instance - were totally off base (like "penis envy" and his theories about homosexuality), ALL of his ideas must therefore have been wrong (including such obviously real things as intra-psychic conflict and defense mechanisms).

c) Stating facts about the results of studies without describing certain contextual elements that put those facts in a different light. A great example I have already blogged about is how the head of the National Institute on Drug Abuse described experiments in which monkeys pulled a lever to get cocaine until they died - while neglecting to mention that the animals were in solitary confinement with nothing else to do. When that was not the case, they behaved very differently.

d) Conflating the issue of how a phenomenon arises, or what it means in the scheme of things, with the issue of whether the phenomenon exists at all. For example, CBT'ers would deny that resistance - the psychoanalytic idea that people are often highly invested in their psychological symptoms and resist change - is a real phenomenon, all the while failing to report in their case studies the high level of non-compliance with CBT homework assignments by their patients in treatment.

e) Grossly exaggerating the strength of certain research findings while completely ignoring the study's weaknesses and problematic assumptions.

f) Conflating another scientist's conclusion about the significance of a clinical anecdote with the description of the anecdote, and not considering what else the anecdote might mean.

g) Scientism: the idea that randomized placebo-controlled studies are the be-all and end-all of science, and that everything else is just anecdotal and not science at all. I answer those who make this argument by asking for volunteers for a randomized placebo-controlled study of whether parachutes reduce the incidence of deaths and injuries during falls from airplanes.

I also point out that scientism creates a problem for certain questions: it is only a slight exaggeration to say that, in order to study an important psychological phenomenon like self-deception with a large enough sample and within a reasonable time frame, you would pretty much have to ask people their opinion of themselves. That rather defeats the purpose of the study, doesn't it? So does this mean that studying self-deception should be completely off limits to scientists?


Scientists will often accuse other scientists of doing these things while doing them themselves. This is projection - another defense mechanism. These mental mechanisms are so pervasive in human beings that we are quite likely to find at least some of them in any scientific discussion. 

Discerning readers will no doubt find examples of me doing some of these things in my own blog posts.

1 comment:

  1. I worked at a once famous (actually, it still is) company that completely imploded into bankruptcy during the 90s and 00s tech boom. Everything you describe here, I can pick out an example of. Employees who stayed with the company rode it all the way down. It was the exception for individuals to honestly think there was something wrong.