This interesting and highly
recommended book about the spread and persistence of false beliefs covers a
subject that is near and dear to my heart and one that I write about
extensively. Most of the book does not dwell on the politically charged false
beliefs that seem to infest our Facebook feeds, although the authors
allude to them; it focuses instead on how false beliefs about science arise
and then spread among scientists and the general public alike.
Of particular
interest to me is Chapter 3, "The
Evangelization of People," which focuses on the way large commercial
interests have studied the behavior of scientists and have learned how to manipulate
them effectively for profit.
In recent years I have developed a
strong interest in "groupthink" – the type of thinking in which people ignore
the evidence of their own eyes, ears, and logic in order to argue for something
that either justifies group norms or advances the political, ideological, or
financial interests of a group or its leaders.
My
curiosity originally stemmed from my observation that self-destructive people
cannot possibly be hurting themselves out of "selfish" motives if they are not
mad, bad, or stupid. Instead, they are finding ways to sacrifice themselves for
the apparent good of their families.
The latter idea is not accepted by the
vast majority of practitioners and academics in the mental health and
evolutionary biology fields, where a wide variety of only partly logical
theories have often taken root. My frustration in promoting my own idea led
me to become interested in the phenomenon of groupthink in science, which turned
out to be a far greater problem than I had originally thought. In fact, I am
currently co-editing a multi-authored book about groupthink in science, in which
practitioners from widely diverse academic disciplines examine quite a few
different manifestations of it.
While groupthink in families and groupthink in
professional groups have a lot in common, there is one fundamental difference.
Scientists, while they may belong to some sort of professional fraternity, are usually
not related biologically. Nor are they usually all members of the
same ethnic group or even the same nation. It is the biological connection
that makes the family system so powerful over its members that they are
willing to sacrifice themselves for it. Scientists are not under that sort of
pressure.
The book starts with a
discussion of how ideas spread among scientists who are trying
to find truths but who are most often complete strangers to one another. Most
scientists communicate directly with just a few colleagues and find out what
the others are thinking by reading their articles in professional journals.
Only occasionally do they actually hear these others in person, usually at
professional conventions.
Although scientists can often be more easily swayed
by colleagues they know personally than by those they do not, a scientist's overall
reputation in a particular field also plays a big part. The book examines the
behavior of people in "communication networks" like these, and it is this very
understanding that commercial interests are now exploiting, sometimes for
clearly nefarious purposes.
What is amazing is that, although
companies can hire fraudulent scientists to deliberately produce studies that skew
the data in industry's favor, they do not have to tamper with the science that
way at all. In fact, they can skew the picture without any direct input into any
scientist's work. One of the biggest ways is through selective reporting of real
data: they push the studies that work in their favor with politicians and the
public while not mentioning the contradictory studies, even when the
contradictory studies vastly outnumber the ones in their favor.
An ingenious way to accomplish this is
through grant funding. In most cases, science produces a
variety of studies with differing conclusions. Industry can pick those
scientists whose work runs in its favor and provide them with grants and other
funding. Independent funding, such as government grants, is relatively rare and
hard to come by, and science can be very expensive to do. Therefore,
industry-funded studies are more likely to be completed and then published than
those funded elsewhere, making their conclusions appear more general than they
actually are.
Then there is a
snowball effect in which success breeds success: the more studies these
scientists publish, the more likely they are to attract students who then
follow in their biased footsteps.
Scientists who later learn they have
been duped may be too embarrassed to publicly admit it.
An interesting side issue the authors bring
up concerns what does and does not get published. Different fields have
different thresholds for how much data a given study must contain in order to
meet their prevailing standard for publication. It is disconcerting but hardly
surprising that the authors single out neuroscience and psychology for having much
lower standards than other fields. Readers who follow my blog will have many
examples of this from which to choose.
This topic seems related to the persuasion topic that Scott Adams (of Dilbert) has been discussing regarding (fake) news, TDS, and climate science.
I am pretty sure Scott would be quite interested in your ideas on this subject, which could bring a wider audience to your relevant and important work.
Another individual who I think may be interested is Dr. Gad Saad, an evolutionary behavioral scientist at the John Molson School of Business at Concordia University.
Finally, I have wondered whether anyone is looking at this topic in relation to religious or political cults (ISIS, Scientology, AntiFa, or White Nationalist groups). Can the same techniques used in treating dysfunctional families work with cult members? Are these just larger, more dangerous, dysfunctional family groups?
Hi Holly,
Thanks for the info.
I haven't researched the social psychology literature on cults, so I don't know what's out there, but I'd bet there is some research into this.
Cult members would have to come in for treatment for systems therapy to work, which would be highly unlikely (Scientology!). More importantly, I believe that an individual not born into a cult would be vulnerable to joining one - a highly self-destructive behavior - only because doing so stabilizes the person's parents in some way, despite the parents' protestations to the contrary (actions speak louder than words).
Just to be clear, my idea about family dynamics and cult-joining is not based on any data - since, as I said, cult members generally don't come in for therapy - but is a prediction based on my theoretical perspective.
Interesting thought regarding the stabilization of the parental figures. I would've thought it was more a transference from a dysfunctional or controlling parental figure. But in reality there are probably innumerable reasons that drive cult participation.
On a different note, and slightly off topic: I just read an article, "Faking It on Climate Change," by Bjørn Lomborg. It is probably a classic example of the misinformation described by O'Connor and Weatherall.
I view transference (somewhat differently than the analysts do) as the effect of the dysfunctional role adult children take on for their parents (their persona, or false self) on their relationships with other intimates. Projective identification then induces others (especially lovers and therapists) to help them maintain their false self (enabling).