Wednesday, February 20, 2019

Book Review: The Misinformation Age by Cailin O’Connor and James Owen Weatherall

This interesting and highly recommended book about the spread and persistence of false beliefs covers a subject near and dear to my heart and one I write about extensively. Most of the book does not go into the politically charged false beliefs that seem to infest our Facebook feeds, although the authors allude to them, but focuses instead on how false beliefs about science come about and then spread among scientists and the general public alike. 

Of particular interest to me is Chapter 3, The Evangelization of Peoples, which focuses on the way large commercial interests have studied the behavior of scientists and have learned how to manipulate them effectively for profit.

In recent years I have developed a strong interest in “Groupthink” – the type of thinking in which people ignore the evidence of their own eyes, ears, and logic in order to argue for something that either justifies group norms or leads to an enhancement of a group’s political, ideological, or financial interests or the interests of group leaders. 

My curiosity originally stemmed from my observation that self-destructive people, assuming they are not mad, bad, or stupid, cannot possibly be hurting themselves out of “selfish” motives. Instead, they are finding ways to sacrifice themselves for the seeming good of their families.

The latter idea is not accepted by the vast majority of practitioners and academics in the mental health and biological evolution fields, where a wide variety of only partly logical theories have often taken root. My frustration in promoting my own idea led me to become interested in the phenomenon of groupthink in science, which turned out to be a far greater problem than I had originally thought. In fact, I am currently co-editing a multi-authored book about groupthink in science, in which practitioners from widely diverse academic disciplines examine quite a few of its different manifestations.

While groupthink in families and groupthink in professional groups have a lot in common, there is one fundamental difference. Scientists, while they may belong to some sort of professional fraternity, are usually not related biologically. Nor are they usually all members of the same ethnic group or even the same nation. It is the biological connection that makes the family system so powerful over its members that they are willing to sacrifice themselves. Scientists are not under that sort of pressure.

The book basically starts with a discussion of how ideas generally spread among those scientists who are trying to find truths but who are most often complete strangers to one another. Most scientists communicate directly with just a few colleagues and find out what the others are thinking by reading their articles in professional journals. They only occasionally actually hear these others in person, usually at professional conventions. 

Although scientists can often be more easily swayed by colleagues they know personally than by those they do not, a scientist’s overall reputation in a particular field also plays a big part. The book examines the behavior of people in “communication networks” like these, and it is exactly this understanding that commercial interests have put to use, sometimes for clearly nefarious purposes.
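The kind of network model the authors describe can be sketched as a toy simulation (my own illustration with made-up parameters, not the book’s actual model): scientists sit on a sparse communication network, only those who currently believe in a new method bother to test it, and everyone updates their credence using only the results visible from their immediate neighbors.

```python
import math
import random

def credence_from_logodds(lo):
    """Numerically safe conversion of log-odds to a probability."""
    if lo >= 0:
        return 1 / (1 + math.exp(-lo))
    return math.exp(lo) / (1 + math.exp(lo))

def simulate(n_agents=10, n_trials=50, n_rounds=20, p_true=0.6, seed=0):
    """Toy network-epistemology sketch. Agents on a ring weigh two
    hypotheses about a new method (success rate 0.6 vs 0.4 against a
    0.5 baseline); the 0.6 hypothesis happens to be true. Only agents
    who currently favor the new method test it, and each agent updates
    on results produced by itself and its two ring neighbors."""
    rng = random.Random(seed)
    logodds = [rng.uniform(-2, 2) for _ in range(n_agents)]  # initial credences
    llr_unit = math.log(0.6 / 0.4)  # evidence carried by one (success - failure)
    for _ in range(n_rounds):
        results = []  # (agent index, successes out of n_trials)
        for i, lo in enumerate(logodds):
            if lo > 0:  # only current believers run the experiment
                k = sum(rng.random() < p_true for _ in range(n_trials))
                results.append((i, k))
        for i in range(n_agents):
            visible = {i, (i - 1) % n_agents, (i + 1) % n_agents}
            for j, k in results:
                if j in visible:
                    # Bayesian update: add the data's log-likelihood ratio
                    logodds[i] += (2 * k - n_trials) * llr_unit
    return [credence_from_logodds(lo) for lo in logodds]

final = simulate()
```

Runs of this sketch show why network structure matters: an agent whose neighbors all happen to start out skeptical sees no data at all and stays stuck, so what the community ends up believing depends on who talks to whom, not just on individual rationality.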

What’s amazing is that, although companies can hire fraudulent scientists to consciously run studies that skew the data in industry’s favor, they do not have to tamper with the science that way at all. In fact, they can skew the overall picture without any direct input into any scientist’s work. One of the big ways is through selective reporting of real data – they promote the studies that work in their favor to politicians and the public while not mentioning the contradictory studies, even when the contradictory studies vastly outnumber the favorable ones.

An ingenious way to accomplish this is through grant funding. In most cases there will be a variety of studies with different conclusions. Industry can pick those scientists whose work is in its favor and provide them with grants and other funding. Independent grant funding, such as from governments, is relatively rare and hard to come by, and science can be very expensive to do. Industry-funded studies therefore become more likely to be completed and then published than those funded elsewhere, making their conclusions appear more representative of the evidence than they actually are. 
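The selective-reporting mechanism can be illustrated with a toy simulation (my own sketch with invented numbers, not the authors’): every lab runs an honest study of a treatment whose true effect is zero, but only the studies whose results happen to look favorable get funded through to publication, so the published record shows an effect that no individual study fabricated.

```python
import random
import statistics

def selective_publication(n_labs=2000, seed=1):
    """Every lab honestly estimates the effect of a treatment whose
    true effect is zero, so each estimate is pure sampling noise.
    A sponsor funds (and hence publishes) only positive-looking
    results; nobody ever fakes a number."""
    rng = random.Random(seed)
    all_effects = [rng.gauss(0.0, 1.0) for _ in range(n_labs)]
    published = [e for e in all_effects if e > 0]  # the sponsor's filter
    return statistics.mean(all_effects), statistics.mean(published)

full_mean, published_mean = selective_publication()
```

With these numbers the full set of studies averages out to roughly zero, while the published subset averages around 0.8 standard deviations of pure noise – the apparent effect comes entirely from the filter, not from any dishonest data.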

Then there is a snowball effect in which success breeds success. The more studies the funded scientists publish, the more likely they are to attract students who then follow in their biased footsteps.

Scientists who later learn they have been duped may be too embarrassed to publicly admit it.

An interesting side issue the authors bring up concerns what does and does not get published. Different fields have different thresholds for how much data a given study must contain in order to meet their prevailing standard for publication. It is disconcerting but hardly surprising that the authors single out neuroscience and psychology for having much lower standards than other fields. Readers who follow my blog will have many examples of this from which to choose.