Scientific Conspiracy Theories: A Veneer for Irrational Beliefs
This essay was originally written for Q Ideas, as part of a series connected to the new book Veneer: Living Deeply in a Surface Society by Tim Willard and Jason Locy. They asked me to consider how the idea of “veneer” played out in the world of science.
My grandfather, a North Florida farmer, never believed that men had landed on the moon. He considered it an outrageous claim, plainly contradicted by common sense, and suspected that the whole project was an elaborate fiction to raise federal taxes. He held these beliefs privately for the most part, and never had the Internet to connect him with other like-minded people. He held to a conspiracy theory, but it was fairly benign (believing it didn’t harm anyone), and it was not fed by an echo chamber of paranoid websites. He didn’t try to justify his belief with spurious evidence or distorted science. He knew what he knew.
In recent times, conspiracy theories have grown less benign, more prone to amplification, and more prone to take on the veneer of scientific respectability. Two cases illustrate the point:
CASE 1: Claims about the link between childhood vaccines and autism have circulated for some time, yet scientists have been unable to detect such a connection. In January of this year, a controversial study that had claimed to find a tentative link between childhood vaccines and autism, and which had given rise to conspiracy theories about the medical establishment, was shown to be the result of outright fraud and falsified data. Vaccines were (again) shown to be safe enough to warrant widespread use in immunizing children against childhood diseases, and highly-publicized claims that components of those vaccines cause autism in young children were conclusively debunked.
The discovery of fraud did not quell the fear-mongering of activist groups of parents, many of whom still refuse to get their youngsters vaccinated. That decision puts their own children, and other children, at increased risk of death from preventable diseases.
CASE 2: At the end of 1996, scientists were so certain that HIV caused the condition called AIDS that they began giving patients anti-retroviral therapy (ART) intended to keep the virus from replicating. The result came to be known as the “Lazarus effect”, as AIDS patients at death’s door recovered and returned to their jobs.
That didn’t convince South African President Thabo Mbeki, who refused to believe in the HIV/AIDS connection (he believed the science to reflect poorly on African morality and values). In the year 2000, his government invited dissenting scientists to sit on important government health panels. Those panels recommended against a large-scale national anti-AIDS campaign, despite an international scientific consensus that it would save lives. Recently, a study in the Journal of Acquired Immune Deficiency Syndromes claimed that at least 365,000 extra South African deaths can be blamed on the fallacious viewpoint Mbeki adopted.
It's classic conspiracy theory stuff. An article in the New Scientist describes what denialist movements have in common:
All set themselves up as courageous underdogs fighting a corrupt elite engaged in a conspiracy to suppress the truth or foist a malicious lie on ordinary people. This conspiracy is usually claimed to be promoting a sinister agenda: the nanny state, takeover of the world economy, government power over individuals, financial gain, atheism.
Neither the political right nor the left is free from conspiracy theories. On the right, for example, “birthers” (still) claim that Barack Obama’s election was illegitimate because of a fraudulent birth certificate. On the left, conspiracy theories usually involve evil corporations colluding to take over the world economy, market unscrupulous products, kill children with high-fructose corn syrup, or dump pollution on us all, or claims that religious elites are trying to institute an American theocracy. And “Truthers” believe the 9/11 terrorist attacks were masterminded by George W. Bush or a close associate.
Jigsaw Puzzles and Card Houses
Denialism (a word first used in connection with conspiracy theorists who tried to cast doubt on the historicity of the Nazi genocide of World War II) tries to veneer over its irrationality with a paradoxical appeal to science, but without doing the hard work of convincing scientists of an argument. Conspiracy theorists and denialists short-cut the scientific process by relying on anecdote, by cherry-picking the small number of contrarian scientists and dissenting scientific articles, or by creating their own “science.” They claim that a handful of doubters or a small number of published papers with contrary results undermine the validity of conclusions most scientists would agree with.
So HIV/AIDS denialists, like the vaccine alarmists, trumpet the work of a handful of dissenters, many of whom did real research in the past but whose recent, more ideological work fails to get published because it can’t pass peer review. They see this inability to get published as a sign of persecution and lockout orchestrated by the establishment, rather than a reflection of the quality of their work. They might even accuse the editors of “groupthink” for failing to recognize the brilliance of the dissenters.
Conspiracy theorists look at science as a post-modern exercise of power, instead of as society’s best-faith effort to find coherent explanations for natural observations. The Economist newspaper puts it this way:
In any complex scientific picture of the world there will be gaps, misperceptions and mistakes. Whether your impression is dominated by the whole or the holes will depend on your attitude to the project at hand. You might say that some see a jigsaw where others see a house of cards. Jigsaw types have in mind an overall picture and are open to bits being taken out, moved around or abandoned should they not fit. Those who see houses of cards think that if any piece is removed, the whole lot falls down. …[A]cademic scientists are jigsaw types, dissenters from their view house-of-cards-ists.
Nothing is more frustrating for a credentialed scientist than to present their research to a general non-academic audience, and then to find themselves facing off during the Q-and-A with a blogger who says, "I've done a lot of research on the Internet about this question, and I think your science is a house of cards."
I’m not arguing that an appeal to scientific credentials should resolve debates (it shouldn’t). I’m saying that there are deep and shallow ways to answer questions with science.
To do scientific research is to study the world deeply, to understand the history of and relationships among scientific ideas, to develop questions and (crucially) to design experiments to uncover answers that satisfy not only yourself but a community of skeptical peers. To do science that is believable (or “credible”) one must generally have spent years in formal training and in the workplace demonstrating competence in gathering, analyzing, and interpreting data.
Scientific research is not “looking things up on the Internet,” considering "both sides" of a controversy, and then “forming an opinion.” Science is done by a community which trusts that open communication, decentralized testing of ideas, and relentless questioning will lead us to understand the world more accurately. Science makes progress on the strength of conflict and argument, not by seeking out compatible views.
Science involves a great deal of humility, because most new ideas, even clever ones, turn out to be wrong. Science is predisposed to disbelief. Just because someone says they have a new idea, or they've got a new result from an experiment, does not mean their peers believe them. Even so-called "peer reviewed" research must stand the test of time to be credible. The original study on vaccines and autism cited above (now known to be fraudulent) got through peer review, but it did not withstand subsequent challenges.
The paradoxical thing about non-scientists who profess "skepticism" about whether HIV causes AIDS or about the safety of childhood vaccines is that they are so staggeringly unskeptical about the claims of people who agree with them. They are willing to believe that almost all the experts are being duped. It’s faux-skepticism.
In a society where science is highly respected, denialism and conspiracy theories of all kinds are attempts to get power for nothing. Without investing in the hard work of advancing credible, persuasive arguments, denialists of every stripe tend to use the rhetoric of science to convince non-experts of the validity of ideas that can't hold their own in the truly skeptical world of science. When we fail to take the scientific enterprise seriously, or when we misappropriate its language and (limited) authority for our own pet causes, we are covering our beliefs in veneer.
Rusty Pritchard is the CEO of Flourish, a ministry that equips Christians to engage the world of environmental science and action. He holds a Ph.D. in natural resource economics and a masters degree in systems ecology.