Sociological Factors in Science

By Steven Benner

This post is the third entry in a series by Steven Benner. The first two posts can be found here and here. Throughout the series, Benner discusses the nature of scientific progress and the difficulty of defining what is and is not science.

Last time we saw that ad hoc propositions are undesirable but inevitable byproducts of how science operates. This is largely because science is a fully human enterprise.

Science is set within a culture. Culture, defined broadly, is a collection of generally accepted models describing reality (Thomas Kuhn used the word "paradigm"; others have called it a "received view"). Paradigms are so well accepted that members of a scientific community may not even think about them explicitly. Of course, a profound part of the culture underlying science is the assumption that something like a "reality" exists. Other examples are more specific to time and field, such as the notion that our Sun can be modeled as a large lump of coal (see my last post).

Central to the exercise is the recognition that one or more of these paradigms might be wrong. Indeed, individual scientists hope that they might discover that some proposition within the accepted culture is wrong, and become famous for their work that replaces it.

Unfortunately for those hopes, only a subset of the received view is in fact incorrect, and most of that subset is not "ripe" for discovery. Thus, Einstein's general relativity is more correct than Newton's view of how the solar system works, but it would not have been timely to point this out in 1860; the community was not prepared to discover relativity then and would not have been able to accept it had it been presented. To paraphrase Clarke, any insufficiently advanced science is indistinguishable from lunacy.

For this reason, practicing scientists who wish to advance their general theoretical framework need to identify parts of the received view that are wrong, but only if that "wrongness" is ready to be found. How do they go about doing this? Generally, scientists start by making observations of a kind that has not been made before. In the case of Francis Collins, one might begin by sequencing the human genome. This is observation on the grand scale, building a model that places over 100 billion atoms in the molecules that support human genetics.

Scientists may then look within these observations for something puzzling, something that is not obviously what their paradigms lead them to expect, something that needs explanation. Not, for example, that the Sun rose this morning in the East; expected things do not demand explanations. But something that is not, at first glance, as it should be. Explanations are demanded only when things are not as they should be.

Scientists will then attempt to account for the puzzling observation using paradigms within their received view. More often than not, things work out. The puzzle can be resolved. The scientists publish a paper, get promoted within the academy or industry, and move on to the next puzzle.

This exercise, called "normal science" by Thomas Kuhn, sometimes fails. Sometimes, the puzzle cannot be solved by applying the theories and models that are accepted in the culture. This could be, of course, a sign that the puzzling observation was incorrectly made; perhaps the instrument used to make it was broken. Or the scientist might simply not know enough about currently accepted theory to solve the puzzle. Or the failure could indicate that the received view contains an incorrect element.

These three possibilities have different prescriptions. If the observation is wrong, we should repeat the observation; if we do not have enough funding to do the experiment again, we must abandon the puzzle. If we do not know enough physics to solve a problem, we should learn more physics, or perhaps hand the puzzle over to someone else who already knows the physics. But if the underlying theory is wrong, we should try to construct an alternative theory, something new.

In these remarks, we find ourselves knee-deep in the sociology of science. What scientists actually do depends on what grants they have, what their university dean is telling them, what is going on at home, or any of many other factors that have nothing to do with the science itself. The most common outcome may be to abandon the puzzle and find another that is easier to solve using extant theory. This sociology accounts in part for how rarely scientists actually advance theory. Historically, observations that demand the rejection of a paradigm have often been known long before some scientist picks up the challenge and actually rejects it.

But let us follow the thread that has a scientist picking up that challenge and actually proposing a new paradigm, one that rejects a paradigm that is part of the culture. At this point, the dynamics change.

First, the scientist who introduced the theoretical innovation becomes interested in seeing that innovation accepted. Those in the community who did not introduce the innovation do not share this interest. On the contrary, many are interested in opposing the innovation, perhaps because they themselves introduced the soon-to-be-rejected previous paradigm. Others recognize that they might make their career opposing the innovation: if they succeed, they might be viewed favorably by their peers as "giant killers".

Sociological dangers lie everywhere in this new dynamic. As noted in earlier posts, scientists have control over the data that they consider, but also over the data that they do not consider. Scientists control what experiments they do, but also what experiments they do not do. Scientists decide when to introduce ad hoc hypotheses to explain away apparently negative results, and when not to. This leads to what we might call the first law of argumentation:

If you control the data to accept and reject from experiments that you can choose to run or not, and if you can apply ad hoc propositions as you wish, you can argue yourself into believing just about anything you want.

And scientists who introduce innovative paradigms generally want to believe them, while the scientists who oppose them often want not to believe them.

This has parallels in the law, advertising, and politics. A lawyer able to freely select the facts can generally convince you that any of his clients is innocent. The salesman in full charge of his message can persuade you to buy just about any product. Any politician in decent command of his rhetorical craft can pick and choose among things known to you to persuade you to vote for him. And any preacher who is allowed to pick and choose can justify any view of the world, and explain away any apparent contradictions.

What makes science different is that it embodies a mechanism to manage this process. Science is an intellectual activity whose process occasionally forces scientists to come to believe something other than what they set out to believe, or want to believe.

Here we have considered some sociological factors that can create bias within science; next time we will see how the scientific and community processes can mitigate this bias, and explore the tension between accepting and challenging the reigning paradigm.


About the Author

Steven Benner

Steven Benner is a Distinguished Fellow of the Foundation for Applied Molecular Evolution in Gainesville, FL. He received his doctorate in chemistry from Harvard University. Benner and his group of researchers initiated synthetic biology as a field and invented dynamic combinatorial chemistry, which is currently being used in pharmaceutical development.