We live in a golden era of science literature written for consumption by the general public. The interested reader can find clear, accessible, and engaging accounts of current research and theories in everything from basic physics to psychology and sociology. This is a great public good, for its intrinsic value to inquiring minds no less than for its importance to informed political and social decision-making. However, as every scientist knows, ‘popularizing’ carries risks. Weaving a compelling narrative is vital to attracting a broad readership. And as for what the press does when it comes to reporting findings—well, the less said, the better.
One topic of vital human significance is increasingly a focus of popular scientific narrative: the sources of human choice and action and their implications for moral responsibility. A narrative that has begun to take hold—though there are contrary voices—is that scientific studies of the springs of human action show that our ordinary understanding of ourselves as free and morally responsible persons is largely or wholly mistaken, a “myth.” On that ordinary understanding, we freely make choices with recognized purposes, while being aware of alternatives open to us. Given that we also generally recognize the moral significance of those choices, we bear moral responsibility for those choices and the actions that flow from them, and properly hold one another to account as objects of either praise or blame. This picture is embedded in ordinary moral thought and is central to the Christian understanding of human persons. However, as the story some are now telling goes, this ordinary picture is just wrong. The story’s proponents argue that in fact human actions are driven by motivations of which we are mostly unaware and are initiated by unconscious processes. Far from making the final call, our conscious minds are but after-the-fact ‘news sites’ receiving reports from the real drivers of action occurring down below—and the reports are quite often misleading if not downright mistaken, at that. Philosopher Eddy Nahmias has aptly dubbed proponents of this view “willusionists.”
Now, to any reader of good nineteenth-century literature, or indeed the Bible, it will not come as news to be told that we don’t always fully understand our own motivations, or that we are sometimes pulled in multiple directions in ways that we either can’t or simply won’t consciously reflect upon. But for all that, we suppose, we do in the end make choices that initiate and guide our behavior, we generally know what it is we are doing, and we often know at least some of our major motivations for so doing. Our self-control is highly imperfect, but it is real. What sort of scientific evidence could threaten such a fundamental feature of our self-understanding?
The willusionist conclusion is drawn from studies that support the following more specific claims:
- We are susceptible to factors unconsciously causing us to revise or even confabulate beliefs about what we have previously done, or why. For example, we can be pressured into believing that we have done something that we didn’t do, and we seem to be prone to revising our beliefs about why we previously acted in a certain way in order to make it better fit with our own self-understanding.
- We are susceptible to environmental factors influencing, largely or wholly unconsciously, the choices we make. Studies have suggested that placing a hidden, powerful magnet directly behind one side of a person’s head will influence an arbitrary choice they make concerning which hand to move, and that noxious but harmless environmental stimuli such as a loud jackhammer or a foul odor will influence whether we stop to help someone in need on a street.
- There are striking pathological cases (alien/anarchic hand syndrome, severe schizophrenia) in which people engage in purposive behavior that they do not believe that they are controlling. For example, a person suffering from alien hand syndrome may reach in a smooth way for someone else’s drink, while registering conscious surprise or dismay at seeing this happen.
- Finally, neuroscientists have claimed to detect a kind of preparatory brain activity prior to the time at which a person perceives himself as consciously willing to act. The conclusion that Benjamin Libet, who pioneered these studies, and others have drawn is that your brain “decides” to act before you, the conscious person, does. What you experience as conscious choice is in reality an after-the-fact awareness of what has been decided unconsciously.
Now, there are criticisms of some of the studies whose results seem most striking. For example, some of the studies purporting to show profound environmental influences on some moral choices are based on small sample sizes, track average group results rather than individual variations (which may be large), and are subject to rather different interpretations. And the basic experimental design of the studies concerning brain activity prior to conscious awareness of choice has been criticized on the grounds that people’s judgments concerning the timing of conscious episodes generally are notoriously unreliable. But we may here set aside such unresolved issues. For, even taking the criticized studies along with the others at face value, the cumulative weight of the above findings provides nothing like a compelling case for willusionism. Let’s take them one by one.
- We are susceptible to factors unconsciously causing us to revise or even confabulate beliefs about what we have previously done, or why.
Our memories are imperfect records even of what we have experienced. That some of the imperfection owes to our unwitting tendency to silently revise that record is disconcerting. But this is simply not relevant to the question of whether we were in control and reasonably self-aware at the time of conscious choice/action itself. Flaws in our means of information retention are one thing; control over our conscious choices is quite another. This first ground for willusionism swings and misses.
- We are susceptible to environmental factors influencing, largely or wholly unconsciously, the choices we make.
There is no question that moral responsibility for the choices we make depends on our knowing what we are doing and why, at least to a significant degree. Notice that these two forms of knowledge are not all or nothing; they come in degrees. If it is true that some of the time we are not aware of major influences on our actions, that may suggest a correspondingly diminished degree of moral responsibility—assuming we bear no culpability for this ignorance. However, it would be absurd to generalize from this to the conclusion that we are generally clueless about our own behavior, and there is simply no data at all that supports such a sweeping conclusion. Furthermore, we may note that scientists must know what they are doing and why—to a significant extent—if the ideas that are based on the outputs of such actions are to be accepted as more or less well supported. To contend otherwise would be to saw off the branch on which such science sits.
- There are striking pathological cases in which people engage in purposive behavior they do not believe that they are controlling.
These cases establish that purposive action can issue from unconscious sources. They do not suggest that this is generally the case—that is why they are considered pathological. A person afflicted with alien hand syndrome may not consciously control the movement of one of his arms, but he is like the rest of us in other respects when it comes to choices and decisions.
- Neuroscientists have claimed to detect preparatory brain activity prior to the time at which a person perceives himself as consciously willing to act.
I do not have space to summarize the extensive discussion Libet’s studies have sparked. It will suffice to make two points. First, there is very good reason to believe that such generalized electrical activity in the brain prior to the subjects’ decision is not sufficient for the choices they make—the subject is capable of refraining from the choice even in the presence of such activity. Second, the cases involve uninteresting choices (‘freely choose to wiggle the index finger of your left or right hand’), done many times over in a single sitting, and require the subject to ‘observe’ himself in the act of choice. All of these features are unrepresentative of significant moral choices, and so generalizing from what may occur in the experimental setting—recall that there are serious criticisms of the experimental design, typically unreported in popular discussions—is plainly rash.
I believe it is fair to conclude that the supposed threat posed to human free will and moral responsibility from the very much work-in-progress social and cognitive human sciences is quite overblown. These sciences are beginning to teach us some things about the fragility and limits of our freedom of choice, and undoubtedly will teach us still more in the years ahead. But it is worth emphasizing that the reality of such limits is deeply consonant with the Christian view of human beings: neither environment-transcending demi-gods nor environment-driven robotic slaves, we are instead dependent and fallen yet responsible creatures.