July 18, 2011
by Jonathan Halvorson
It’s a rusty but sturdy old truism that morality binds societies together. As shared standards of moral praise and blame dissolve, a society increasingly experiences the mixed blessings of contention among fragmented sub-groups with divergent political goals and ways of living. The “culture wars” in America and other nations provide easy examples of the dynamic in action.
A flotilla of social sciences has by now devoted literally millions of hours to understanding how these social disagreements arise, explaining how they persist, and modeling how differences in belief about what is good can drive differences in belief about facts. Shared morality doesn’t just get tied up with a shared outlook on what is good, but with a shared outlook on what is.
And so the culture wars and other disagreements about morality (broadly construed) drive wars over the truth of global warming, whether homosexual households are harmful to children, whether deficit spending during a recession spurs economic growth, whether higher taxes on the wealthy hinder economic growth, whether social programs help the poor they are meant to serve, whether torture is a useful method for gathering intelligence, what the health effects of pollutants are, and on and on.
Your reaction is probably along the lines of: Yes, and what a shame. The facts are what they are whether or not we want to believe them. Truth is cold. We shouldn’t let our beliefs about good and bad influence in any deep way our beliefs about objective facts of the world, especially facts about the causes of things. But, in that same spirit of objectivity, the evidence is also clear: people hate cognitive dissonance and succumb to all kinds of irrational belief-generation mechanisms to remove recalcitrant facts from their line of vision. When push comes to shove, it’s the facts that get revised to fit the normative commitments more often than we would like to admit.
Causation, part of Hume’s “cement of the universe,” is at the heart of the mess created by our motivated ways of knowing. The moral cement of societies is mixed in with the causal cement of the universe. This is not just because of our human failings, but because of what we are asking causation to do for us. And it means that we can expect disagreements about why things happen for as long as we disagree about what actions are morally praiseworthy and blameworthy.
It can seem that objectivity is the whole point of our attributions of cause and effect, and yet, in legal contexts, it has long been understood that moral considerations are inseparable from causal judgments. There is simply no way to isolate an event or act as causing an injury in a legally relevant way without attributing responsibility to one out of many contributing causal factors that are equally necessary for the outcome. A car accident has uncountably many specific causal components (visibility, proximity of other vehicles, road conditions, gravitational forces, momenta of the vehicles, etc.), but they get relegated to the background and it is only the driver’s failure to obey the speed limit (or whatever is judged responsible) that gets counted as the cause.
Outside the law, recent experimental work on causal judgments has reinforced that ordinary causal attributions are often influenced not just by empirical factual considerations, but also by broadly moral considerations.
With the exception of physical science, causation, it turns out, is typically about responsibility. It’s about assigning blame for something bad, or credit for something good. Still, all this messiness could be contained relatively neatly if we could treat it all as simply a disagreement over which (factual) causal condition out of many gets dignified with the title “cause.” Then we could say that the causal conditions should be settled by standard scientific methods, and disagreement shows that someone is ignorant or irrational, while disagreement over a “cause” or limited set of “causes” is just a disagreement over which of the conditions is most salient to each person or society.
But it isn’t that simple. Realistically, we rarely have these neat situations in which we agree on all the causal conditions but not which one or ones to call the cause. Disagreements over the causes of poverty are usually also disagreements over some of the causally relevant conditions of poverty. More important, in circumstances related to human affairs (what people do to other people), we have no knowledge of a causal fact of the matter by the standards of natural science to use even if we wanted to. From a scientific perspective, all we have are shifting statistical relationships, often with rather weak partial correlations between the proposed cause and effect. We do not have observed invariant relationships or laws with measured constants to employ as we do in physics, chemistry and engineering. (more on that here and here)
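The “weak partial correlations” mentioned above can be made concrete with a small numerical sketch (a hypothetical toy example of my own, not data from any study): when a hidden common factor drives both a proposed cause and its effect, the raw correlation between them can look substantial, while the partial correlation, computed after controlling for that factor, nearly vanishes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: z is a confounder that drives both x and y.
n = 1000
z = rng.normal(size=n)
x = z + rng.normal(size=n)
y = z + rng.normal(size=n)

def partial_corr(x, y, z):
    """Correlation between x and y after removing the linear effect of z."""
    # Regress x and y on z (with an intercept) and correlate the residuals.
    Z = np.column_stack([np.ones_like(z), z])
    rx = x - Z @ np.linalg.lstsq(Z, x, rcond=None)[0]
    ry = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]
    return np.corrcoef(rx, ry)[0, 1]

raw = np.corrcoef(x, y)[0, 1]    # sizeable: x and y move together
partial = partial_corr(x, y, z)  # near zero once z is controlled for
```

The point of the sketch is only this: which correlation counts as “the” causal signal depends on which background conditions we choose to hold fixed, and in human affairs that choice is rarely settled by the data alone.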
So into the gap between theory, evidence of partial correlations and our practical desire to assign causes, fly our biases and our expectations for appropriate behavior. And they must do so, or else we would have to stop making causal judgments about human affairs altogether.
Posted by Jonathan Halvorson at 12:35 AM