August 22, 2011
Things You Cannot Believe
Early in the 20th century, the British philosopher G. E. Moore noticed that sentences of a certain form have a quite peculiar feature. Consider:
I believe it is Tuesday, but today is Monday.
Today is Monday, but I do not believe that.
I believe that today is Tuesday, but it’s not true that today is Tuesday.
These statements, when considered as first-personal assessments, instantiate what’s been called Moore’s Paradox. Taking ‘p’ as a variable standing for any well-formed declarative sentence, we can say that Moore’s Paradox is generated by any statement of the following form:
I believe that p, but not-p.
What is peculiar about statements of this kind is that although they may be true, you cannot believe them to be true in your own case. Although you may, indeed, be mistaken about what day today is, you cannot assess yourself as being mistaken about the day without undoing your belief about what day today is. When we assess one of our beliefs as false, we typically thereby dissolve the belief. Put otherwise, there are some truths that cannot be believed. That’s the paradox.
What are we to make of this? Philosophers have proposed various accounts of the significance of Moore’s Paradox. One clear implication is that beliefs are intrinsically truth-aiming. When one believes, one aims to believe what is true. This is why falsity is a decisive objection to a belief. When one finds oneself driven to affirm something that one regards as false, the language of belief no longer seems appropriate; one instead employs diagnostic terms, such as affliction, addiction, and delusion. We may say, then, that truth is the norm of belief.
The realization that belief is governed by the norm of truth leads us to wonder whether there are additional norms appropriate to belief. We suggest that the following statements have a quasi-Moorean flavor:
I believe that p, but my evidence has been rigged in order to favor p.
I believe that p, but my sources of information are highly censored by those who favor p.
I believe that p, but my evidence is unreliable and spotty.
I believe that p, but my evidence is consistent with not-p.
Or consider social versions of these sorts of thoughts:
I believe that p, but all critics of p have been intimidated into silence or otherwise marginalized.
I believe that p, but I always lose well-conducted arguments with reasonable critics of p.
We think that self-assessments like these show that there are, indeed, additional norms that govern belief. Beliefs aim at truth, and the way they aim at truth is by aiming to be responsive to reasons and evidence. To discover of a belief that it was formed under epistemically improper conditions is to see the belief as in some respect defective and in need of attention. In short, when one believes, one aims to be responsive to the best available reasons and evidence.
Some may want to object. They will say that examples of irresponsible belief are too easy to find; they will say that most people do not care one whit for reasons and evidence, but only for keeping their beliefs in place, come what may.
It is true that conflation, confabulation, deception, and misdirection are common responses to the discovery of a false belief. We are, it seems, epistemically conservative by nature. We do not like to have to change our minds, and so we do not like to reexamine and reevaluate our beliefs. But notice that these observations do not undermine our main contention; in fact, they provide further support for it. If it were not the case that beliefs aim at truth and responsiveness, there would be no need to deploy these tactics in light of countervailing evidence and criticism. If truth and responsiveness simply did not matter, people would not care about changing their minds, and they would not expend the effort they often do in discrediting, dismissing, and smearing their critics. The fact that people rationalize just shows that even when they fail to be rational, they nevertheless try to look like they are still succeeding.
The norms that govern belief have a surprising implication for political philosophy. In order to assess our beliefs as proper, and ourselves as cognitively responsible, we must be able to assess our cognitive environment as epistemically reliable. That is, we must be able to regard ourselves as functioning under social and political conditions which can be counted on to make available to us reliable information, sound data, and uncoerced expert analysis. We also must be able to assess ourselves as functioning under conditions in which dissent is protected, disagreement is permitted, and reasonable controversy is not squelched.
In short, responsible believing in the first-personal case requires a social epistemic system marked by norms of free inquiry, freedom of expression, freedom of conscience, and reasonable disagreement. In order to assess ourselves as being proper believers, we must be able to plausibly regard ourselves as citizens in an Open Society of the kind championed by J. S. Mill, Karl Popper, Bertrand Russell, John Dewey, and John Rawls. Proper epistemological practice entails democratic social norms. Is that so hard to believe?
Posted by Scott F. Aikin and Robert B. Talisse at 12:55 AM | Permalink