Making Juries Better: Some Ideas from Neuroeconomics


Virginia Hughes over at the National Geographic's Only Human:

We Americans love jury trials, in which an accused person is judged by a group of peers from the community. Every citizen, when called, must sit on a jury. For anyone who finds this civic duty a painful chore: Go watch12 Angry Men, A Few Good Men, or any episode of Law & Order. You’ll feel all warm and fuzzy with the knowledge that, though juries don’t always make the right call, they’re our best hope for carrying out justice.

But…what if they aren’t? Juries are made of people. And people, as psychologists and social scientists have reported for decades, come into a decision with pre-existing biases. We tend to weigh evidence that confirms our bias more heavily than evidence that contradicts it.

Here’s a hypothetical (and pretty callous) example, which I plucked from one of those psych studies. Consider an elementary school teacher who is trying to suss out which of two new students, Mary and Bob, is smarter. The teacher may think of them as equally smart, at first. Then Mary gets a perfect score on a vocabulary quiz, say, leading the teacher to hypothesize that Mary is smarter. Sometime after that, Mary says something mildly clever. Objectively, that one utterance shouldn’t say much about Mary’s intelligence. But because of the earlier evidence from the quiz, the teacher is primed to see this new event in a more impressive light, bolstering the emerging theory that Mary is smarter than Bob. This goes on and on, until the teacher firmly believes in Mary’s genius.

Even more concerning than confirmation bias itself is the fact that the more bias we have, the more confident we are in our decision.

All of that research means, ironically, that if you start with a group of individuals who have differing beliefs, and present them all with the same evidence, they’re more likely to diverge, rather than converge, on a decision.
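That divergence effect can be sketched in a toy simulation. Nothing below comes from the research Hughes describes; the update rule, the weights, and the "jurors" are invented for illustration. The one assumption it encodes is the bias described above: evidence that confirms your current leaning gets full weight, evidence that contradicts it gets a fraction of that weight.

```python
def biased_update(belief, evidence, w_confirm=1.0, w_disconfirm=0.3):
    """Nudge a belief toward a piece of evidence, but down-weight
    evidence that contradicts the current leaning (a toy
    confirmation-bias rule; the weights are arbitrary)."""
    confirms = (evidence >= 0.5) == (belief >= 0.5)
    w = w_confirm if confirms else w_disconfirm
    return belief + w * 0.1 * (evidence - belief)

# Two hypothetical jurors with slightly different priors about guilt
# (0 = certainly innocent, 1 = certainly guilty).
juror_a, juror_b = 0.4, 0.6

# A perfectly balanced evidence stream: alternating strong evidence of
# guilt (0.9) and innocence (0.1), shown identically to both jurors.
evidence_stream = [0.9, 0.1] * 100

for e in evidence_stream:
    juror_a = biased_update(juror_a, e)
    juror_b = biased_update(juror_b, e)

print(round(juror_a, 2), round(juror_b, 2))  # → 0.27 0.71
```

Both jurors see exactly the same evidence, yet the juror who started mildly skeptical ends far more skeptical and the one who started mildly convinced ends far more convinced: the same mechanism pulls them apart rather than together.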

More here.