Editorial in Nature:
The human brain’s habit of finding what it wants to find is a key problem for research. Establishing robust methods to avoid such bias will make results more reproducible.
The sources and types of such cognitive bias — and the fallacies they produce — are becoming more widely appreciated. Some of the problems are as old as science itself, and some are new: the IKEA effect, for example, describes a cognitive bias among consumers who place artificially high value on products that they have built themselves. Another common fallacy in research is the Texas sharpshooter effect — firing off a few rounds and then drawing a bull's eye around the bullet holes. And then there is asymmetrical attention: carefully debugging analyses and debunking data that counter a favoured hypothesis, while letting evidence in favour of the hypothesis slide by unexamined.

Such fallacies sound obvious and easy to avoid, and it is tempting to think that they affect only other people. In fact, they fall naturally into investigators' blind spots (see page 182).

Advocates of robust science have repeatedly warned against cognitive habits that can lead to error. Although such awareness is essential, it is insufficient. The scientific community needs concrete guidance on how to manage its all-too-human biases and avoid the errors they cause.