Some people hold beliefs that are false according to the most rigorous, current scientific wisdom. Take, for example, anti-vaccine types’ beliefs about the risks of vaccines (e.g. autism risk).
What’s funny is that at a societal level, we find ourselves in seemingly intractable debates over these beliefs, as if they were moral issues. It’s funny because, from a materially rational perspective, the beliefs should at some point be self-correcting: acting on a false belief is costly, and those costs should feed back to the believer.
What are some explanations? One possibility is externalities. As we are all now acutely aware, in democratic systems these false beliefs can aggregate into policies that threaten even those who hold the correct beliefs. But it works the other way around too: so long as the false beliefs are held by an electoral minority, the electoral majority protects that minority from the consequences of its foolishness.
Externalities are sometimes more intrinsic to the issue at hand. With vaccines, current scientific wisdom holds that autism risks are negligible whereas benefits in terms of protection from other diseases are substantial. This particular case is also complicated by “herd immunity,” in that you are affected by your neighbor’s vaccination decision. If the fools reside next to sophisticates, then, again, the fools are protected.
Or, it may be that the costs are borne by future generations, and so do not feed back directly to those taking the consequential decisions. Climate change has this feature: it’s the fools’ children or grandchildren who will suffer. (Although, probably, the more immediate problem is that it is the children of others, in faraway places, who will suffer from the fools’ lack of concern.)
Generally, the externalities explanation relies on the standard public goods logic: investing in learning about rigorous scientific findings is a public good, since the benefits of one’s accurate beliefs spill over to others, in which case we should expect widespread underinvestment.
The externalities case is not so tight, though. It seems the US hosts an electoral majority of climate change deniers, although one could attribute this to the intergenerational and interregional externalities. But for the more intrinsic, “herd immunity” kinds of issues, the story is not so clear either. For example, my understanding is that anti-vaccine types tend to cluster in their social interactions (home-schooling and whatnot), and therefore one’s own and one’s neighbors’ vaccination decisions will tend to be highly correlated.
To fill the gaps in the externalities story, here are some other things we might consider:
- Intrinsic complexity: only advanced scientific methods can penetrate these issues, and we cannot appeal to direct experience, so beliefs depend on some degree of faith.
- Vested immediate interests opposed to the truth, held by actors who seek to manipulate the situation.
- Coordination and team-signaling dynamics, similar to those I discussed with regard to “lies, dupes, and shit tests” (.htm).