"How Facts Backfire"
Jul. 11th, 2010 07:49 pm

There's an interesting article in today's Boston Globe about how people respond to news stories that happen to confirm their preexisting notions - and, more importantly, fail to respond to stories that tend to disprove them:
In the end, truth will out. Won't it? (emphasis in original)
Maybe not. Recently, a few political scientists have begun to discover a human tendency deeply discouraging to anyone with faith in the power of information. It’s this: Facts don’t necessarily have the power to change our minds. In fact, quite the opposite. In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs. Facts, they found, were not curing misinformation. Like an underpowered antibiotic, facts could actually make misinformation even stronger.
It's probably ironic that one of my first reactions to this article was to wonder how I could check the writer's claims. On the other hand, it makes intuitive sense to me that someone holding a strong belief about X is very likely to grant great credence to stories that support that belief, and to view with great skepticism stories that refute it.
And it certainly explains the fruitlessness of much political discourse.