Jul. 11th, 2010

edschweppe: (vote at your own risk)
There's an interesting article in today's Boston Globe about how people respond to news stories that happen to confirm their preexisting notions - and, more importantly, fail to respond to stories that tend to disprove them:

In the end, truth will out. Won’t it?

Maybe not. Recently, a few political scientists have begun to discover a human tendency deeply discouraging to anyone with faith in the power of information. It’s this: Facts don’t necessarily have the power to change our minds. In fact, quite the opposite. In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs. Facts, they found, were not curing misinformation. Like an underpowered antibiotic, facts could actually make misinformation even stronger.
(emphasis in original)

It's probably ironic that one of my first reactions to this article was to wonder how I could check the writer's claims. On the other hand, it makes intuitive sense to me that someone holding a strong belief about X is very likely to grant great credence to stories that support that belief, and to view with great skepticism stories that refute it.

And it certainly explains the fruitlessness of much political discourse.
