
Wednesday, July 21, 2010

FDL News Desk: The misinformed cannot be confused with the facts

By: David Dayen Monday July 12, 2010 6:50 am
A truly disturbing study from researchers at my alma mater, the University of Michigan, reveals that political partisans react to facts that contradict their worldview by clinging to that worldview even more tightly.

In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds.

In fact, they often became even more strongly set in their beliefs. Facts, they found, were not curing misinformation. Like an underpowered antibiotic, facts could actually make misinformation even stronger.

This bodes ill for a democracy, because most voters — the people making decisions about how the country runs — aren’t blank slates. They already have beliefs, and a set of facts lodged in their minds.

The problem is that sometimes the things they think they know are objectively, provably false. And in the presence of the correct information, such people react very, very differently than the merely uninformed. Instead of changing their minds to reflect the correct information, they can entrench themselves even deeper.

“The general idea is that it’s absolutely threatening to admit you’re wrong,” says political scientist Brendan Nyhan, the lead researcher on the Michigan study. The phenomenon — known as “backfire” — is “a natural defense mechanism to avoid that cognitive dissonance.”

. . . But that doesn’t appear to be the American character. More people think with their gut than their brain, to paraphrase Stephen Colbert. As Newsweek noted, George W. Bush “still trusts his gut to tell him what's right, and he still expects others to follow his lead.”


One might have thought Bush would have learned by now to view the proclamations of his gut with some suspicion—but then, that would be asking the president to rely on evidence and experience to make conclusions.

But the research on the subject shows this phenomenon to be part of the human condition: the desire to order facts around a particular view of the world. It should be noted, though, that the literature generally finds this to be more prevalent on the conservative side of the ledger, which, if you understand the term “conservative” to mean wedded to the status quo, makes a fair bit of sense.

New research, published in the journal Political Behavior last month, suggests that once those facts — or “facts” — are internalized, they are very difficult to budge.

In 2005, amid the strident calls for better media fact-checking in the wake of the Iraq war, Michigan’s Nyhan and a colleague devised an experiment in which participants were given mock news stories, each of which contained a provably false, though nonetheless widespread, claim made by a political figure: that there were WMDs found in Iraq (there weren’t), that the Bush tax cuts increased government revenues (revenues actually fell), and that the Bush administration imposed a total ban on stem cell research (only certain federal funding was restricted).

Nyhan inserted a clear, direct correction after each piece of misinformation, and then measured the study participants to see if the correction took.

For the most part, it didn’t. The participants who self-identified as conservative believed the misinformation on WMD and taxes even more strongly after being given the correction. With those two issues, the more strongly the participant cared about the topic — a factor known as salience — the stronger the backfire.

The effect was slightly different on self-identified liberals: When they read corrected stories about stem cells, the corrections didn’t backfire, but the readers did still ignore the inconvenient fact that the Bush administration’s restrictions weren’t total.

Interestingly, one antidote researchers have found to this is self-esteem. Respondents who felt good about themselves were consistently more willing to accept new information, whereas those who felt threatened or agitated – say, your average Rush Limbaugh listener – were not.

Another way to get facts to stick is through direct appeals. Yet media consumers get their information indirectly, through filters and outlets they either trust or imagine to have a bias, and they set their perceptions accordingly.

For individuals, broadening your sources of information probably helps you find consensus on some facts. But I wouldn’t be so sure it would work. We’re rapidly moving into a post-truth era in politics, and the data suggest that the agreed-upon set of facts has gone the way of the dinosaur.

Rightardia is not surprised. Most conservatives hold conventional, 'in the box' political views. Liberals and progressives are better able to entertain 'out of the box,' post-conventional views. Some of this is a function of education.


Anyone who has posted on Usenet for a while knows that the hard-core partisans aren't there to have their opinions changed by the other posters.


Many of the right wingers try to change the subject when they don't like a thoughtful, well-documented post, or they try to insult the poster. Others fall back on canned comments such as 'this poster is a known liar.'

The right wing has never been known for being confused by the facts.

source: http://www.tompaine.com/articles/2006/07/26/bush_gut_check.php


