A study claims to show that contrary facts can actually reinforce the views of a person who disagrees with them!
Okay, there's the obvious angle of this story: Oh no! People aren't making rational decisions! They aren't basing their opinions on pure facts! Hell, they're reinterpreting facts that are fed to them!
First, the summary:
In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs. Facts, they found, were not curing misinformation. Like an underpowered antibiotic, facts could actually make misinformation even stronger.
Alright, interesting claim. But ... something seems off about the article. Initially I liked the whole premise, and found it to be almost common sense: People warp facts to fit their preconceived notions. Oddly enough, when you think about it, who isn't going to like this conclusion? Liberals and conservatives and creationists and "evolutionists" and materialists and dualists will all in unison say, "Ha! See? I knew (those guys I disagree with) were doing that. I think they exaggerate about my side though." But when I went back to reread it, things started to seem less and less compelling.
Here are some reasons why.
* First, let's get the obvious out of the way: there's an innate and unintentional humor here. The reporter and the research leader allege that people in general tend to react poorly to "facts", misinterpreting and reinterpreting them so the data better fits their beliefs. So why should we assume the people carrying out, as well as reporting, this study are immune? If you answered "Because they're scientists / professional journalists", I have some phlogiston to sell you.
* As a side-note, Brendan Nyhan advertises himself as a "Political Scientist" — because he got his PhD in political science, of course. This reminds me of the last post about "naturalism", where the word carries such social value that people apply it to themselves in very loose ways. Sociology, psychology, and the other "soft sciences" are already in a somewhat controversial position over whether their fields are truly and thoroughly scientific. Economists are in an even more delicate spot. But poli-sci majors? That's really pushing it. I say this, I am ashamed to say, as a political science major.
* Worse, the article uses some wording that just seems... strange.
Take this quote:
In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs.
Whoa, hold on a moment. Which beliefs are we talking about here? Beliefs specifically about the "facts" in question, or beliefs merely relevant to those facts? If I'm against gun control and I mistakenly believe that areas with gun control have X% more (or less) gun crime, correcting that mistake doesn't necessarily mean I should change my views. I may be against gun control for reasons that have nothing to do with gun crime.
Also, "corrected facts in news stories"? Now we have another problem, even putting aside the already-mentioned difficulty of regarding something as a "fact" (to say nothing of who is and isn't "misinformed") in research that strongly implies people have trouble grasping "facts". This is apparently what's going on: researchers bring in a person who has preconceived notions about claim X. Then they give the person a news clip implying something relevant to claim X, and at the end add a "factual" correction that disputes said relevant portion. Then they see if the person's views about claim X have changed at all.
Let's try to put this in perspective: over half of Americans say they distrust the press, and, if that report is accurate, far more Republicans do than Democrats. So Nyhan conducts research where subjects are given an article from the press (alternately attributed to either the New York Times or Fox News) with misleading content, followed by a "factual" correction at the bottom. These are subjects who, apparently, already had opinions about the topic at hand and had already heard facts related to it. And the shocking, disappointing result is that the subjects didn't accept the "facts"?
I suppose reporting this research as "Studies show people are less likely to accept data from sources they think aren't trustworthy" wouldn't have been as thrilling.
The more I reread the article, the thicker the scent of bullshit becomes. So there's only one thing to do: roll up my sleeves, read the PDF, and report back.
So, stay tuned for the next post.