I’ve discussed before how hard it is to correct false knowledge. This is not only a problem for the classroom — the preconceptions students bring to a topic, and the difficulty of replacing them with the correct information — but, in these days of so much misinformation in the media and on the web, an everyday problem.
In an internet study, 574 adults were presented with an article discussing the issue of electronic health records (EHRs). They were then shown a second article on the subject, supposedly from a "political blog". This text included several false statements about who was allowed to access these records (for example, that hospital administrators, health insurance companies, employers, and government officials had unrestricted access).
For some participants, this article was annotated so that the false statements were clearly marked, with directions explaining that an independent fact-checking organization had identified these factual errors. Other participants completed an unrelated three-minute task after reading the text before being shown the same corrections, while a third group was not told of the inaccuracies at all (until being debriefed).
After reading the text, participants were given a questionnaire, where they listed everything they had learned about EHRs from the text, rated their feelings about each item, and marked on a 7-point scale how easy it would be for specific groups to access the records. They were also asked to judge the credibility of the fact-checking message.
Those who received the immediate corrections were significantly more accurate than those who received the delayed corrections, and both groups were significantly more accurate than those receiving no corrections, so at least we know that correcting false information does make a difference! More depressingly, however, the differences between the groups, although significant, were small. In other words, correcting false statements makes a difference, but not much of one.
Part of the problem lies, it appears, in people’s preconceptions. A breakdown by participants’ feelings on the issue revealed that the immediate correction was significantly more effective for those who were ‘for’ EHRs (note that the corrections agreed with their beliefs). Indeed, for those unfavorably disposed, the immediate corrections may as well have been no corrections at all.
But, intriguingly, predisposition only made a difference when the correction was immediate, not when it was delayed.
Mapping these results against participants’ responses to the question of credibility revealed that those unfavorably disposed (and therefore prone to believing the false claims in the text) assigned little credibility to the corrections.
Why should this perfectly understandable difference apply only when corrections were immediate? The researchers suggest that, by putting the corrections in direct competition with the false statements, immediate correction places more emphasis on their relative credibility, and such assessments tend to be biased by existing attitudes.
The findings suggest it is naïve to expect that simply telling people something is false will be enough, if they have a will to believe it. They also suggest that the best approach to correcting false knowledge is to emphasize the credibility of the corrector.
Of course, this study used politically charged information, about which people are likely to have decided opinions. But the results are a reminder that, as the researcher says: “Humans aren’t vessels into which you can just pour accurate information. Correcting misperceptions is really a persuasion task.”
This is true even when the information is something as ‘factual’ as the cause of the seasons! Teachers, especially, should take on board the idea that, when new information doesn’t fit a student’s world-view, the credibility of the source/authority (the teacher!) is paramount.