If you consider yourself a skeptic and a follower of the principles of science, then you will at some point find yourself debating (or arguing) with a “believer”.  Your goal might be to simply share some knowledge, or perhaps it’s to debunk the believer’s claim outright with the hope that they’ll change their mind.

Changing someone’s mind after they’ve already decided to “believe” in something is, as you may already know, an arduous task. It’s tempting to puff out your chest and list dozens of scientific facts to refute someone’s handful of shady points. But does that even work? In my personal experience (anecdotal, I know), it usually doesn’t. Sometimes it even backfires, and the believer walks away even more confident in their opinion.

So what’s the best way to debunk misinformation? John Cook and Stephan Lewandowsky have written a handbook, aptly titled “The Debunking Handbook”, to address the issue of the backfire effect, and have provided four simple steps to effectively debunk a myth.

The Backfire Effects

In the paper, Cook and Lewandowsky describe three ways that debunking can backfire:

The Familiarity Backfire Effect occurs because debunking a myth usually means repeating it, which makes the myth itself more familiar. Tests show that after people were presented with a myth and the facts debunking it, the memory of those facts faded as time passed… but the memory of the myth persisted.

The Overkill Backfire Effect occurs because processing many arguments takes more effort than just considering a few.  It turns out that a simple myth is more cognitively attractive than an over-complicated correction.

The Worldview Backfire Effect is potentially the most potent of the three. It tends to occur with topics that tie in with people’s worldviews and sense of cultural identity. Several cognitive processes can cause people to unconsciously process information in a biased way; one of these is Confirmation Bias, the tendency to seek out information that supports what you already believe. So, for those who are strongly fixed in their worldviews (ex: Creationists), being confronted with counter-arguments can cause their views to be strengthened. The paper describes a study which found that even when people are presented with a balanced set of facts, they reinforce their pre-existing views by gravitating towards the information they already agree with.

The “Anatomy” of Effective Debunking

Cook and Lewandowsky conclude their paper with four guidelines to effectively debunk misinformation:

  1. An argument should emphasize the core facts and not the myth. It’s important to include only the most essential facts to avoid the Overkill Backfire Effect.
  2. Before mentioning the myth, give a clear, explicit warning that the upcoming information is false.
  3. Debunking a myth leaves a gap in the person’s understanding, and that gap should be filled with an alternative explanation. Provide an alternative causal explanation for why the misinformation is wrong, and/or consider explaining why the myth was promoted in the first place (ex: “Big Pharma is suppressing the cure for cancer, but you can cure it yourself by eating these X foods…” Question – is it a coincidence that this person also has a book and DVD set they’re trying to sell?)
  4. Use graphics.  Humans are visual, so core facts should be displayed visually for maximum comprehension and retention.

Example:  Debunking Climate Skeptics

The following example (taken from the paper) debunks the myth that there is no scientific consensus on man-made global warming because 31,000 scientists signed a petition stating there is no evidence that human activity can disrupt the climate. Note how the information is presented as an infographic and follows the four guidelines listed above:

[Infographic: How to Debunk Climate Skeptics]

The Debunking Handbook is a useful tool for skeptics and critical thinkers alike.  Read it, share it, and use it the next time you need to debunk a myth.


Source:

Cook, J., & Lewandowsky, S. (2011). The Debunking Handbook. St. Lucia, Australia: University of Queensland. November 5. ISBN 978-0-646-56812-6. [http://sks.to/debunk]