A debunker is a person or organization that exposes or discredits claims believed to be false, exaggerated, or pretentious. The term is often associated with skeptical investigation of controversial topics such as UFOs, claimed paranormal phenomena, cryptids, conspiracy theories, alternative medicine, religion, or exploratory or fringe areas of scientific or pseudoscientific research.
According to the Merriam-Webster online dictionary, to "debunk" is defined as: "to expose the sham or falseness of." The New Oxford American Dictionary defines "debunk" as "expose the falseness or hollowness of (a myth, idea, or belief)".
If debunkers are not careful, their communications may backfire – increasing an audience's long-term belief in myths. Backfire effects can occur if a message spends too much time on the negative case, if it is too complex, or if the message is threatening.
The American Heritage Dictionary traces the passage of the words "bunk" (noun), "debunk" (verb) and "debunker" (noun) into American English in 1923 as a belated outgrowth of "bunkum", of which the first recorded use was in 1828, apparently related to a poorly received "speech for Buncombe County, North Carolina" given by North Carolina representative Felix Walker during the 16th United States Congress (1819–1821).
The term "debunk" originated in the 1923 novel Bunk by American journalist and popular historian William Woodward (1874–1950), who used it to mean to "take the bunk out of things".
The term "debunkery" is not limited to arguments about scientific validity; it is also used in a more general sense to describe attempts to discredit any opposing point of view, such as that of a political opponent.
See also: Science communication
Australian Professorial Fellow Stephan Lewandowsky and John Cook, Climate Communication Fellow for the Global Change Institute at the University of Queensland (and author at Skeptical Science), co-wrote The Debunking Handbook, in which they warn that debunking efforts may backfire. Backfire effects occur when science communicators accidentally reinforce false beliefs by trying to correct them, a phenomenon known as belief perseverance.
Cook and Lewandowsky offer possible solutions to the backfire effects described in different psychological studies. They recommend spending little or no time describing misconceptions, because people cannot help but remember ideas they have heard before. They write, "Your goal is to increase people's familiarity with the facts." They also recommend providing fewer and clearer arguments, since people recall a message better when it is simpler and easier to read. "Less is more" is especially important because scientific truths can be overwhelmingly detailed; pictures, graphs, and memorable tag lines all help keep things simple.
The authors write that debunkers should try to build up people's egos in some way before confronting false beliefs because it is difficult to consider ideas that threaten one's worldviews (i.e., threatening ideas cause cognitive dissonance). It is also advisable to avoid words with negative connotations. The authors describe studies which have shown that people abhor incomplete explanations – they write "In the absence of a better explanation, [people] opt for the wrong explanation". It is important to fill in conceptual gaps, and to explain the cause of the misconception in the first place. The authors believe these techniques can reduce the odds of a "backfire" – that an attempt to debunk bad science will increase the audience's belief in misconceptions.
The 2020 edition of The Debunking Handbook explains that "backfire effects occur only occasionally and the risk of occurrence is lower in most situations than once thought". The authors recommend that communicators "not refrain from attempting to debunk or correct misinformation out of fear that doing so will backfire or increase beliefs in false information".