The Baloney Detection Kit: Carl Sagan’s Rules for Bullshit-Busting and Critical Thinking

Necessary cognitive fortification against propaganda, pseudoscience, and general falsehood.

BY MARIA POPOVA, Brain Pickings

Carl Sagan (November 9, 1934–December 20, 1996) was many things — a cosmic sage, voracious reader, hopeless romantic, and brilliant philosopher. But above all, he endures as our era’s greatest patron saint of reason and critical thinking, a master of the vital balance between skepticism and openness. In The Demon-Haunted World: Science as a Candle in the Dark (public library) — the same indispensable volume that gave us Sagan’s timeless meditation on science and spirituality, published mere months before his death in 1996 — Sagan shares his secret to upholding the rites of reason, even in the face of society’s most shameless untruths and outrageous propaganda.

In a chapter titled “The Fine Art of Baloney Detection,” Sagan reflects on the many types of deception to which we’re susceptible — from psychics to religious zealotry to paid product endorsements by scientists, which he held in especially low regard, noting that they “betray contempt for the intelligence of their customers” and “introduce an insidious corruption of popular attitudes about scientific objectivity.” (Cue in PBS’s Joe Hanson on how to read science news.) But rather than preaching from the ivory tower of self-righteousness, Sagan approaches the subject from the most vulnerable of places — having just lost both of his parents, he reflects on the all too human allure of promises of supernatural reunions in the afterlife, reminding us that falling for such fictions doesn’t make us stupid or bad people, but simply means that we need to equip ourselves with the right tools against them.

Through their training, scientists are equipped with what Sagan calls a “baloney detection kit” — a set of cognitive tools and techniques that fortify the mind against penetration by falsehoods:

The kit is brought out as a matter of course whenever new ideas are offered for consideration. If the new idea survives examination by the tools in our kit, we grant it warm, although tentative, acceptance. If you’re so inclined, if you don’t want to buy baloney even when it’s reassuring to do so, there are precautions that can be taken; there’s a tried-and-true, consumer-tested method.

But the kit, Sagan argues, isn’t merely a tool of science — rather, it contains invaluable tools of healthy skepticism that apply just as elegantly, and just as necessarily, to everyday life. By adopting the kit, we can all shield ourselves against clueless guile and deliberate manipulation. Sagan shares nine of these tools:

  1. Wherever possible there must be independent confirmation of the “facts.”
  2. Encourage substantive debate on the evidence by knowledgeable proponents of all points of view.
  3. Arguments from authority carry little weight — “authorities” have made mistakes in the past. They will do so again in the future. Perhaps a better way to say it is that in science there are no authorities; at most, there are experts.
  4. Spin more than one hypothesis. If there’s something to be explained, think of all the different ways in which it could be explained. Then think of tests by which you might systematically disprove each of the alternatives. What survives, the hypothesis that resists disproof in this Darwinian selection among “multiple working hypotheses,” has a much better chance of being the right answer than if you had simply run with the first idea that caught your fancy.
  5. Try not to get overly attached to a hypothesis just because it’s yours. It’s only a way station in the pursuit of knowledge. Ask yourself why you like the idea. Compare it fairly with the alternatives. See if you can find reasons for rejecting it. If you don’t, others will.
  6. Quantify. If whatever it is you’re explaining has some measure, some numerical quantity attached to it, you’ll be much better able to discriminate among competing hypotheses. What is vague and qualitative is open to many explanations. Of course there are truths to be sought in the many qualitative issues we are obliged to confront, but finding them is more challenging. [See the sketch just after this list.]
  7. If there’s a chain of argument, every link in the chain must work (including the premise) — not just most of them.
  8. Occam’s Razor. This convenient rule-of-thumb urges us when faced with two hypotheses that explain the data equally well to choose the simpler.
  9. Always ask whether the hypothesis can be, at least in principle, falsified. Propositions that are untestable, unfalsifiable are not worth much. Consider the grand idea that our Universe and everything in it is just an elementary particle — an electron, say — in a much bigger Cosmos. But if we can never acquire information from outside our Universe, is not the idea incapable of disproof? You must be able to check assertions out. Inveterate skeptics must be given the chance to follow your reasoning, to duplicate your experiments and see if they get the same result.
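
Tool 6 in particular lends itself to a concrete illustration. Below is a minimal sketch in Python, with invented measurements and two hypothetical rival hypotheses, of how attaching numbers to a claim lets the data discriminate between explanations; none of it comes from Sagan's text:

```python
# A minimal sketch of tool 6, "Quantify" (illustrative data, not Sagan's):
# two rival hypotheses predicting the same measurements, scored numerically
# so the data can discriminate between them.

xs = [1, 2, 3, 4, 5]                       # conditions measured (invented)
measurements = [2.1, 3.9, 6.2, 7.8, 10.1]  # observed values (invented)

def hypothesis_a(x):
    return 2.0 * x      # "the effect doubles with x"

def hypothesis_b(x):
    return x + 2.0      # "the effect adds a constant two"

def squared_error(model):
    """Total squared mismatch between prediction and observation."""
    return sum((model(x) - y) ** 2 for x, y in zip(xs, measurements))

print("A:", squared_error(hypothesis_a))   # ~0.11  -- fits well
print("B:", squared_error(hypothesis_b))   # ~15.11 -- fits poorly

# A vague, qualitative claim ("the effect grows with x") fits both stories
# equally well; only the numbers tell the hypotheses apart.
```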

Just as important as learning these helpful tools, however, is unlearning and avoiding the most common pitfalls of common sense. Reminding us of where society is most vulnerable to those, Sagan writes:

In addition to teaching us what to do when evaluating a claim to knowledge, any good baloney detection kit must also teach us what not to do. It helps us recognize the most common and perilous fallacies of logic and rhetoric. Many good examples can be found in religion and politics, because their practitioners are so often obliged to justify two contradictory propositions.

He admonishes against the twenty most common and perilous ones — many rooted in our chronic discomfort with ambiguity — with examples of each in action:

  1. ad hominem — Latin for “to the man,” attacking the arguer and not the argument (e.g., The Reverend Dr. Smith is a known Biblical fundamentalist, so her objections to evolution need not be taken seriously)
  2. argument from authority (e.g., President Richard Nixon should be re-elected because he has a secret plan to end the war in Southeast Asia — but because it was secret, there was no way for the electorate to evaluate it on its merits; the argument amounted to trusting him because he was President: a mistake, as it turned out)
  3. argument from adverse consequences (e.g., A God meting out punishment and reward must exist, because if He didn’t, society would be much more lawless and dangerous — perhaps even ungovernable. Or: The defendant in a widely publicized murder trial must be found guilty; otherwise, it will be an encouragement for other men to murder their wives)
  4. appeal to ignorance — the claim that whatever has not been proved false must be true, and vice versa (e.g., There is no compelling evidence that UFOs are not visiting the Earth; therefore UFOs exist — and there is intelligent life elsewhere in the Universe. Or: There may be seventy kazillion other worlds, but not one is known to have the moral advancement of the Earth, so we’re still central to the Universe.) This impatience with ambiguity can be criticized in the phrase: absence of evidence is not evidence of absence.
  5. special pleading, often to rescue a proposition in deep rhetorical trouble (e.g., How can a merciful God condemn future generations to torment because, against orders, one woman induced one man to eat an apple? Special plead: you don’t understand the subtle Doctrine of Free Will. Or: How can there be an equally godlike Father, Son, and Holy Ghost in the same Person? Special plead: You don’t understand the Divine Mystery of the Trinity. Or: How could God permit the followers of Judaism, Christianity, and Islam — each in their own way enjoined to heroic measures of loving kindness and compassion — to have perpetrated so much cruelty for so long? Special plead: You don’t understand Free Will again. And anyway, God moves in mysterious ways.)
  6. begging the question, also called assuming the answer (e.g., We must institute the death penalty to discourage violent crime. But does the violent crime rate in fact fall when the death penalty is imposed? Or: The stock market fell yesterday because of a technical adjustment and profit-taking by investors — but is there any independent evidence for the causal role of “adjustment” and profit-taking; have we learned anything at all from this purported explanation?)
  7. observational selection, also called the enumeration of favorable circumstances, or as the philosopher Francis Bacon described it, counting the hits and forgetting the misses (e.g., A state boasts of the Presidents it has produced, but is silent on its serial killers)
  8. statistics of small numbers — a close relative of observational selection (e.g., “They say 1 out of every 5 people is Chinese. How is this possible? I know hundreds of people, and none of them is Chinese. Yours truly.” Or: “I’ve thrown three sevens in a row. Tonight I can’t lose.”)
  9. misunderstanding of the nature of statistics (e.g., President Dwight Eisenhower expressing astonishment and alarm on discovering that fully half of all Americans have below average intelligence) [see the sketch just after this list];
  10. inconsistency (e.g., Prudently plan for the worst of which a potential military adversary is capable, but thriftily ignore scientific projections on environmental dangers because they’re not “proved.” Or: Attribute the declining life expectancy in the former Soviet Union to the failures of communism many years ago, but never attribute the high infant mortality rate in the United States (now highest of the major industrial nations) to the failures of capitalism. Or: Consider it reasonable for the Universe to continue to exist forever into the future, but judge absurd the possibility that it has infinite duration into the past);
  11. non sequitur — Latin for “It doesn’t follow” (e.g., Our nation will prevail because God is great. But nearly every nation pretends this to be true; the German formulation was “Gott mit uns”). Often those falling into the non sequitur fallacy have simply failed to recognize alternative possibilities;
  12. post hoc, ergo propter hoc — Latin for “It happened after, so it was caused by” (e.g., Jaime Cardinal Sin, Archbishop of Manila: “I know of … a 26-year-old who looks 60 because she takes [contraceptive] pills.” Or: Before women got the vote, there were no nuclear weapons)
  13. meaningless question (e.g., What happens when an irresistible force meets an immovable object? But if there is such a thing as an irresistible force there can be no immovable objects, and vice versa)
  14. excluded middle, or false dichotomy — considering only the two extremes in a continuum of intermediate possibilities (e.g., “Sure, take his side; my husband’s perfect; I’m always wrong.” Or: “Either you love your country or you hate it.” Or: “If you’re not part of the solution, you’re part of the problem”)
  15. short-term vs. long-term — a subset of the excluded middle, but so important I’ve pulled it out for special attention (e.g., We can’t afford programs to feed malnourished children and educate pre-school kids. We need to urgently deal with crime on the streets. Or: Why explore space or pursue fundamental science when we have so huge a budget deficit?);
  16. slippery slope, related to excluded middle (e.g., If we allow abortion in the first weeks of pregnancy, it will be impossible to prevent the killing of a full-term infant. Or, conversely: If the state prohibits abortion even in the ninth month, it will soon be telling us what to do with our bodies around the time of conception);
  17. confusion of correlation and causation (e.g., A survey shows that more college graduates are homosexual than those with lesser education; therefore education makes people gay. Or: Andean earthquakes are correlated with closest approaches of the planet Uranus; therefore — despite the absence of any such correlation for the nearer, more massive planet Jupiter — the latter causes the former)
  18. straw man — caricaturing a position to make it easier to attack (e.g., Scientists suppose that living things simply fell together by chance — a formulation that willfully ignores the central Darwinian insight, that Nature ratchets up by saving what works and discarding what doesn’t. Or — this is also a short-term/long-term fallacy — environmentalists care more for snail darters and spotted owls than they do for people)
  19. suppressed evidence, or half-truths (e.g., An amazingly accurate and widely quoted “prophecy” of the assassination attempt on President Reagan is shown on television; but — an important detail — was it recorded before or after the event? Or: These government abuses demand revolution, even if you can’t make an omelette without breaking some eggs. Yes, but is this likely to be a revolution in which far more people are killed than under the previous regime? What does the experience of other revolutions suggest? Are all revolutions against oppressive regimes desirable and in the interests of the people?)
  20. weasel words (e.g., The separation of powers of the U.S. Constitution specifies that the United States may not conduct a war without a declaration by Congress. On the other hand, Presidents are given control of foreign policy and the conduct of wars, which are potentially powerful tools for getting themselves re-elected. Presidents of either political party may therefore be tempted to arrange wars while waving the flag and calling the wars something else — “police actions,” “armed incursions,” “protective reaction strikes,” “pacification,” “safeguarding American interests,” and a wide variety of “operations,” such as “Operation Just Cause.” Euphemisms for war are one of a broad class of reinventions of language for political purposes. Talleyrand said, “An important art of politicians is to find new names for institutions which under old names have become odious to the public”)
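
Fallacy 9 is worth making concrete. As a quick illustration of my own (not Sagan's): for a symmetric distribution such as IQ, about half the population falls below the mean by construction, which is exactly what alarmed Eisenhower; for a skewed quantity such as income, well over half can fall below it. The simulation below uses invented data:

```python
# A minimal sketch (illustrative, not from Sagan's text) of fallacy 9:
# with a symmetric distribution, half below average is guaranteed, and a
# skewed distribution can put far more than half below the mean.
import random

random.seed(0)
N = 100_000

# Symmetric case, shaped roughly like IQ scores (mean 100, sd 15):
iq = [random.gauss(100, 15) for _ in range(N)]
mean_iq = sum(iq) / N
print(sum(x < mean_iq for x in iq) / N)            # ~0.50, by construction

# Skewed case, shaped roughly like incomes (lognormal):
income = [random.lognormvariate(10, 1) for _ in range(N)]
mean_income = sum(income) / N
print(sum(x < mean_income for x in income) / N)    # ~0.69, well over half
```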

Sagan ends the chapter with a necessary disclaimer:

Like all tools, the baloney detection kit can be misused, applied out of context, or even employed as a rote alternative to thinking. But applied judiciously, it can make all the difference in the world — not least in evaluating our own arguments before we present them to others.

The Demon-Haunted World is a timelessly fantastic read in its entirety, timelier than ever in a great many ways amidst our present media landscape of propaganda, pseudoscience, and various commercial motives.


Baloney Detection: How to draw boundaries between science and pseudoscience

By MICHAEL SHERMER

When lecturing on science and pseudoscience at colleges and universities, I am inevitably asked, after challenging common beliefs held by many students, “Why should we believe you?” My answer: “You shouldn’t.”

I then explain we need to check things out for ourselves and, short of that, at least to ask basic questions that get to the heart of the validity of any claim. This is what I call baloney detection, in deference to Carl Sagan, who coined the phrase Baloney Detection Kit.

To detect baloney – that is, to help discriminate between science and pseudoscience – I suggest 10 questions to ask when encountering any claim:

1. How reliable is the source of the claim?

Pseudoscientists often appear quite reliable, but when examined closely, the facts and figures they cite are distorted, taken out of context or occasionally even fabricated. Of course, everyone makes some mistakes. And as historian of science Daniel Kevles showed so effectively in his book The Baltimore Case, it can be hard to detect a fraudulent signal within the background noise of sloppiness that is a normal part of the scientific process. The question is, Do the data and interpretations show signs of intentional distortion? When an independent committee established to investigate potential fraud scrutinized a set of research notes in Nobel laureate David Baltimore’s laboratory, it revealed a surprising number of mistakes. Baltimore was exonerated because his lab’s mistakes were random and nondirectional.

2. Does this source often make similar claims?

Pseudoscientists have a habit of going well beyond the facts. Flood geologists (creationists who believe that Noah’s flood can account for many of the earth’s geologic formations) consistently make outrageous claims that bear no relation to geological science. Of course, some great thinkers do frequently go beyond the data in their creative speculations. Thomas Gold of Cornell University is notorious for his radical ideas, but he has been right often enough that other scientists listen to what he has to say. Gold proposes, for example, that oil is not a fossil fuel at all but the by-product of a deep, hot biosphere (microorganisms living at unexpected depths within the crust). Hardly any earth scientists with whom I have spoken think Gold is right, yet they do not consider him a crank. Watch out for a pattern of fringe thinking that consistently ignores or distorts data.

3. Have the claims been verified by another source?

Typically pseudoscientists make statements that are unverified or verified only by a source within their own belief circle. We must ask, Who is checking the claims, and even who is checking the checkers? The biggest problem with the cold fusion debacle, for instance, was not that Stanley Pons and Martin Fleischmann were wrong. It was that they announced their spectacular discovery at a press conference before other laboratories verified it. Worse, when cold fusion was not replicated, they continued to cling to their claim. Outside verification is crucial to good science.

4. How does the claim fit with what we know about how the world works?

An extraordinary claim must be placed into a larger context to see how it fits. When people claim that the Egyptian pyramids and the Sphinx were built more than 10,000 years ago by an unknown, advanced race, they are not presenting any context for that earlier civilization. Where are the rest of the artifacts of those people? Where are their works of art, their weapons, their clothing, their tools, their trash? Archaeology simply does not operate this way.

5. Has anyone gone out of the way to disprove the claim, or has only supportive evidence been sought?

This is the confirmation bias, or the tendency to seek confirmatory evidence and to reject or ignore disconfirmatory evidence. The confirmation bias is powerful, pervasive and almost impossible for any of us to avoid. It is why the methods of science that emphasize checking and rechecking, verification and replication, and especially attempts to falsify a claim, are so critical.
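
Wason's classic 2-4-6 experiment, which Shermer does not cite here but which makes the same point, can be sketched in a few lines of Python: probes chosen to confirm a hypothesis teach us nothing, while a single probe chosen to falsify it settles the matter. The hidden rule and the subject's guess below are the standard illustrative ones:

```python
# A minimal sketch of Wason's 2-4-6 task (my illustration, not Shermer's):
# why a falsifying test beats any number of confirming tests.

def hidden_rule(triple):
    """The experimenter's actual rule: any strictly ascending triple."""
    a, b, c = triple
    return a < b < c

def subjects_guess(triple):
    """An overly specific hypothesis: numbers ascending by exactly 2."""
    a, b, c = triple
    return b - a == 2 and c - b == 2

# Confirmation strategy: probe only triples the guess predicts are "yes".
for t in [(2, 4, 6), (10, 12, 14), (1, 3, 5)]:
    # Both the rule and the guess say "yes", so nothing is learned.
    assert hidden_rule(t) and subjects_guess(t)

# Falsification strategy: probe a triple the guess predicts is "no".
probe = (1, 2, 10)
print(subjects_guess(probe))  # False: the guess says "no"
print(hidden_rule(probe))     # True:  the rule says "yes"
# The mismatch refutes the guess; one disconfirming probe did what any
# number of confirming probes could not.
```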

When exploring the borderlands of science, we often face a “boundary problem” of where to draw the line between science and pseudoscience. The boundary is the line of demarcation between geographies of knowledge, the border defining countries of claims. Knowledge sets are fuzzier entities than countries, however, and their edges are blurry. It is not always clear where to draw the line. The first five questions above ask whether a claim is legitimate or baloney; continuing with the remaining five, we see that in the process we are also helping to solve the boundary problem of where to place a claim.

6. Does the preponderance of evidence point to the claimant’s conclusion or to a different one?

The theory of evolution, for example, is “proved” through a convergence of evidence from a number of independent lines of inquiry. No one fossil, no one piece of biological or paleontological evidence has “evolution” written on it; instead tens of thousands of evidentiary bits add up to a story of the evolution of life. Creationists conveniently ignore this confluence, focusing instead on trivial anomalies or currently unexplained phenomena in the history of life.

7. Is the claimant employing the accepted rules of reason and tools of research, or have these been abandoned in favor of others that lead to the desired conclusion?

A clear distinction can be made between SETI (Search for Extraterrestrial Intelligence) scientists and UFOlogists. SETI scientists begin with the null hypothesis that ETIs do not exist and that they must provide concrete evidence before making the extraordinary claim that we are not alone in the universe. UFOlogists begin with the positive hypothesis that ETIs exist and have visited us, then employ questionable research techniques to support that belief, such as hypnotic regression (revelations of abduction experiences), anecdotal reasoning (countless stories of UFO sightings), conspiratorial thinking (governmental cover-ups of alien encounters), low-quality visual evidence (blurry photographs and grainy videos), and anomalistic thinking (atmospheric anomalies and visual misperceptions by eyewitnesses).

8. Is the claimant providing an explanation for the observed phenomena or merely denying the existing explanation?

This is a classic debate strategy: criticize your opponent and never affirm what you believe, so as to avoid criticism. It is next to impossible to get creationists to offer an explanation for life (other than “God did it”). Intelligent Design (ID) creationists have done no better, picking away at weaknesses in scientific explanations for difficult problems and offering in their stead “ID did it.” This stratagem is unacceptable in science.

9. If the claimant proffers a new explanation, does it account for as many phenomena as the old explanation did?

Many HIV/AIDS skeptics argue that lifestyle causes AIDS. Yet their alternative theory does not explain nearly as much of the data as the HIV theory does. To make their argument, they must ignore the diverse evidence in support of HIV as the causal vector in AIDS, including the significant rise in AIDS among hemophiliacs shortly after HIV was inadvertently introduced into the blood supply.

10. Do the claimant’s personal beliefs and biases drive the conclusions, or vice versa?

All scientists hold social, political and ideological beliefs that could potentially slant their interpretations of the data, but how do those biases and beliefs affect their research in practice? Usually such biases and beliefs are rooted out during peer review, or the paper or book is rejected.

Clearly, there are no foolproof methods of detecting baloney or drawing the boundary between science and pseudoscience. Yet there is a solution: science deals in fuzzy fractions of certainties and uncertainties, where evolution and big bang cosmology may be assigned a 0.9 probability of being true, and creationism and UFOs a 0.1 probability of being true. In between are borderland claims: we might assign superstring theory a 0.7 and cryonics a 0.2. In all cases, we remain open-minded and flexible, willing to reconsider our assessments as new evidence arises. This is, undeniably, what makes science so fleeting and frustrating to many people; it is, at the same time, what makes science the most glorious product of the human mind.
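
As a purely illustrative sketch (the starting probabilities are Shermer's own examples; the update rule is ordinary Bayes' theorem, and the likelihood numbers are invented), these fuzzy fractions of certainty can be treated as priors and revised as new evidence arrives:

```python
# A minimal sketch (not Shermer's method) of holding claims as fuzzy
# fractions of certainty and revising them by Bayes' rule as evidence arrives.

def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Posterior probability of a claim after one piece of evidence."""
    numerator = p_evidence_if_true * prior
    return numerator / (numerator + p_evidence_if_false * (1 - prior))

claims = {
    "big bang cosmology": 0.9,   # Shermer's example assignments
    "superstring theory": 0.7,
    "cryonics": 0.2,
    "UFO visitation": 0.1,
}

# Suppose (hypothetically) a new observation is three times likelier if
# superstring theory is true than if it is false:
claims["superstring theory"] = bayes_update(claims["superstring theory"], 0.6, 0.2)
print(round(claims["superstring theory"], 2))  # 0.88 -- raised, still tentative
```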

Michael Shermer is founding publisher of Skeptic magazine and author of The Borderlands of Science.
