
The fourfold tables might feel like a strange little exercise, but actually what you’re doing here is scientific and critical thinking, laying out the numbers visually in order to make the computation easier. And the results of those computations allow you to quantify the different parts of the problem, to help you make more rational, evidence-based decisions. They are so powerful, it’s surprising that they’re not taught to all of us in high school.
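
To make the arithmetic concrete, here is a minimal sketch in Python of the kind of computation a fourfold table supports. The counts are hypothetical, invented only to show the layout; they are not taken from the book.

# A fourfold (2x2) table for a screening test, with hypothetical counts:
#
#                     Disease      No disease
#   Test positive        90            999
#   Test negative        10          8,901
true_positive = 90      # have the disease, test positive
false_positive = 999    # healthy, but test positive
false_negative = 10     # have the disease, test negative
true_negative = 8901    # healthy, test negative

total = true_positive + false_positive + false_negative + true_negative

# Probability of having the disease given a positive test result
p_disease_given_positive = true_positive / (true_positive + false_positive)

# Probability of testing positive given the disease (the test's sensitivity)
p_positive_given_disease = true_positive / (true_positive + false_negative)

# Base rate of the disease in this population
p_disease = (true_positive + false_negative) / total

print(f"P(disease | positive test) = {p_disease_given_positive:.3f}")  # about 0.083
print(f"P(positive test | disease) = {p_positive_given_disease:.3f}")  # 0.900
print(f"Base rate of disease       = {p_disease:.4f}")                 # 0.0100

Laid out this way, the table makes it hard to confuse the two conditional probabilities: with these made-up numbers, a test that catches 90 percent of cases still leaves a positive result meaning less than a one-in-ten chance of actually having the disease.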

Thinking About Statistics and Graphs

Most of us have difficulty figuring probabilities and statistics in our heads and detecting subtle patterns in complex tables of numbers. We prefer vivid pictures, images, and stories. When making decisions, we tend to overweight such images and stories, compared to statistical information. We also tend to misunderstand or misinterpret graphics.

Many of us feel intimidated by numbers and so we blindly accept the numbers we’re handed. This can lead to bad decisions and faulty conclusions. We also have a tendency to apply critical thinking only to things we disagree with. In the current information age, pseudo-facts masquerade as facts, misinformation can be indistinguishable from true information, and numbers are often at the heart of any important claim or decision. Bad statistics are everywhere.
As sociologist Joel Best says, it’s not just because the other guys are all lying weasels. Bad statistics are produced by people—often sincere, well-meaning people—who aren’t thinking critically about what they’re saying.

The same fear of numbers that prevents many people from analyzing statistics prevents them from looking carefully at the numbers in a graph, the axis labels, and the story that they tell. The world is full of coincidences, and bizarre things are very likely to happen—but just because two things change together doesn’t mean that one caused the other or that they are even related by a hidden third factor x.
People who are taken in by such associations or coincidences usually have a poor understanding of probability, cause and effect, and the role of randomness in the unfolding of events. Yes, you could spin a story about how the drop in the number of pirates over the last three hundred years and the coinciding rise in global temperatures must surely indicate that pirates were essential to keeping global warming under control. But that’s just sloppy thinking, and is a misinterpretation of the evidence. Sometimes the purveyors of this sort of faulty logic know better and hope that you won’t notice; sometimes they have been taken in themselves. But now you know better.
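
As a small illustration of how easily such coincidences arise, the Python sketch below correlates two made-up series, a falling pirate count and a rising average temperature. The numbers are invented for illustration, and the strong correlation they produce says nothing about cause.

from statistics import correlation  # available in Python 3.10 and later

# Two invented series that merely trend in opposite directions over time.
years        = [1820, 1860, 1900, 1940, 1980, 2000]
pirate_count = [35000, 20000, 5000, 1000, 400, 200]   # hypothetical counts
global_temp  = [14.1, 14.2, 14.3, 14.5, 14.7, 14.8]   # hypothetical degrees C

for y, p, t in zip(years, pirate_count, global_temp):
    print(f"{y}: {p:>6} pirates, {t:.1f} degrees C")

r = correlation(pirate_count, global_temp)
print(f"Pearson r = {r:.2f}")  # strongly negative with these made-up numbers

# Both series changed steadily over the same period, so they correlate,
# but nothing in this arithmetic shows that one caused the other or that
# they are related at all.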

PART TWO

EVALUATING WORDS

A lie which is half a truth is ever the blackest of lies.
—ALFRED, LORD TENNYSON

HOW DO WE KNOW?

We are a storytelling species, and a social species, easily swayed by the opinions of others. We have three ways to acquire information: We can discover it ourselves, we can absorb it implicitly, or we can be told it explicitly. Much of what we know about the world falls in this last category—somewhere along the line, someone told us a fact or we read about it, and so we know it only secondhand. We rely on people with expertise to tell us.

I’ve never seen an atom of oxygen or a molecule of water, but there is a body of literature describing meticulously conducted experiments that lead me to believe these exist. Similarly, I haven’t verified firsthand that Americans landed on the moon, that the speed of light is 186,000 miles per second, that pasteurization really kills bacteria, or that humans normally have twenty-three pairs of chromosomes. I don’t know firsthand that the elevator in my building has been properly designed and maintained, or that my doctor actually went to medical school. We rely on experts, certifications, licenses, encyclopedias, and textbooks.

But we also need to rely on ourselves, on our own wits and powers of reasoning. Lying weasels who want to separate us from our money, or get us to vote against our own best interests, will try to snow us with pseudo-facts, confuse us with numbers that have no basis, or distract us with information that, upon closer examination, is not actually relevant. They will masquerade as experts.

The antidote to this is to analyze claims we encounter the way we analyze statistics and graphs. The skills necessary should not be beyond the ability of most fourteen-year-olds. They are taught in law schools and schools of journalism, sometimes in business schools and graduate science programs, but rarely to the rest of us, to those who need it most.

If you like watching crime dramas, or reading investigative journalism pieces, many of the skills will be familiar—they resemble the kinds of evaluations that are made during court cases. Judges and juries evaluate competing claims and try to discover the truth within. There are codified rules about what constitutes real evidence; in the United States, documents that haven’t been authenticated are generally not allowed, nor is “hearsay” testimony, although there are exceptions.

Suppose someone points you to a website that claims that listening to Mozart music for twenty minutes a day will make us smarter. Another website says it’s not true. A big part of the problem here is that the human brain often makes up its mind based on emotional considerations, and then seeks to justify them. And the brain is a very powerful self-justifying machine. It would be nice to believe that all you have to do is listen to beautiful music for twenty minutes to suddenly take your place at the head of the IQ line. It takes effort to evaluate claims like this, probably more time than it would take to listen to Eine Kleine Nachtmusik, but it is necessary to avoid drawing incorrect conclusions. Even the smartest of us can be fooled. Steve Jobs delayed treatment for his pancreatic cancer while he followed the advice (given in books and websites) that a change in diet could provide a cure. By the time he realized the diet wasn’t working, the cancer had progressed too far to be treated.

Determining the truthfulness or accuracy of a source is not always possible. Consider the epigraph that opens Part One:

It ain’t what you don’t know that gets you into trouble. It’s what you know for sure that just ain’t so.

I saw this at the opening of the feature film The Big Short, which attributed it to Mark Twain, and I felt I had seen it somewhere before; Al Gore also used it in his film An Inconvenient Truth nine years earlier with the same attribution. But in fact-checking the Field Guide, I could not find any evidence that Twain ever said this. The attribution and quote itself are prime examples of what the quote is trying to warn us against. The directors, writers, and producers of both films didn’t do their homework—what they thought they knew for sure turned out not to be true at all.

A little web research pulled up an article in Forbes that claims it is a misattribution. The author, Nigel Rees, cites Respectfully Quoted, a dictionary of quotations compiled by the U.S. Library of Congress. That book reports various formulations of the remark in Everybody’s Friend, or Josh Billing’s Encyclopedia and Proverbial Philosophy of Wit and Humor (1874). “There you are, you see,” writes Rees. “Mark Twain is a better-known humorist than ‘Josh Billings’ and so the quote drifts towards him.”

Rees continues:

And not only him. In a 1984 presidential debate, Walter Mondale came up with this: “I’m reminded a little bit of what Will Rogers once said of Hoover. He said ‘It’s not what he doesn’t know that bothers me, it’s what he knows for sure just ain’t so.’”

Who’s right? With difficult matters such as this, it is often helpful to consult an expert. I asked Gretchen Lieb, a research librarian at Vassar who works as the liaison to the English Department, and who provided this insightful analysis:

Quotations are tricky things. They’re the literary equivalent of statistics, really, in terms of lies, damn lies, etc. Older quotations are almost like translations from another language, too, in terms of being interpretations rather than verbatim, especially in the case of this circle, since these authors wrote in a sort of fantasy dialect, à la Huckleberry Finn, that is difficult to read and downright disturbing to us now in some cases.
I could go check numerous other books of quotations, such as Oxford, etc., but that would be so twentieth century.
Have you come across HathiTrust? It’s the corpus of books from research libraries that is behind Google Books, and it’s a gold mine, especially for pre-1928 printed materials.
Here’s the Josh Billings attribution in “Respectfully Quoted” (we have it as an e-book; I didn’t need to walk away from my desk!), and it cites the Oxford Dictionary of Quotations, which I tend to use more than Bartlett’s:
“The trouble with people is not that they don’t know but that they know so much that ain’t so.” Attributed to Josh Billings (Henry Wheeler Shaw) by The Oxford Dictionary of Quotations, 3d ed., p. 491 (1979). Not verified in his writings, although some similar ideas are found in Everybody’s Friend, or Josh Billing’s Encyclopedia and Proverbial Philosophy of Wit and Humor (1874). Original spelling is corrected: “What little I do know I hope I am certain of.” (p. 502) “Wisdom don’t consist in knowing more that is new, but in knowing less that is false.” (p. 430) “I honestly believe it is better to know nothing than to know what ain’t so.” (p. 286)
By the way, regarding the Walter Mondale attribution to Will Rogers, Respectfully Quoted notes that this has not been found in Rogers’s work.
Here is a link to Billings’s book, where you can search for the phrase “ain’t so” and get the idea of what lies therein: http://hdl.handle.net/2027/njp.32101067175438.
Not verifiable. Plus, if you search for Mark Twain, you find that this compendium/encyclopedia writer cites fellow humorist and smartypants Mark Twain as his most trusted correspondent, so they’re having a conversation and bouncing clever aphorisms, or as Billings would say, “affurisms,” off of each other. Who knows who said what?
I usually roll my eyes when people, especially politicians, quote Mark Twain or Will Rogers, and think to myself, H. L. Mencken, we hardly know you. Critical minds like his are in short supply these days. Poor Josh Billings. Being the second most famous humorist puts you on precarious ground a hundred years later.

So here’s an odd case of a quote that appears to have been utterly fabricated, both in its content and its attribution. The basic idea was contained in Billings, although it’s not clear if that idea came from him, Twain, or perhaps their buddy Bret Harte. Will Rogers gets put in the mix because, well, it just sort of sounds like something he would say.

The quote that opens Part Two was given to me by an acquaintance who misremembered it as:

The blackest lie is a partial truth that leads you to the wrong conclusion.

It sounded plausible. It would be just like Tennyson to give color to an abstract noun, and to mix the metaphysical with the practical. I only found out the actual quote (“A lie which is half a truth is ever the blackest of lies”) when fact-checking for this book. So it goes, as Kurt Vonnegut would say.

In the presence of new or conflicting claims, we can make an informed and evidence-based choice about what is true. We examine the claims for ourselves, and make a decision, acting as our own judge and jury. And as part of the process we usually do well to seek expert opinions. How do we identify them?

IDENTIFYING EXPERTISE

The first thing to do when evaluating a claim by some authority is to ask who or what established their authority. If the authority comes from having been a witness to some event, how credible a witness are they?

Venerable authorities can certainly be wrong. The U.S. government was mistaken about the existence of weapons of mass destruction (WMDs) in Iraq in the early 2000s, and, in a less politically fraught case, scientists thought for many years that humans had twenty-four pairs of chromosomes instead of twenty-three. Looking at what the acknowledged authorities say is not the last step in evaluating claims, but it is a good early step.

Experts talk in two different ways, and it is vital that you know how to tell these apart. In the first way, they review facts and evidence, synthesizing them and forming a conclusion based on the evidence. Along the way, they share with you what the evidence is, why it’s relevant, and how it helped them to form their conclusion. This is the way science is supposed to be, the way court trials proceed, and the way the best business decisions, medical diagnoses, and military strategies are made.

The second way experts talk is to just share their opinions. They are human. Like the rest of us, they can be given to stories, to spinning loose threads of their own introspections, what-ifs, and untested ideas. There’s nothing wrong with this—some good, testable ideas come from this sort of associative thinking—but it should not be confused with a logical, evidence-based argument. Books and articles for popular audiences by pundits and scientists often contain this kind of rampant speculation, and we buy them because we are impressed by the writer’s expertise and rhetorical talent. But properly done, the writer should also lift the veil of authority and let you look behind the curtain to see at least some of the evidence for yourself.
