Last week, after I returned home from the grocery store — bags of bacon, lunch meat, and hot dogs in tow — my wife announced, “There’s a documentary you need to watch. It’s all about how this food is bad for us.”[1]

The film in question was What the Health, a documentary by Kip Andersen of Cowspiracy fame. Among the film’s central claims is that processed meat — and to a lesser extent, unprocessed red meat — can give you cancer, according to the World Health Organization. In fact, the WHO classifies processed meat in the same carcinogenic category (Group 1) as cigarettes, asbestos, and plutonium.

According to Andersen and his research sources, such a classification means that “Processed Meats Cause Cancer.” This is an extraordinary claim that affords us an opportunity to clear up some confusion about how scientists talk about evidence.

Let’s talk about where the confusion lies.

Precision vs. “Oomph”

When assessing the world, scientists (at least, in the life and social sciences) are primarily concerned with two things: the amount of confidence in a given finding, and the strength of that finding. The two sound very similar, but they are not the same. The difference is outlined by the economists Stephen Ziliak and Deirdre McCloskey, who distinguish between what they call precision and “oomph.”

Say a drug company has developed a new blood pressure medication and submits it for the requisite clinical trials to obtain FDA approval. In its simplest form, this process might involve conducting an experiment wherein 500 people with high blood pressure are recruited and randomly split into two groups. Half of all participants would be given the real drug (the experimental group), and the other half would be given a placebo (the control group). After a specified amount of time, blood pressure levels would be measured and the differences between the groups would be compared.

Let’s imagine that the experimental group exhibited an average systolic blood pressure of 149, while those in the control group averaged 150. This outcome seems unimpressive, but what if every single person in the experimental group ended up one point lower than when they entered the trial, while every single person in the control group stayed exactly the same?

In this case, we can be fairly confident that the drug had a real effect. However, it wasn’t very powerful. It was reliably mediocre at reducing blood pressure.

Now let’s imagine that the experimental group exhibited an average systolic blood pressure of 130, while those in the control group averaged 150. In this case, we can be fairly confident that the drug has a real effect, and a strong one.
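The distinction can be made concrete with a quick calculation. Below is a minimal sketch in Python using invented blood-pressure readings — not data from any actual trial — with Cohen’s d, a standard effect-size measure, standing in for “oomph.” In both scenarios the drug’s effect is perfectly consistent across participants; only the size of the effect differs.

```python
import statistics

def cohens_d(control, treated):
    """Standardized mean difference (effect size): how big the effect is,
    independent of how confident we are that it exists."""
    n_c, n_t = len(control), len(treated)
    pooled_var = (
        (n_c - 1) * statistics.variance(control)
        + (n_t - 1) * statistics.variance(treated)
    ) / (n_c + n_t - 2)
    # Positive d means the treated group's blood pressure is lower.
    return (statistics.mean(control) - statistics.mean(treated)) / pooled_var ** 0.5

# Hypothetical systolic readings with some natural spread (invented numbers).
control = [145, 148, 150, 152, 155]

# Scenario 1: every treated participant is exactly 1 point lower than a
# matched control -- very consistent, but a tiny effect (low "oomph").
treated_small = [x - 1 for x in control]

# Scenario 2: every treated participant is 20 points lower -- equally
# consistent, but a large effect (high "oomph").
treated_large = [x - 20 for x in control]

d_small = cohens_d(control, treated_small)
d_large = cohens_d(control, treated_large)
print(f"small effect: d = {d_small:.2f}")  # ~0.26, conventionally "small"
print(f"large effect: d = {d_large:.2f}")  # ~5.25, enormous
```

With a large enough sample, even the one-point difference would be highly statistically significant — that is the precision side. The effect size is what tells you whether the result actually matters.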

Hot Dogs and Cancer Risk

As it turns out, the WHO’s conclusions about the effect of processed meat consumption on cancer risk are much more like the former (low oomph) case than the latter (high oomph). In fact, its Q&A explicitly states the following:

Q: Processed meat was classified as carcinogenic to humans (Group 1). Tobacco smoking and asbestos are also both classified as carcinogenic to humans (Group 1). Does it mean that consumption of processed meat is as carcinogenic as tobacco smoking and asbestos?

A: No, processed meat has been classified in the same category as causes of cancer such as tobacco smoking and asbestos (IARC Group 1, carcinogenic to humans), but this does NOT mean that they are all equally dangerous. The IARC classifications describe the strength of the scientific evidence about an agent being a cause of cancer, rather than assessing the level of risk.

So the WHO is confident that consuming processed meat causes cancer — presumably, just as confident as they are that smoking tobacco causes cancer. But they are not saying that the two are equally carcinogenic. As such, comparisons between the two are wholly inappropriate. Another way of thinking about this is in the adage many of us learned in high school chemistry: the dose makes the poison.

The WHO has gone to great lengths to clear up the confusion over the dangers of consuming processed meat — a fact that Andersen neatly skips over.

Why We Need to Understand How Scientific Findings Work

We are bombarded by unscrupulous, “scientific” claims every day. We are told that a certain substance is good or bad for us — summarized in plain terms by official-sounding bodies. Politicians do it, too: Harry Reid claimed that the Zika virus causes blindness (it does not). Attorney General Jeff Sessions has lumped marijuana in with other drugs and declared his intention to reignite the drug war — a move that goes against all available evidence about marijuana’s supposed dangers.

Science is messy and complicated. Most of us — even the educated — are not trained to understand the nuances inherent to science. In my statistics courses, I constantly urge my graduate students to attend to both precision and oomph, yet most of them arrive never having encountered the distinction in their undergraduate training.

Getting people to attend to nuances in scientific findings is notoriously difficult, in large part because humans are what psychologists refer to as “cognitive misers” — we don’t like thinking too much about an issue if we can avoid it. After all, thinking is hard! As such, simple explanations like “X causes cancer, but Y does not” are incredibly appealing. It’s also a major reason why humans rely on heuristic reasoning, even when doing so is inappropriate.

Politicians, activists, and regulatory agencies take advantage of our natural proclivity for lazy thinking (or avoiding thinking altogether) in their attempts to influence our behavior.

Don’t let them. Do a little extra thinking. Explore the data for yourself. Ask questions.

And above all, enjoy your bacon.

[1] I am firmly of the opinion that “there’s a thing on Netflix you need to watch,” “there’s a book you need to read,” and “we need to talk” are among the most terrifying domestic utterances that occur with regularity. It’s worse in my case, as my wife is a well-educated, intelligent woman, so if she says I should look into something, I take her recommendation seriously.