Reporting Numbers: Don’t Lie by Omission

By Keith Devlin @profkeithdevlin

How significant is the associated health risk?


Spending several years of my early mathematical career in Germany, I long ago adopted a European lifestyle habit of having a glass or two of wine with a meal. So naturally, I had more than a little interest in a recent article in the online outlet Medium, with the headline The buzzkill news about drinking alcohol, which kicked off with this lede paragraph:

It was so comforting to think that a daily glass of wine or a stiff drink packed health benefits, warding off disease and extending life. But a distillation of the latest research reveals a far more sobering truth: Considering all the potential benefits and risks, some researchers now question whether any amount of alcohol can be considered good for you.

My attention had in fact been drawn to the article by a family member who wondered if I should reassess my long-held assumption that a daily glass of wine may have overall, long-term, net health benefits. 

The “evidence” to support my assumption is a correlation between countries where moderate wine drinking is common and countries with longer average lifespans, coupled with a reasonable belief that engaging in activities we enjoy, which relax us and give us an overall feeling of well-being, tends to have a beneficial effect on our health. (There have also been reports of positive effects on the health of the arteries and a reduction in the likelihood of stroke.) Hardly scientifically conclusive, to be sure, but sufficient (at least for me) in the absence of any scientific evidence to the contrary.

But, according to the new article, I do need to reassess my assumption. The article cited a recent study of 500,000 adults in China, published in The Lancet, so it was not something I was going to dismiss. 

Going by the direct quotations of one of the scientists who conducted the study, it seems there is indeed now solid evidence that drinking as little as one glass of wine a day has a negative effect on your health.

While not doubting the new scientific evidence, however, I wondered if the situation was really as dire as the alarmist Medium article claimed. Over the years, I have grown weary of “gee whiz” journalistic articles about scientific advances and “doom and gloom” scare articles about discoveries in medicine, where the writer’s stance is not remotely supported by the actual evidence. Before I would entertain giving up my beloved dinner-table Pinot Noir, I wanted to know just what the new study had actually shown.

Was there a cause for alarm? Just how significant is that negative effect? 

For instance, is it sufficient to outweigh the positive effects on health and longevity of a relaxed lifestyle with regular, pleasurable moments, one of them being a glass of good wine with dinner? That particular question is hard to answer since we have no way to quantify those life benefits numerically with any accuracy.

But even if we ignore that factor, there remains a significant quality-of-life question: Does the study provide evidence sufficient to convince us (or at least me) to give up an activity we find pleasurable? This was the question that really interested me.

Fortunately, the article included just enough of the key data from the study for me to quickly do some simple arithmetical calculations to determine that the increased risk from drinking a glass of wine a day, while non-zero, is minuscule—well below the risk of ending up in the emergency room after an accident while taking a shower. Sales of wine are not going to suffer because of a change in my mealtime choice of beverage.
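The kind of back-of-the-envelope reasoning involved can be sketched in a few lines of code. The numbers below are hypothetical placeholders of my own choosing, not figures from the Lancet study or the Medium article; the point is only to show how a headline-grabbing relative risk translates into a small absolute one when the baseline risk is itself small.

```python
# A rough sketch of converting a headline relative risk into an
# absolute risk. ALL numbers here are hypothetical illustrations,
# not figures from the Lancet study.

baseline_annual_risk = 0.002   # hypothetical: 0.2% chance per year of the adverse event
relative_increase = 0.35       # hypothetical: a "35% increased risk" headline figure

# A "35% increased risk" is relative: it scales the (small) baseline.
new_annual_risk = baseline_annual_risk * (1 + relative_increase)
absolute_increase = new_annual_risk - baseline_annual_risk

print(f"Baseline risk:      {baseline_annual_risk:.2%} per year")
print(f"Risk with drinking: {new_annual_risk:.2%} per year")
print(f"Absolute increase:  {absolute_increase:.2%} per year")
```

With these made-up numbers, the scary-sounding 35% increase amounts to an extra 0.07 percentage points of annual risk, which is the figure an individual actually needs in order to make a decision. Reporting only the relative figure, without the baseline, is exactly the lie-by-omission this post is about.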

I did not need to consult The Lancet paper to perform these calculations. The writer of the Medium article provided all the information required to belie his alarmist stance. So, full marks to the writer for accurately reporting the key data, but low (or negative?) marks for pitching the story in an alarmist fashion that the data simply does not support.

You can read my brief summary of the alarmist “buzzkill” article, and my equally brief numerical analysis of the data that debunked the journalist’s conclusion, in my recent post for the Stanford Mathematics Outreach Project (SUMOP). The math, insofar as it deserves that description, is little more than a few rough-and-ready calculations based on “make-the-situation-worse” numerical simplifications. More number sense than arithmetic. My only reason for writing it up is that I think it provides an excellent illustration of the importance of developing number sense in our students as a crucial ability for life in today’s world. (The importance of number sense is a repeated SUMOP theme.)

The point of this post, however, is the importance of not only accurate but also adequate reporting of results based on statistical information. Many journalists seem poorly informed about how to present statistical information in a responsible fashion. Those of us in mathematics education should do all we can, whenever we have the opportunity, to try to rectify that problem.

We should also make sure that, when we ourselves present statistical data to mathematically lay audiences (in our teaching, speaking, and writing), we report responsibly.

Fortunately, there is an excellent resource for this. The BBC publishes (online) various Editorial Guidelines for its reporters, including an informative guide on Reporting Statistics.

The BBC Editorial Guide on reporting statistics responsibly is a superb resource


I strongly recommend it to anyone who is planning on addressing, or writing for, a general audience on a topic involving numerical data. It’s not long.

Let me leave you with three short extracts to whet your educator’s appetite.

Early on, the Guide lists some key questions you should, if you can, ask about the source of the statistics:

  • WHO has produced the statistics? How reliable is the source?

  • WHY have the statistics been produced and why now? What question is the statistic an answer to? Does the source have a vested interest or hidden agenda?

  • HOW have the statistics been compiled? What exactly has been counted? Are the underlying assumptions clear?

  • WHAT does the statistic really show? Does the study really answer what it set out to test? What are the producers of the statistics not telling you? Avoid automatically taking statistics at face value.

  • WHERE can you find the underlying data and is it available? 

The Guide goes on to observe:

When our output includes statistics, we should explain the numbers, put them into context, weigh, interpret and challenge and present them clearly. The statistics must be accurate and verified where appropriate, with important caveats and limitations explained. We should use a range of evidence to put statistical claims into context and help audiences to judge their magnitude and importance. Where claims are wrong or misleading, they should be challenged. 

Statistics can easily be overstated and exaggerated to make a story look dramatic. So it is important that we use statistics accurately, explaining any caveats and limitations where appropriate. We should report statistics in context to make them meaningful and ensure our audiences understand their significance, taking care to avoid giving figures more weight than can stand scrutiny. Beware of using statistics in headlines where figures are rarely meaningful without context.

And, a bit later:

Just because a number is very big or small does not make it substantial. Big and small numbers are difficult to understand without any context. Millions or billions are not part of our everyday experience so it is not easy to judge if they are actually big or not. 

As my SUMOP post makes clear, those particular guidelines would have served the author of that Medium article well. 

Though they would, of course, have deprived him of the ability to put out an attention-grabbing, alarmist story. #PassThePinot


