I’m a researcher and a data nerd. (My coworkers call me McData.) But I’m also a communications professional, meaning I have the challenging job of making technical and jargon-heavy empirical research consumable for a general audience.
So I was surprised to see Benjamin Wallace-Wells offer two (gentle) critiques of FiveThirtyEight that target qualities I actually view as the site's profound strengths. A quick look at one of the launch articles—on the value of toilet-seat covers, of all things—shows why.
The article does a lot of things well. First, it’s a highly engaging read about toilets (and tagged pithily under “bathroom safety”), meaning it’s accomplished a key goal of modern journalism: to attract clicks and eyeballs. Second, it addresses a real and substantive public policy need: do toilet seat covers reduce illness? (As Wallace-Wells says, “good question!”) Third, it sets up a potentially valuable cost-benefit framework for medical care providers and private-market providers of toilet seats and their covers.
In just 600 words, it does all of those things while also reviewing the academic literature. And yet Wallace-Wells was disappointed that it ultimately “threw up its hands” and concluded that the data just weren’t available to put the question to bed.
I can’t state strongly enough how important and valuable I think it is for researchers and journalists to articulate clearly the limitations of their data and analysis.
I fear (and have observed anecdotally) that, among general audiences, articles that incorporate data and throw out some statistical terms tend to acquire a veneer of authority. Oh, they have data and found some correlations, so now we know something.
Good social science research produces averages or estimates of outcomes that are presented alongside confidence intervals; we’re rarely sure of anything. (And comparatively little research reliably addresses the causality question.) What’s more, for many good and important questions we simply lack the data, the methods, or the resources necessary to produce a good estimate.
Does that mean social science isn’t useful? Not at all. FiveThirtyEight has concisely summarized what we do and don’t know about toilet seat covers and illness, while clearly identifying where the private market could invest in more research, potentially saving money and/or improving lives.
In the grand scheme of public policy, perhaps toilet seat covers seem a bit trivial. But consider them (and this article) exemplars for countless topics, and ask what the alternative is: continuing to produce and market covers blindly? Simply assuming they have no value and ceasing production? Making (policy) decisions without any credible evidence of their consequences is guesswork. We can and should do better.
Wallace-Wells also worries that most of FiveThirtyEight’s stories (that is, those published in its first 48 hours) merely “tended to amount to a careful reading of existing academic studies.” Indeed, the toilet seat article is a mini (if effective) literature review. But conducting and interpreting empirical work is a craft, and often a nuanced and difficult one.
Moreover, there is a tremendous amount of relevant and high-quality research underway every day. Academics, lamentably, are notoriously bad at presenting their work in approachable and engaging ways. Far too much valuable research languishes on university and think tank websites, unread and unused.
The flip side is that journalists often struggle to wade through dense methodology sections and appreciate just how precise and rigorous a given study’s findings are (or, often, are not).
So a popular news organization (FiveThirtyEight already has 140,000 Twitter followers) that deeply gets empirical research and spins it every day into interesting and useful stories could fill a huge hole.
The toilet seat article is pithy and fun, it addresses a public policy issue, it reviews the literature, and it could benefit private organizations. I wish every good empirical study got the same treatment.
Illustration by Tim Meko of the Urban Institute. Follow Zach McDade on Twitter.