Urban Wire When you don't like the results, criticize the methodology
Margery Austin Turner

Dylan Matthews of the Washington Post produced a terrific flow chart last week showing "how to argue with research you don't like." I think I've heard all the critiques he offered up—and used a few myself. My favorite: "I urge you to tell the thousands this program has helped that it has failed to make their lives better."

After I stopped laughing, I was struck by how vividly the flow chart reflects Washington's current love affair with experimental studies: randomized controlled trials. I love this methodology too. It's often the best way to find out whether a program really works—whether it produces the outcomes it promises, and at what cost.

But "does it work?" isn't the only question that research can help policymakers answer. And some very interesting and promising programs aren't amenable to controlled experimental designs. Depending on the question being asked, we should consider the "gold standard" methodology to be the one that assembles reliable data, applies the right tools, and is interpreted with intellectual honesty. For example:

  • microsimulation models (like the Urban-Brookings tax policy model) can forecast outcomes under a wide range of “what if” scenarios;
  • administrative data from public agencies can be systematically linked and analyzed to answer questions about program design and implementation; and
  • sometimes, fully diagnosing a complex problem, designing an innovative solution, or understanding exactly how a program should be implemented requires more nuanced, qualitative information gathered through in-person observation, in-depth interviews, or focus groups.

Instead of defaulting to a single tool, policymakers and practitioners should look to researchers to draw from a portfolio of tools. And smart reporters should be asking whether we've chosen the right methodology for the question we're trying to answer. Using the wrong tool can produce misleading information or fail to answer the questions that matter most when a decision is being made.

Flowchart from Dylan Matthews of the Washington Post.
