Urban Wire: Three ways to unlock the potential of evidence-backed programs at the local level
Connor Burwell

Researchers and local practitioners are like waiters and customers at a restaurant. If a waiter brings out food that the customer didn’t order, the customer will be confused. In the same way, when researchers deliver evidence that practitioners never asked for, that evidence is likely to go unused. Practitioners need to help decide what questions studies should address and be engaged in all phases of research.

John Scianimanico of the Laura and John Arnold Foundation provided this useful analogy at a recent discussion on encouraging the use of evidence at the local level, cohosted by the Urban Institute and the Forum for Youth Investment.

Local practitioners seek to improve their programs in many ways. And although they might choose different paths to improvement, all can benefit from evidence about which practices have proven most effective.

The discussion revealed three strategies for researchers and policymakers seeking to increase local implementation of evidence-backed programs and policies.

1. Develop relationships starting at the first research question.

Researchers often try to encourage local practitioners to participate in a study by focusing on selling the technical rigor of their experiments and findings. Vivian Tseng of the William T. Grant Foundation said this approach might not prove effective.

Researchers should instead engage practitioners and stakeholders at the early stages of their investigations so that program administrators, staff, and others in the community can see how the research questions are relevant to their missions. Tseng elaborated that building “trusting relationships between people and between organizations” has more power to influence practice decisions than a well-researched report landing on the desk of the director of a nonprofit organization.

Ruth Neild, director of the Philadelphia Education Research Consortium, advised that researchers shouldn’t start conversations by asking, “What research do you want us to do?” Instead, they need to ask local practitioners what their priorities are and start a dialogue about how research can inform that agenda.

2. Translate research into understandable language.

Local practitioners, especially those working with marginalized populations or communities, can feel disconnected from researchers who use complex, difficult-to-grasp terminology and who often seem removed from people coping with trauma.

Lili Elkins of Roca Inc., an organization devoted to keeping young people out of the criminal justice system and out of poverty, recounted an experience when researchers demonstrated a “complete inability to translate research language—for lack of a better expression—into English.” Matthew Billings, a leader within the Providence Children and Youth Cabinet, echoed these communications challenges between practitioners and researchers. “Research language is tough,” he said. “Who’s doing that translation?”

(From left to right) Lauren Eyster, Senior Fellow at the Urban Institute, leads a panel discussion with Ruth Neild, Director of the Philadelphia Education Research Consortium, Matthew Billings, Deputy Director of the Providence Children and Youth Cabinet, and Lili Elkins, Chief Strategy Officer at Roca Inc., during an event at the Urban Institute on Tuesday, July 24, 2018. Photo by Maura Friedman/Urban Institute.

Translation is the key. Framing research questions and findings using shared terminology is critical to getting community buy-in when it’s time to put the evidence into practice.

3. Seek evidence in areas most relevant to programs.

The federal government and foundations often make program funding contingent on participating in a formal evaluation or only fund approaches that are already backed by evidence. But these well-intentioned restrictions can create tension with nonprofit practitioners for various reasons:

  • There might be limited or no evidence on what works to support activities and services or populations in a particular program. Or if studies exist, they may cover programs in geographically different areas (e.g., urban rather than rural communities).
  • A program’s staff might not have enough time to take on added evaluation responsibilities.
  • Participating in randomized controlled trials might limit people’s access to services, at least for a while, which might be uncomfortable for staff.

When practitioners feel evaluation activities could be overwhelming, or when they feel forced to change their model to “fit into a very specific box,” as Lili Elkins explained, they might opt out of evaluation. “Program leaders may simply go with what seems to be working and not take the time to participate and work with data collection that’s required with rigorous research partners,” explained Urban Institute president Sarah Rosen Wartell.

To overcome these hurdles, researchers need to support practitioners as they attempt to meet evidence-based programming requirements. Researchers also must make sure evidence and findings are translated into operational terms relevant to the program at hand. Finally, when initiating new demonstrations and evaluations, researchers must engage practitioners from the beginning to address the questions and issues that will most help programs. Expanding the development of researcher-practitioner partnerships can amplify the impact of programs in communities.
