R&D: The gold standard for evaluating comprehensive community initiatives
The HOST Demonstration is a place-based initiative testing an innovative approach to coordinating services for vulnerable youth and adults who live in public and assisted housing in four cities.
The things that make HOST exciting as a service initiative, its place-based design and its range of whole-family service approaches, also make designing a credible and rigorous evaluation challenging. HOST does not fit the model of many evidence-based social service interventions; it is not a standardized program or curriculum that specially trained providers deliver. Rather, as our most recent blog post describes, it is really a research and development (R&D) project, intended to fit the needs and dynamics of the specific residents who live in the four participating communities.
For HOST, the Urban Institute team provides a standard framework of resources, activities, and desired outcomes, as well as a set of mandatory components: intensive case management, low caseloads, employment and clinical services for adults, and innovative approaches for reaching children and youth. Beyond that, each of the four sites designs its own service model and uses its own approaches for outreach, programming, and youth interventions. The Chicago site is using a rewards model to encourage youth and parents to set goals and recognize achievements. The Portland site is developing culturally specific approaches for support groups and youth activities that will appeal to a largely immigrant population. And in D.C., which has extremely high rates of HIV and teen pregnancy, the youth programming focuses on sexual health and safety.
A “gold standard” randomized control trial evaluation is not appropriate—or even possible—for a complex, community-based program like HOST that is intentionally trying to develop creative new approaches. A better model for HOST—and for other comprehensive community initiatives—is an R&D approach: a rigorous implementation and outcome study that maximizes what we can learn from these promising and creative interventions.
For HOST, we collect administrative data from the sites, conduct quarterly site visits, and hold biweekly calls to monitor program activities and provide regular feedback. We also conduct focus groups with residents and twice-yearly interviews with program administrators and staff, and we are conducting a long-term outcome study using survey and administrative data.
The feedback from the research team has allowed the sites to continuously improve and refine their service models—and, we hope, ensure that they are delivering the most effective interventions possible. Changes that have come about through the research-practice partnership include improved coordination between the service teams working with youth and those working with adults; lower caseload ratios; additional clinical staff; and innovations in youth outreach and engagement that allow the programs to work effectively with children ranging from elementary to high school age. Focus groups suggested that families in Portland needed more targeted youth programming; findings from the baseline survey highlighted the need for lower caseloads and more mental health services. Staff interviews at both sites showed that lower caseloads, by giving staff the time to uncover deeper problems, also put more strain on them—and indicated that staff required more emotional support.
While our research strategy may not be the Cadillac model of evaluation, we like to think of it as the innovative new hybrid car everyone has been waiting for: something that might change the field of social science research and make it more relevant for policy and practice.