Community-Engaged Approaches to Evaluating a Collective Impact Effort

Research Report

Experiences Evaluating Family-Centered Community Change

Abstract

The Family-Centered Community Change (FCCC) effort, launched by the Annie E. Casey Foundation, supported local service partnerships over eight years (2012–19) in three neighborhoods with high poverty rates: one each in Buffalo, New York; Columbus, Ohio; and San Antonio, Texas. These efforts sought to develop more integrated sets of services to help adults and children succeed together in a “two-generation approach.” The Foundation took what it termed a “strategic coinvestor” approach to FCCC, building on existing community-change efforts with flexible technical assistance and other local partnership-directed supports. The Urban Institute and two other evaluation firms conducted different evaluation activities around the effort.

Casey’s strategic coinvestor approach to FCCC reflected the Foundation’s priority to bolster the local partnerships’ decisionmaking power in every aspect of FCCC, including evaluation activities. The expectations and goals for involving local stakeholders in the evaluation activities were modest at the beginning—chiefly, to ensure the activities were workable and beneficial to the local partnerships—but grew over time. Over the course of FCCC, the field of community-engaged evaluation methods also grew and developed, and the Foundation increasingly viewed these methods as key to its efforts to support racial and ethnic equity and inclusion (REEI) in all facets of its work.

As these expectations grew, the evaluators employed an expanding set of community-engaged evaluation methods (CEM). We define "community" in FCCC as program participants and the staff involved in administering the programs. CEM in FCCC included engaging local partnership staff in planning data collection and reviewing products. In the latter half of the effort, it also included engaging these staff, along with program participants, in interpreting preliminary evaluation findings during community events called Data Walks. Beginning in 2016, the Foundation also invited a select few program participants to learn about evaluation activities at FCCC stakeholder convenings.

CEM exists on a continuum from simply informing and consulting with community stakeholders about evaluation activities to developing strong partnerships with community members and empowering them to make final decisions about evaluation design and execution. Most CEM in FCCC sat at the lower end of this continuum, with evaluators informing or consulting with community members. In most cases, evaluators engaged partnership staff rather than program participants.

During interviews with the Urban Institute team about their experiences applying CEM in FCCC, partnership staff, Foundation staff, evaluators, and other stakeholders all agreed that CEM added value both to the FCCC evaluations and to community members. There was widespread agreement that Data Walks were empowering experiences for FCCC program participants, and evaluators felt that feedback from these participants added nuance to evaluation findings. Evaluators also felt that soliciting ongoing feedback from partnership staff on products improved the accuracy of evaluation findings and helped build stronger relationships with these staff.

However, despite this added value, stakeholders pinpointed challenges involved in using CEM. Program participants were engaged only at limited touchpoints and only in the later years of the evaluation, and evaluators and some Foundation staff felt that parents had been engaged in a tokenizing way. Stakeholders also noted that involving partnership staff in reviewing products took time away from their core responsibilities to serve families. Evaluators further emphasized that involving these staff in planning evaluation activities limited the range of activities all stakeholders could agree on, which ultimately meant the evaluation design did not include an outcomes study.

Despite these limitations, the FCCC evaluation methods included substantially more community engagement than external evaluations typically do. This effort was only possible because of a rare level of sustained support from the Foundation. The FCCC experience with CEM offers several lessons for incorporating these methods in future evaluations of community-change efforts that focus on shared decisionmaking, including the following:

  1. Take a community-engaged approach in selecting program and evaluation grantees. This includes selecting evaluators with cultural competencies necessary to gain community members’ trust.
  2. Ensure all stakeholders share a common understanding of and commitment to community engagement in both the evaluation and programming throughout all phases of the work.
  3. Foster a safe space for feedback by making clear commitments about how shared decisionmaking will work and following through on them.
  4. Work to build relationships before beginning evaluation tasks and throughout the engagement.
  5. Ensure local and outside stakeholders have the knowledge and skills to foster engagement and partnership.
  6. Set initial community engagement goals and priorities up front as well as a process to update those goals and priorities over time.
  7. Establish a commitment from the funder at the outset to adequately fund community engagement in effort design and evaluation.
  8. Appropriately compensate participants for their contributions to the community-engaged evaluation and minimize participation burden.

These lessons are timely given the increasing national recognition of the need to incorporate the perspectives of people with lived experience into decisionmaking about the programming that affects their lives.

Cross-Center Initiative