The nonpartisan Urban Institute publishes studies, reports, and books on timely topics worthy of public consideration. The views expressed are those of the authors and should not be attributed to the Urban Institute, its trustees, or its funders.
Note: This report is available in its entirety in the Portable Document Format (PDF).
Raw data that nonprofit organizations obtain from their outcome monitoring procedures, no matter how good, need to be processed and analyzed before the information can be useful to managers and staff. This guide, the fifth in the Urban Institute's series on outcome management for nonprofit organizations, describes steps that nonprofit organizations can take in performing this analysis.
This guide is unique in offering suggestions to nonprofits for analyzing regularly collected outcome data. The guide focuses on those basic analysis activities that nearly all programs, whether large or small, can do themselves. It offers straightforward, common-sense suggestions.
Probably the major concern, at least for some small organizations, is having the computer capacity to tabulate numbers that would otherwise have to be processed by hand. Fortunately, most groups today have basic computer capability, and easy-to-use software is readily available for the calculations. Any remaining programming problems can usually be solved with outside technical assistance, which is available in most communities.
This guide does not deal with the more complex analysis procedures that involve sophisticated statistical or mathematical knowledge. Those procedures are likely to be most feasible only when resources are available for in-depth studies.
The steps described here should be of major help in interpreting the collected outcome data and in making the data useful for decisionmaking, the ultimate purpose of any outcome measurement process.
Gordon W. Green
Vice President, Research
Any organization with an outcome measurement system will quickly accumulate a variety of data. Before those raw data can be used to help improve services, they need to be converted into usable information. This process is called analysis. Analysis is not a mysterious, mystical activity. Analyzing data is a normal part of life for almost everyone. The tasks involved are remarkably similar, no matter who is doing the analysis or what is being analyzed.
Analysis of quantitative data includes adding, subtracting, multiplying, dividing, or other calculations. However, it is also much more. Analysis requires human judgment. The combination of calculations and judgment often produces the best analysis. Analysis is as much about thinking as it is about calculating.
This guide suggests ways to extract information from outcome data through analysis, with the goal of using the analysis to help improve services for clients and to ensure better outcomes in the future. A separate guide in this series provides suggestions on how to effectively use the analyzed outcome information.
The steps presented here assume that the nonprofit organization has already selected outcome indicators and is collecting information regularly, as discussed in other guides.1 Once outcome data have been collected, they need to be turned into useful information.
These analysis procedures can be used each time the outcome data become available, whether monthly, quarterly, annually, or at whatever reporting interval the program uses. Done regularly, the analysis will provide the organization and its programs with a steady stream of key information about clients and results.
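To make the idea of regular tabulation concrete, here is a minimal Python sketch of tallying one outcome indicator by reporting period. The record fields and figures are hypothetical, chosen only for illustration; a program would substitute its own indicators and reporting intervals.

```python
# Hypothetical records: (reporting period, whether the client improved).
# A real program would load these from its own data system.
from collections import defaultdict

records = [
    ("2003-Q1", True), ("2003-Q1", False), ("2003-Q1", True),
    ("2003-Q2", True), ("2003-Q2", True),
]

# Group the outcome results by reporting period.
by_quarter = defaultdict(list)
for quarter, improved in records:
    by_quarter[quarter].append(improved)

# Report the percentage of clients who improved in each period.
for quarter in sorted(by_quarter):
    results = by_quarter[quarter]
    print(f"{quarter}: {100 * sum(results) / len(results):.0f}% improved")
```

Run each period, a tabulation like this yields the "steady stream" of outcome figures described above, ready for the comparisons in the steps that follow.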
Little technical background and only a basic knowledge of mathematics or statistics are required. More complex analytical procedures can provide more sophisticated and in-depth analyses, but these are beyond the scope of this guide. Such advanced procedures are likely more useful for special studies, or to help with large-scale program decisions, than for regular outcome-monitoring purposes. (The fourth section of this report briefly discusses some such special analyses that a nonprofit organization might attempt, when and if appropriate.)
The set of analysis steps described in this guide is listed in exhibit 1. These steps are recommended for each reporting period. An organization with multiple programs should establish separate analysis procedures for each program. A computer (rather than staff) can be set up to do most of the computations. A nonprofit organization new to outcome management and the analysis of outcome data might choose to focus initially on only a few of the procedures described here. As the organization becomes more familiar with outcome data, more questions will arise about the outcomes measured, and additional analysis will become increasingly useful.
This guide is more detailed and technical than others in the series. However, most of these techniques are really common sense.
Nonprofit organizations provide a wide range of diverse services. Inevitably, differences will exist in how analysis procedures are applied, from program to program and from organization to organization. The basics, however, are likely to be applicable to most programs and most outcome indicators. The steps can be used if the program has 25 clients, hundreds of clients, or thousands of clients.
Exhibit 1. Steps for Analyzing Program Outcome Data
Section I: Begin with the Basics
Step 1. Calculate overall outcomes for all clients
Step 2. Compare the latest overall outcomes with outcomes from previous time periods
Step 3. Compare the latest overall outcomes with pre-established targets
Step 4. Compare the latest overall outcomes with outcomes of clients in other, similar programs and with any outside standards
Section II: Delve Deeper into Client and Service Characteristics
Step 5. Break out and compare client outcomes by demographic group
Step 6. Break out and compare outcomes by service characteristics
Step 7. Compare the latest outcomes for each breakout group with the outcomes from previous reporting periods and with targets
Step 8. Examine findings across outcome indicators
Section III: Make Sense of the Numbers
Step 9. Identify which numbers should be highlighted
Step 10. Seek explanations and communicate the findings
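The first steps in the exhibit can be sketched in a few lines of Python. The client records, field names ("improved", "age_group"), and the 70 percent target below are all hypothetical, used only to show the arithmetic behind Steps 1 through 3 and the demographic breakout of Step 5.

```python
# Hypothetical client records for one reporting period.
clients = [
    {"age_group": "under 18", "improved": True},
    {"age_group": "under 18", "improved": False},
    {"age_group": "18 and over", "improved": True},
    {"age_group": "18 and over", "improved": True},
]

# Step 1: overall outcome for all clients, as a percentage.
overall = 100 * sum(c["improved"] for c in clients) / len(clients)
print(f"Overall: {overall:.0f}% of clients improved")

# Steps 2-3: compare with the previous period and a pre-established target
# (both figures are illustrative assumptions).
previous, target = 68.0, 70.0
print(f"Change from last period: {overall - previous:+.0f} points")
print(f"Target {'met' if overall >= target else 'not met'}")

# Step 5: break out the same outcome by demographic group.
groups = {}
for c in clients:
    groups.setdefault(c["age_group"], []).append(c["improved"])
for name, results in groups.items():
    pct = 100 * sum(results) / len(results)
    print(f"  {name}: {pct:.0f}% improved")
```

Even this simple breakout shows why Steps 5 through 7 matter: an acceptable overall figure can conceal a group whose outcomes lag well behind the rest.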
Who Should Perform the Analysis
Who performs the analysis will depend on resources and preferences. Here are some guidelines:
- Direct service providers (such as caseworkers) should not be given the added burden of data analysis. If a direct service provider is interested, however, they should be encouraged to examine the relevant outcome data. As discussed later, direct service providers should be a major information source for explanations of key findings and for help in interpreting the data.
- If resources are available, a staff member, full-time or part-time, can be assigned responsibility for much of the analysis work.
- Program managers are likely the most important people in the analysis process. They should take time to examine the findings, identify highlights and issues, and seek explanations for unexpected or disappointing outcomes.
- In very small organizations, one manager may assume the responsibility for examining and interpreting the outcome data. If the manager needs assistance with particular tasks, such as performing some of the specific procedures described in this guide, the manager should seek help, perhaps from a volunteer, a local college or university, or a consultant.
About This Guidebook
This guide presents 10 basic steps for turning outcome measurement data into useful findings, procedures that should be applied on a regular basis. The steps are grouped into three sections. A fourth section covers more in-depth analyses for use in special situations, and a fifth covers a few other points about analysis.
SECTION I: Begin with the Basics describes the first steps in assembling outcome measurement data into a format for analysis and completing some initial comparisons.
SECTION II: Delve Deeper into Client and Service Characteristics covers procedures to break out the results by client demographic groupings and by program type, and to review groups of indicators.
SECTION III: Make Sense of the Numbers includes identifying and highlighting the key issues that the examination uncovered and then seeking explanations for unexpected results.
SECTION IV: Special Analyses Using Outcome Information briefly identifies some special procedures that may be useful in further mining outcome information.
SECTION V: Final Points about Analysis lists some limitations that analysis users should be aware of and summarizes the benefits of analysis.
1 See the series on outcome management for nonprofit organizations published by the Urban Institute in 2003. Another excellent source of information is Measuring Program Outcomes: A Practical Approach, published in 1996 by the United Way of America, Alexandria, VA.
This guide was written by Harry P. Hatry and Jake Cowan of the Urban Institute and Michael Hendricks, an evaluation consultant.
The guide benefited considerably from suggestions from Meg Plantz, United Way of America; Thomas Smart, Boys & Girls Club of America; Ken Weiner, Professor of Mathematics, Montgomery College (MD); and Linda Lampkin, Urban Institute.
The editors of the series are Harry P. Hatry and Linda Lampkin. We are grateful to the David and Lucile Packard Foundation for its support.