Easier access to nationally available raw data and wider interest in making this data usable for different audiences have led to growth in the number of online health and community indicator platforms. Each platform seeks to engage its audiences with data through maps, charts, and tables. These platforms offer a variety of indicators on health outcomes and behaviors as well as indicators on social and economic factors, physical environment, and demographics. In addition to data, many of the sites offer various features and services, such as case studies, reports, policy suggestions, and technical assistance. While the general motivations of these sites are similar, each one offers slightly different perspectives, indicators, tools, and frameworks to meet its intended audiences' needs.
County Health Rankings & Roadmaps (CHR&R) is an example of these platforms, offering a variety of county-level data, tools, features, and services for local community stakeholders and leaders. A partnership of the University of Wisconsin Population Health Institute and the Robert Wood Johnson Foundation (RWJF), CHR&R is used throughout the public health field and also has a growing presence outside of traditional health sectors. CHR&R's County Health Rankings informs communities about their health status and the broad range of factors that influence length and quality of life. The Rankings also provide context on how the health of a given county compares with other counties within the state. The Roadmaps aspect of CHR&R provides evidence-based interventions and other tools for communities to better understand and use the Rankings data to improve health outcomes and the opportunities to be healthy. In addition, CHR&R provides coaching services for local communities on their health improvement journey. In providing these resources, CHR&R aims to bring together individuals from different facets of a community to collaborate and make a long-term impact on health. For more information about the development and history of CHR&R, see Catlin (2014).
CHR&R and RWJF are beginning a strategic planning process to envision the future direction of CHR&R. To inform this planning process, the foundation contracted with the Urban Institute to examine similar websites that provide data and other related content. The charge was to describe the current landscape of online health measurement platforms and identify potential ways to more thoughtfully collaborate with other organizations to make progress in improving the health of local communities.
We conducted our research over three months, from February 2016 to May 2016, in collaboration with RWJF and CHR&R. Our review relied on two data collection methods: (1) a website scan of 15 online platforms that focus on disseminating data on health and social determinants of health and (2) telephone interviews with site managers of five of those platforms.
The Urban Institute drafted a preliminary list of 24 potential websites for review featuring health indicators and social determinants of health (see appendix A). Both RWJF and CHR&R provided input to help narrow the list to the 15 websites included in the website scan. In narrowing our selection, we chose sites that had relevant topics and indicators; that focused on a variety of geographic levels for their data and audiences; and that had potential synergy with CHR&R. Below are the 15 websites included in our review.
- AARP Livability Index
- America’s Health Rankings
- Community Commons
- Community Health Status Indicators
- CPD Maps (HUD Exchange)
- Diversity Data
- Gallup-Healthways Well-Being Index
- HealthLandscape (American Academy of Family Physicians)
- Kansas Health Matters (as an example of the Healthy Communities Institute platform)
- KIDS COUNT Data Center
- Measure of America (including American Human Development Index)
- National Equity Atlas
- Opportunity Index
- PolicyMap
- WhyNotTheBest.org
The website scan documented each website's purpose and audience, the types of data tools offered on the site, features and services offered, and the site's communications methods. A list of the websites we reviewed and the corresponding links to access them is included as appendix A. Some fields were closed-ended with simple yes/no responses, while others offered the reviewer the opportunity to comment or provide additional context. The summary and detailed tables for each site are included in the supplemental workbook.
We also identified five of the websites from the website scan to participate in more detailed telephone interviews. The five websites were selected based on input from RWJF and CHR&R. Listed below are the organizations and corresponding websites whose representatives participated in the telephone interviews.
- Annie E. Casey Foundation, KIDS COUNT Data Center
- Measure of America, Mapping the Measure of America
- Institute for People, Place and Possibility, Community Commons
- PolicyLink, National Equity Atlas
- Healthways, Gallup-Healthways Well-Being Index
The interviews focused on topics from the website scan and included additional questions about collaboration and evaluation. Each interview lasted between 45 minutes and an hour.
All figures in this memo with website images are hyperlinked to the corresponding web address where the image was created. The one exception is Figure 6, which relies on Flash and does not have a unique web address.
Purpose and Audience
All the sites included in our review aim to mobilize and empower their audiences to use data to advocate for policies and investments or make informed decisions, most at the state or local level. Broadly, the purposes of the 15 websites fit into four different categories:
- Explicit health or health care focus: America's Health Rankings, Community Health Status Indicators, Healthy Communities Institute, HealthLandscape, and Why Not the Best
- Explicit racial equity frame: Diversity Data and National Equity Atlas
- Index of well-being or livability: AARP Livability Index, Gallup-Healthways Well-Being Index, KIDS COUNT Data Center, Measure of America, and Opportunity Index
- Broad resource for indicators for local action: Community Commons, CPD Maps, and PolicyMap
These categories describe the overall frame for the websites, but all of them have data related to health and data that could be used to advance community health or equity.
A subcategory of the sites we reviewed focused on specific subpopulations: Annie E. Casey's KIDS COUNT Data Center, which provides users information on child well-being, and AARP's Livability Index, which focuses on livable cities for older adults. These two sites were not completely analogous. The AARP Livability Index, while designed to capture livability measures that affect older people, has data and measurements relevant to users of any age. The KIDS COUNT Data Center, on the other hand, focuses solely on children and curates its data, indices, and data tools to describe that subpopulation.
Nine of the 15 sites had a conceptual framework for the selection of indicators. The frameworks were portrayed through the creation of indices and rankings or as theories that informed the indicator selection. As an example, the survey that supports the Gallup-Healthways Well-Being Index asks respondents about several factors of well-being. The Index combines the results of each response into one measure of well-being. These data are also accessible on a companion website known as Gallup Analytics (analytics.gallup.com), on which users can make health and well-being comparisons to over 160 countries. As another example, Community Health Status Indicators uses Evans and Stoddart's framework that promotes healthy communities through the lens of social and physical factors that can influence health behaviors. Using this framework, they focus on health outcomes of mortality and morbidity and the factors that can affect them, such as health care access and quality, health behaviors, social factors, and physical environments.
In addition to these nine sites with an explicit framework that guides indicator selection, two of the respondents we spoke with, from Community Commons and the KIDS COUNT Data Center, also mentioned that the indicators included in their sites represent deliberate choices about the key elements necessary to describe a community. Furthermore, Community Commons has posted a theory of change describing how the site intends to effect change.
Each site serves an audience consistent with its organizational mission. Some of the stated audiences are purposefully broad to be inclusive of many users (community change advocates, equity advocates, policymakers, and community members) while others are focused narrowly on a specific audience (health professionals, hospital administrators, and selected US Department of Housing and Urban Development grantees). Some also listed technical professionals like journalists, researchers, or philanthropists among their intended audiences. All the sites we reviewed had some or all of their features freely available to the public and accessible by audiences that were not explicitly stated. (Some features of PolicyMap are only available to subscribers, and some Community Commons hubs are restricted to certain audiences.)
HealthLandscape and Why Not the Best are the only two websites that we examined where health care professionals and hospital administrators are the primary stated audiences. Their focus on health care professionals as a primary audience leads to selecting data about administration of clinical care and disease prevalence rather than health outcomes and social determinants of health. Consequently, data and information from these sites are less relevant to broader audiences.
Based on our review, two-thirds (10 of the 15 reviewed) mentioned policymakers or policy leaders as intended audiences for their sites. In addition, 3 of the 15 sites mention community members explicitly as an audience: Community Health Status Indicators, Healthy Communities Institute's Kansas Health Matters, and AARP Livability Index. It is likely other sites also consider individuals part of their audiences. For example, Community Commons, National Equity Atlas, and the Gallup-Healthways Well-Being Index serve users working for community change, including those working outside of professional roles.
Six of the 15 websites are relatively new and were launched in 2010 or later. Other websites have a much longer history, like America's Health Rankings, KIDS COUNT, and Community Health Status Indicators. The data provided by these organizations preceded their websites and was disseminated through annual reports (e.g., the America's Health Rankings annual report, the KIDS COUNT Data Book, and the Community Health Status Indicators hard-copy report) before the creation of their online platforms.
Slightly more than a quarter of the websites in the scan (4 of the 15 reviewed) required registration. Of the sites we interviewed, Community Commons requires registration, and Measure of America requires people to share their e-mail addresses to download the underlying data. Registration helps site managers know who is using the site and offers an opportunity to reach out to users for evaluation, though some view it as too much of a barrier for potential users.
Each site provides data tools and data visualizations for its users. The site's target audience plays a role in the types and level of customizability of the tools. Some websites guide users to predesigned data visualizations and offer minimal ability to customize views (e.g., Opportunity Index, Measure of America, National Equity Atlas, and Healthy Communities Institute). These seem well suited to users without technical expertise in analyzing data. Other websites allow greater flexibility, enabling users to create custom maps, charts, and tabulations based on their needs (e.g., Community Commons, KIDS COUNT Data Center, PolicyMap, HealthLandscape, and Why Not the Best).
These approaches do not have to be mutually exclusive if a site offers different paths to explore the data. For example, Community Commons offers fairly extensive customization, but the website manager reported efforts to add more curated content to help nontechnical users first engage with data through a story and then dive into the more advanced features. Community Commons' "Access Economic Data," "Physical Environment Data," "Equity Data Report," and "Food Environment Report" all give users the opportunity to run geography-specific reports that explain the importance of the indicators listed in each report. Each report breaks indicators into the subcategories Demographics, Social and Economic Factors, Physical Environment, Health Behaviors, and Health Outcomes for each of the given topics. Figure 1 shows an excerpt of the Food Environment Report for the District of Columbia, listing the "Food Insecurity Rate" and "Food Insecurity-Food Insecure Children."
The report offers users valuable information about their local food environment without requiring knowledge of which indicators are relevant to the topic or how to build each visual with the site tools. The report lists several indicators within each of the subcategories. Examples of additional indicators in the Food Environment Report for the subcategory of Social and Economic Factors include Children Eligible for Free/Reduced Price Lunch and Population Receiving SNAP Benefits. These topic-based reports are an easy introduction for users to review a cluster of factors related to a particular community concern.
Eleven of the 15 sites offered users mapping tools. The sophistication of the maps and level of interactivity varied across the sites. Some maps were static, some allowed limited interactivity, and others let users import data to create their own customized maps. Sites that had integrated their framework within the mapping instruments were much easier to navigate. Figure 2 shows the Opportunity Index mapping feature. It is a good example of a map with some limited interactivity and an integrated framework that provides insightful information on a selected geography without being overwhelming. The overall score as well as each contributing component is clearly reported along the bottom of the screen. Additionally, users can dive into each individual indicator that contributes to the framework using the drop-down box (highlighted with an orange oval). Figure 3 shows the Diversity Data map of the unemployment rate by metropolitan area. It is a good example of a map with minimal interactivity: users can hover over each metro to view the specific unemployment rate, but they are unable to zoom to various geographies or click on map items to investigate further information about the conditions contributing to unemployment.
Interactive Graphs and Charts
About half of the sites we reviewed offered charts and graphs as a way to visualize the data. Most graphs and charts offered minimal customization. User interactivity was generally limited to adding geographic selections for comparisons or hovering over bars or lines to show the indicator value. Measure of America and National Equity Atlas were two of the more robust websites for interactive graphs and charts. They offer intuitive interfaces to create comparisons across geographies and indicators, adding measures of interest to the user. Figure 4 demonstrates Measure of America's charting capabilities. As shown in the upper left-hand corner of the image, users can select various indicators and geographies through a drop-down box, offering easy-to-use customization. Users can also sort and organize the data by different indicators.
Trends over Time
Ten of the 15 sites had tools that allowed users to see trends over time for some or all of their indicators. Trends were most often charted in time series graphs based on a user’s indicator selection. Some sites offered custom reports that presented the data across all indicators for different user-selected years. The advantage of the custom report is that the user is not restricted to observing trends in one indicator at a time. Users can spot broader trends across many indicators.
Each website in our review approached its data content in slightly different ways. The sites had different indicators, different ways of organizing them, and different geographies for their data. Some websites included indicators that could be broken down by various subpopulations, like race/ethnicity, age, sex, nativity, and household composition; others did not. The primary geographies for each website varied, but almost all focused on data below the state level.
All of the sites except one shared their metadata to help users understand the meaning and sources of the indicators. Most indicator definitions were easily accessible, clearly written, and included the date of the source data. The one exception stored the metadata as a part of their methodology, which made it more difficult to find since it was not directly integrated in their platform.
The content of each website's data varies based on its mission and audience. However, the indicators generally fell into just a few topic areas: (1) health outcomes, (2) health behaviors, (3) clinical care, (4) social and economic, (5) physical environment, and (6) demographics. Figure 6 shows a summary of the various broad topics covered by each website. Social and economic factors were the most widely covered, appearing in all but one website. This is not surprising, since our review focused on health platforms and community indicator platforms, and social and economic factors are relevant in both of these approaches. Few of the sites reviewed offered indicators on clinical care and health behavior. This could reflect the difficulty of accessing public aggregate indicators on these topics, or that our website selection does not adequately capture the sites available with this kind of data.
We noted a group of websites in the earlier section that offer unique data sources or indices of well-being. In these cases, the index structure facilitated users' exploration of the data. The Gallup-Healthways Well-Being Index survey in particular is notable because it was the only example of indicators of people's perceptions of health and well-being factors. It also provides data that are more recent than most of the nationally available sources.
In our discussions with site managers, they indicated that changes to data content originate in multiple ways. Four of the website managers mentioned they consider user feedback from comments submitted through general information e-mail addresses and from webinars and presentations as mechanisms to improve data content. Three of the respondents we spoke to (Community Commons, National Equity Atlas, and Measure of America) indicated that funding considerations were another contributing factor to the development of new content. The Annie E. Casey staff also consult a data work group within the KIDS COUNT steering committee about new content.
During our interviews, staff from four of the five websites suggested that users are seeking more fine-grained data about various subpopulations. Many sites already include indicators broken down by subpopulation. Ten of the 15 sites offered users some indicators for subpopulations; all of them had some indicators broken down by race and ethnicity. Seven of the sites we reviewed also provided some indicators by age; five of the sites provided some indicator breakdowns by gender. Some of the sites provided an easy structure for finding the subpopulation breakdowns, like the KIDS COUNT Data Center. Figure 5 shows how the KIDS COUNT Data Center presents its subpopulation selection. The subpopulation selection options are clear and easy to navigate for all users. The subpopulation indicators in other sites were difficult to identify unless the user knew what to look for.
All but two of the sites that were reviewed had data for geographies smaller than the state level. Eleven websites had county-level data. Tract-level data was less common, available in only four of the 15 sites. Tract-level subpopulation data was even rarer: only Community Commons and PolicyMap offered data for subpopulations at the tract level for selected indicators.
Features and Services
Features and services offer users additional ways to ask questions; connect with the data through policy interventions, blog posts, and case studies; and get in-depth help through professional and technical services. The features and services offered by websites can be the difference between users gathering data and information to educate themselves and users acting on the information they've learned. With the exception of CPD Maps and the Community Health Status Indicators, the websites that we scanned had some way of supplementing the data and tools, either through reports and blog posts, case studies, policy interventions, or technical and professional services.
Instructions for Use and User Feedback and Questions
All of the sites except CPD Maps had some way of communicating with site administrators should users have a specific question or wish to provide feedback, for example, if there were broken links or problems with data displaying. Almost all of these were "contact us" e-mail addresses. In one of our interviews, the respondent suggested that the "contact us" feature of their website is one of the primary ways they get feedback on things that need to be improved on the site.
Six of the 15 sites included information about policy or program interventions. These interventions were presented as ways to improve outcomes for communities on specific indicators. Some websites (National Equity Atlas, AARP Livability Index, America's Health Rankings, and HCI's Kansas Health Matters) have policy solutions directly integrated with their data tools, with each indicator linking to policy solutions. Others (Opportunity Index, Why Not the Best, and hubs like Salud America within Community Commons) offer policy solutions as separate sections of their website. Figure 8 shows how HCI's Kansas Health Matters presents policy solutions as part of its integrated approach. The figure shows Allen County's teen birth rate compared with the state overall, and in the upper right-hand corner, the suggested policy solutions to help reduce Allen County's teen birth rate.
Three of the sites linked to outside organizations that focus on improving outcomes for a given indicator, or to specific research supporting interventions that have been successful in other places. National Equity Atlas' measure on disconnected youth provides links to several resources, including the Promise Neighborhoods Institute, research from the Center for American Progress, CLASP, Education Opportunity Network, and Brookings. Each provides a different evidence-based practice to help reduce the number of disconnected youth in a community. For example, the policy intervention "Develop comprehensive youth employment systems across the education, juvenile justice, and child welfare systems" cites a CLASP report on the subject by Sara Hastings, Rhonda Tsoi-A-Fatt, and Linda Harris.
Ten of the 15 websites offered reports and analysis featuring data from their site. Blog posts were the most common format for this type of content, whether produced by staff, subject-matter experts, or the site's user community. Posts were typically shorter items targeted to relevant topics or specific uses of the data. The most common topics included new data points, deep dives into particular topics, approaches to different analyses, and new research from other sources. Whether through their blogs or another format, six of the 15 sites provided case studies focusing on specific communities or interventions. These case studies were sometimes a product of partnerships with local communities, as discussed below in the section on collaboration.
Four of the sites that we reviewed (Healthy Communities Institute, HealthLandscape, PolicyMap, and Community Commons) offered some kind of technical assistance or professional services to users. The services typically included further customization of the data platforms, strategic planning, or access to additional tools. As an example, Healthy Communities Institute sells a platform that helps states and local public health departments design data mapping and indicator tracking for specific measures. From the information we could gather, these services involved a separate cost for users.
Communications and Dissemination
Most (11 of 15) of the websites that we reviewed offered a newsletter. Most of the newsletters offered information directly related to the platform or data instrument itself. In other cases, the newsletters were more broadly directed at users of the general site, not necessarily the users of the data tool. Newsletter subjects included news on initiatives, updates on data, and advocacy for specific issues, topics, or campaigns. The frequency of newsletters varied; some arrived weekly, and others were not received during the study period.
All site managers that we interviewed described using social media to reach their audiences. Sites that have annual data or report releases also focused on traditional media and press releases to help spread the word. Site staff members also disseminated information through presentations at conferences, advertising in conference materials, and traditional advertising in print media. The KIDS COUNT Data Center also uses "national outreach partners" to help amplify the communication around release events. These partners receive advance notice about a release and are asked to help communicate the release to their mailing lists, social media followers, and contacts. (This may be analogous to CHR&R's Community Engagement Partners.)
Webinars provided an additional mechanism for communicating with users. Three of our interview respondents cited them as mechanisms they use for communication. Two of our interview respondents, Community Commons and National Equity Atlas, mentioned regularly surveying webinar participants to get feedback on the presentation and whether the information was useful. Some of the sites from our website scan posted old webinars, but the content of the webinars was not reviewed as a part of our scan.
Five sites from the website scan had a major annual report or data release. The sites that we spoke with about their major report and data releases remarked that usage spiked during the release period but then tapered off. To reenergize and reengage their audiences and help drive additional traffic to the instruments year-round, some sites are considering interim releases of data. For example, the Annie E. Casey Foundation is planning a fall campaign for the KIDS COUNT Data Center. Gallup and Healthways release periodic snapshots of the survey data with graphics that are posted to the Well-Being Index and Gallup.com websites.
The site managers in our interviews described a range of collaborations. Some arrangements were formal and others informal; some involved an exchange of money and others did not. Below we describe five types of collaborations: operating the site, providing analysis and engagement for a specific place, providing analysis and engagement for a specific topic, facilitating peer collaboration, and engaging communications partners through direct funding and assistance.
Operating the Site
In three of the five websites that were part of the interviews, two or more organizations participate in developing and maintaining the content and technology of the website (Community Commons, National Equity Atlas, and the Gallup-Healthways Well-Being Index). These sites benefit from multiple perspectives, different audience connections, and complementary skill sets.
National Equity Atlas
- PolicyLink is a national research and action institute advancing economic and social equity. They manage the development of the website and research agenda, the communications, and the engagements with local partners to produce regional equity profiles and policy agendas.
- The Program for Environmental and Regional Equity (PERE) at University of Southern California conducts research and facilitates discussions on issues of environmental justice, regional inclusion, social movement building, and immigrant integration. PERE provides and maintains the data and participates in strategic planning for the Atlas.
- The Institute for People, Place and Possibilities (IP3) is part of the team that runs CommunityCommons.org, which provides next-generation tools and skills to help organizations, communities, and governments use data to aid wise decisionmaking and achieve more effective results. IP3 houses and nurtures the Commons, keeps the network growing, supports the users, and brings new content to the site.
- University of Missouri Center for Applied Research and Environmental Systems (CARES) integrates technology and information to support decision making. CARES develops the technological aspects of the site.
- Community Initiatives (CI) provides on-the-ground, in-community support for multisector collaboratives and coalitions. CI helps guide strategy and partnership development, supports development of stories, and works with local groups who want to leverage the Commons' capabilities to advance their specific efforts, including helping them develop Hubs on the website.
Gallup-Healthways Well-Being Index
- Healthways helps clients understand the opportunities for well-being improvement in their populations and helps clients customize solutions to improve population health, such as quality improvement, community-based programs, or provider-directed well-being improvement strategies. Healthways maintains the Well-Being Index website, performs the analysis on the survey data, and blogs about their peer-reviewed research.
- Gallup delivers analytics and advice to help leaders and organizations improve their customer engagement, employee engagement, organizational culture and identity, leadership development, talent-based assessments, entrepreneurship, and well-being. For the Well-Being Index, they conduct the survey and write monthly articles on elements of the Index that are posted on Gallup's primary website, Gallup.com. They also have a companion site, Gallup Analytics (analytics.gallup.com), where subscribers can view individual measures over time by various demographics, create charts and graphs, and make health and well-being comparisons to metropolitan areas, states, and over 160 countries.
Other websites we reviewed rely on contributions from multiple organizations. From our scan, America's Health Rankings also involves two organizations: the United Health Foundation and the American Public Health Association. The American Public Health Association helps disseminate the rankings, and the United Health Foundation is the key funder that sets the direction for the rankings. Measure of America, in cooperation with Opportunity Nation, created the Opportunity Index, and Measure of America has calculated the Opportunity Index annually for six years.
The KIDS COUNT Data Center presents another example of collaboration. The Annie E. Casey Foundation contracts with the Population Reference Bureau and Child Trends to prepare the federally supplied data for the website. The network’s 53 state partners upload their local data compiled from state agencies and assist in disseminating the data and reports.
Providing Analysis and Engagement for a Specific Place
A second type of collaboration involved deeper analysis for specific places, reflecting the same interest in localized information as the previously mentioned requests for smaller geographic data. For example, the Sonoma County Department of Public Health commissioned Measure of America to look at well-being in the county.1 The project included a narrative report with findings and recommendations, an online mapping and charting tool, and onsite engagement with community advisory groups.
PolicyLink has done several regional equity profiles, which consist of customized reports using data and the overall framework from the online tool.2 The initial ones were funded by the federal Sustainable Communities program, and four others were locally initiated. Their latest initiative is funded by RWJF to do profiles with five health-equity coalitions. The manager of National Equity Atlas noted that these reports are most effective when action levers are identified and the profile amplifies community partners’ plans, including aligning with their timing.
Providing Analysis and Engagement for a Specific Topic
Another type of collaboration interview respondents described focused on specific issue areas. As one example, Measure of America’s partnership with the Opportunity Youth Network focuses on disconnected youth.3 In 2012, Measure of America began to produce data and analysis related to disconnected youth in order to have indicators for the United States similar to those published for OECD countries.4 The Opportunity Youth Network is an active user and promoter of these products. Measure of America staff keep the network apprised of relevant upcoming reports and new projects, such as one on conditions by race/ethnicity and gender, to help refine the understanding of issues for particular populations. Staff members give keynotes at the annual Opportunity Summits and network webinars to update the network on indicator trends, and are available to the network staff for ad hoc help, such as finding a statistic for a speech. The Opportunity Youth Network also supports Measure of America’s fundraising efforts, but no direct funding is involved in either direction. Measure of America decided to take on this work without a specific client because staff felt the issue was important and the organization could make a contribution. The mutually beneficial relationship advances attention and local action on an issue that both groups consider a priority. Other Measure of America partnerships, like the one with Opportunity Nation to create the Opportunity Index, involve a more formal arrangement and funding to research the indicators and develop the online tool.
Healthways collaborates with several academic institutions to conduct in-depth research on particular issues. For example, they are working with researchers at Clemson University School of Health Research, who are focused on public policy and social inequities. The team is interested in how unemployment, underemployment, and financial stress affect health and how community conditions (including indicators from County Health Rankings & Roadmaps) may mitigate the negative effects.
Facilitating Peer Collaborations
Community Commons was the only site we reviewed that enabled peer-to-peer collaboration among its users. The Community Commons platform provides users an opportunity to share data, see what other users are creating, and connect with other users who are working on community improvement efforts. The member feed allows users to see what is being posted across the site. Additionally, Community Commons is unique for its 450 Hubs, which are curated groups that allow members to share work. These are forums to help elevate specific issue areas that are included on the Community Commons site. The level of customization and complexity varies; some are a simple space for collaboration, while others require new programming. Some of the hubs are open to the public and are designed to engage a wide range of users. For example, the Salud America hub facilitates collaboration within its diverse national online network, as well as posts of research and policy interventions. Other hubs are for private use by select members, employees, or partners like the American Heart Association.
Engaging Communications Partners through Direct Funding and Assistance
The KIDS COUNT Data Center is unique in that it is tightly connected to the KIDS COUNT state network. The state organizations are funded by the Annie E. Casey Foundation to advocate for programs and policies to improve the lives of low-income children and their families. The Casey staff interviewed noted that “the KIDS COUNT Data Center is just as much theirs as ours.” Casey provides the grantees ad hoc technical assistance, opportunities for peer exchange, and training on topics like data and communications. The members of the network draw from the data center to do their own reports and presentations to their state-focused audiences. The foundation also funds a partnership with Child Trends, in which Child Trends links from its articles to related data on the KIDS COUNT Data Center. For example, if Child Trends publishes a story or post on infant mortality changes at the national level, it will link to the state data on the KIDS COUNT Data Center.
Staff we interviewed used a mix of methods to evaluate whether their site was meeting their users’ needs. All mentioned examining web statistics through tools like Google Analytics, though we did not collect figures. Key metrics included the number of unique users, the number of returning users, and the amount of time spent on the site. Community Commons also provides separate quarterly reports with these statistics to each of their Hubs.
All of our respondents acknowledged the difficulty of knowing how audiences use data websites to effect change in their communities, or even who is using the site. The website manager from Community Commons noted that the ultimate measure of success is what people do off of the site, which web managers capture in various ways described below. All the people we interviewed related informal stories of use or case studies. As one example, Community Commons has “Member Spotlights.”5 The “Impact” section of Measure of America’s website features content that ranges from more quantitative items (like citations or data downloads) to more qualitative summaries of their local engagements.
Influence often takes time. The Sonoma County Department of Health Services compiled a presentation one year following the Measure of America study mentioned above to present 50 ways that the information from the Portrait was used.6 These examples ranged from using the information in strategic planning, in successful grant applications, or in targeting programs.
More formally, Community Commons and the National Equity Atlas are conducting surveys with users. The former sent the survey to its list of registered users, since the site requires signing up for use. The Atlas survey was sent to the list of people who have signed up to receive news and updates, and simply asks in a text box how people are using the site’s data. The KIDS COUNT team has previously surveyed particular audience segments, like state legislators or county officials, to measure their awareness of the Data Center and how they may be using it as a resource.
Earlier this year, the Annie E. Casey Foundation published a report titled “From Project to Platform: The Evolution of KIDS COUNT” (Annie E. Casey Foundation 2016). It traces the initiative’s development from the hardcopy data book in 1990 through the introduction of policy essays and special reports. It describes the Data Center’s role in the initiative and the critical role of its state affiliates as communications and learning partners.
Future of Online Platforms
The field of online data platforms will continue to get more crowded. Three of the interviewees mentioned that they foresaw continued growth in the number of tools, but that the tools would tend to become simpler. Reasons driving this trend included the need to adapt to users accessing sites through mobile phones and tablets, which do not lend themselves to complicated interfaces, and the desire to make it easier for users, particularly those without extensive technical skills, to fulfill tailored tasks or information needs. Measure of America and PolicyLink mentioned the desire for greater integration of the different components of their sites so users could navigate more seamlessly among mapping, stories, and other resources.
Counter to streamlined tools, we noted earlier that four of the interviewed staff remarked on their audiences’ desire for more ways to break down the indicators, whether by demographics or geography. As an example, PolicyLink recently added data on population by Asian-Pacific Islander ancestry (Chinese, Japanese, etc.) in response to requests from the site’s users. Challenges remain in maintaining a user-friendly design as more data are added, and in ensuring data availability and reliability. Based on the Gallup-Healthways experience, the Healthways staff emphasized the need for more current information, without the lag of a year or more required for many of the data sources.
All those interviewed emphasized the need to focus on the value the websites provide to the users and making the data “actionable.” For example, the Gallup-Healthways team revised the Well-Being Index in 2013 to focus more on indicators that could be targeted by interventions. This theme was also reflected in PolicyLink’s strategy of providing advocates ready-to-go narratives. Two of the site managers commented on the need for tools that help organizations working together to track indicators over time or assist in managing a collaborative’s performance.
Ideas for Partnership
The managers shared a few specific ideas for CHRR to partner with the other websites, and all expressed interest in exploring options. In some cases, these teams are already interacting with the CHRR website. We have several additional suggestions based on the conversations. Since we did not complete a comprehensive review of CHRR partnering, some of these may already be CHRR practice.
One concrete way to partner is to share raw data. We understand that the Measure of America website already relies on some CHRR indicators. Increased publicity and outreach about the well-organized and easy-to-use “Download Full Data Sets” section could result in additional value and a greater variety of tools for users from the CHRR investment. Potential audiences could include the civic tech community, professors with assignments for undergraduate or master’s classes, and journalist networks.
There are already some examples of direct funding relationships connecting the hosting organizations and the Robert Wood Johnson Foundation (RWJF). The KIDS COUNT representatives indicated that RWJF already funded some of their local grantees through Roadmaps to Health Community Grants. RWJF is also supporting PolicyLink to work with health coalitions to develop equity profiles. These would be opportunities to explicitly leverage and connect with the CHRR website and services.
Cross-promotion of events like webinars and major releases like annual reports is already happening with some of the websites, but there seem to be opportunities to do so more systematically. From our scan, the coaching services that CHRR offers are unique. Depending on CHRR staff capacity to serve additional clients, the other health-related data platforms would be excellent vehicles for outreach to new community audiences who could benefit from the coaching.
For CHRR users (and the users of all of the data sites), it would be a service to have a summary document with brief guidance on which site is most useful for which questions, tasks, or audiences. This would enable users to be aware of and take advantage of complementary data and features on other websites. For example, guidance like this would help CHRR users access a broader array of data (indices and indicators by race or for smaller geographies) or tools like mapping or Community Commons hubs. The CHRR Tools section already lists at least some of these websites, but at minimum, a new category to segment out online data and tools would help users distinguish the websites that could help with data access and analysis from the hundreds of blogs, reports, or guides.
Beyond referrals, we see more potential for connected content development. Examples could include guest or reposted blogs and stories. If the organizations shared advance notice about relevant topics, CHRR could take the opportunity to highlight related data, policy interventions, or local action stories.
Longer term, the Community Commons manager identified the potential to advocate together for new data collection and development for the field. This could focus on inadequately covered topics like mental health, additional geographic levels (county-level child obesity, for example), or racial breakdowns. Participation in organizations like the Census Project, which advocates for quality Decennial Census and American Community Survey data, or the Association of Public Data Users, which connects users to many federal agencies, offers other venues to engage in the national conversation about what data investments could help those working to improve communities.
Finally, the individuals that we interviewed were all very generous in sharing their time, experiences, and insights. We believe it would benefit the CHRR team to continue to cultivate a community of practice of hosts of online data tools. This circle can serve as a source of mutual advice and shared knowledge for the CHRR team in particular and would strengthen all efforts to provide easy-to-use, relevant data for advocacy, decisionmaking, and community action.
- “A Portrait of Sonoma County,” Measure of America, accessed July 25, 2016,
- “Equity Profiles,” National Equity Atlas, accessed July 25, 2016, http://nationalequityatlas.org/reports/equityprofiles.
- “Opportunity Youth Network,” Aspen Institute, accessed July 25, 2016,
- “Disconnected Youth,” Measure of America, accessed July 25, 2016,
- For example, see Andria Caruthers, “Member Spotlight: Bringing ‘Fresh Food to All’ to Hamilton County, TN,” Community Commons, last updated April 11, 2016, http://www.communitycommons.org/2016/04/memberspotlight-bringing-fresh-food-for-all-to-hamilton-county-tn/.
- “A Portrait of Sonoma County,” County of Sonoma, accessed July 25, 2016, http://sonomacounty.ca.gov/a-Portrait-of-Sonoma-County/.
Annie E. Casey Foundation. 2016. “From Project to Platform: The Evolution of KIDS COUNT.” Baltimore: Annie E. Casey Foundation.
Burd-Sharps, Sarah, and Kristen Lewis. 2014. A Portrait of Sonoma County: Sonoma County Human Development Report 2014. New York: Measure of America.
Catlin, Bridget. 2014. “The County Health Rankings: ‘A Treasure Trove of Data.’” In What Counts: Harnessing Data for America’s Communities, edited by the Federal Reserve Bank of San Francisco and the Urban Institute, 59–73. San Francisco: Federal Reserve; Washington, DC: Urban Institute.
About the Authors
Kathryn Pettit is a senior research associate in the Metropolitan Housing and Communities Policy Center at the Urban Institute, where her research focuses on measuring and understanding neighborhood change. Pettit is a recognized expert on several small-area local and national data sources and on the use of neighborhood data in research, policymaking, and program development. She has conducted research on many topics, including student mobility, neighborhood redevelopment, federally assisted housing, and local housing markets and conditions.
Brent Howell is a research associate in the Metropolitan Housing and Communities Policy Center, where he contributes to projects studying housing access, youth homelessness, supportive housing for child welfare–involved families, and performance management. His work focuses on housing stability, the role of supportive services in housing, and the development and sustainability of affordable housing.
This guide was supported by the Robert Wood Johnson Foundation. We are grateful to them and to all our funders, who make it possible for Urban to advance its mission. In particular, the authors would like to acknowledge Andrea Ducas (program officer at the Robert Wood Johnson Foundation) and Bridget Catlin (retired senior scientist at the University of Wisconsin Population Health Institute and former codirector of County Health Rankings and Roadmaps). They provided invaluable input during data collection and comments to improve the draft. The report would also not have been possible without the generous contribution of the experiences and insights of the website managers we interviewed: Erin Barbaro (Institute for People, Place and Possibility, Community Commons), Laura Speer and Florencia Gutierrez (Annie E. Casey Foundation), Kristen Lewis (Measure of America), Lindsay Sears and Susan Frankle (Healthways), and Sarah Treuhaft (PolicyLink). Leah Hendey provided advice throughout the project, and Ruth Gourevitch provided helpful research assistance to complete the report.
The views expressed are those of the authors and should not be attributed to the Urban Institute, its trustees, or its funders. Funders do not determine research findings or the insights and recommendations of Urban experts. Further information on the Urban Institute’s funding principles is available at www.urban.org/support.