Helping communities become healthier places to live, learn, work, and play means attending to many interrelated factors. These include health factors such as access to clinical care and improvements in healthy behaviors, such as diet and exercise, but also social and economic factors, such as neighborhood safety, employment, housing, and transit. By monitoring these factors, we can identify avenues to create and implement evidence-informed policies and programs that improve community well-being and health.
The County Health Rankings, a collaboration between the Robert Wood Johnson Foundation and the University of Wisconsin Population Health Institute (UWPHI), aim to do just this. The rankings are unique in their ability to measure the overall health of each county in all 50 states on the multiple factors that influence health. The rankings provide communities with insights on a variety of factors that affect health, such as high school graduation rates, access to healthy foods, air pollution levels, income, and rates of smoking, obesity, and teen births. The model underlying the rankings underscores that much of what affects health occurs outside of the doctor’s office, and stresses that factors such as education, employment, income, and the environment play critical roles in determining health and life expectancy.
The goal of the rankings is to help stakeholders understand the many influences on health and vitality and inspire community-level change. My colleagues and I at UWPHI determine the rankings using measures from several publicly available, national data sources. We standardize and combine the measures, producing two overall rankings:
- Health outcomes: how healthy a county is now.
- Health factors: how healthy a county will be in the future.
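The standardize-and-combine step can be illustrated with a minimal sketch. The county names, measure names, values, and weights below are all invented for illustration; they are not the actual County Health Rankings inputs, weights, or methodology, which are documented on the program's website.

```python
# Illustrative sketch: standardize each measure to z-scores, combine them
# with (hypothetical) weights, and rank counties. All data are invented.
from statistics import mean, pstdev

counties = {
    "Adams": {"smoking": 0.22, "unemployment": 0.08},
    "Baker": {"smoking": 0.18, "unemployment": 0.05},
    "Clark": {"smoking": 0.25, "unemployment": 0.10},
}
weights = {"smoking": 0.6, "unemployment": 0.4}  # illustrative weights only

def z_scores(values):
    """Standardize a list of values to mean 0, standard deviation 1."""
    mu, sigma = mean(values), pstdev(values)
    return [(v - mu) / sigma for v in values]

# Standardize each measure across counties, then form a weighted sum.
names = list(counties)
composite = {n: 0.0 for n in names}
for measure, w in weights.items():
    zs = z_scores([counties[n][measure] for n in names])
    for n, z in zip(names, zs):
        composite[n] += w * z

# Lower composite score (fewer risks) means healthier, so rank 1 is best.
ranked = sorted(names, key=lambda n: composite[n])
for rank, name in enumerate(ranked, start=1):
    print(rank, name)
```

Because both toy measures are "lower is better," the county with the lowest weighted z-score sum receives rank 1.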
In this essay, I discuss the key lessons we’ve learned during the past decade about how to effectively design, display, and use rankings to mobilize data-driven action to address the multiple determinants of health.
The County Health Rankings has its origins in America’s Health Rankings, which since 1990 have ranked states on health indicators. Curious about why their state’s rankings rose and fell over time, researchers Paul Peppard, David Kindig, and Patrick Remington at UWPHI wondered if health, like politics, is local. They delved into measuring the health of Wisconsin’s counties and released the first Wisconsin County Health Rankings in 2003. During the next few years, leaders in other states became interested in using UWPHI’s approach, and in 2009, with funding from the Robert Wood Johnson Foundation (RWJF), we began our work to expand the rankings to other states. The following year, RWJF and UWPHI released the first national County Health Rankings, which led to widespread media coverage. A year after that initial release, wanting to help communities move from data to action, RWJF funded a series of activities known as Roadmaps to Health, which helps communities use the rankings data and engage stakeholders from multiple sectors in setting priorities and implementing strategies to improve health.
The 2014 rankings are based on 34 measures, with an additional 40 measures reported to provide context. Combined with the underlying data supporting the current rankings and all the data from prior years, this “treasure trove of data” now contains more than 1 million data points. These data, along with detailed documentation about calculation methods, are easily accessible and downloadable at www.countyhealthrankings.org. RWJF and UWPHI plan to produce rankings for at least four more years.
Why has RWJF committed to producing annual rankings of the health of every county in the nation? The rankings support RWJF’s goal to build a culture of health by raising awareness of the multiple factors that influence health and stimulating and supporting local action to improve health by addressing these factors (Figure 1).
Designing the County Health Rankings Model
The UWPHI team determined that to best educate lay users about how the rankings capture population health, a graphic model was needed to clearly depict both the types of measures included and how they are calculated. The design has evolved over time. Figure 2a shows an earlier version depicting health outcomes and health determinants, and Figure 2b shows the latest version. The design of the model has evolved to emphasize the role that factors such as education, jobs, income, and environment play in how healthy people are and how long they live. One notable change was in terminology: from health determinants to health factors, a term that is more intuitive for lay audiences. The newer model also conveys that policies and programs fundamentally influence a variety of health factors, which in turn shape community health outcomes.
The right sides of both models delineate health outcomes and health factors. But the newer model moves away from listing specific measures and, instead, combines the individual factors under broader headings (e.g., diet and exercise now encompasses physical inactivity and fruit and vegetable consumption). This allows the model to remain relatively consistent from year to year even while improvements are made to the underlying measures. Another distinction is the new color scheme. Throughout the County Health Rankings website, health outcomes (frequently described as “today’s health”) are depicted in green and health factors (referred to as “tomorrow’s health”) are depicted in blue. Such design changes may seem minor but can be important in improving communication about a complicated set of measures.
The adage “a picture is worth a thousand words” rings true for the County Health Rankings model. The model does double duty by both providing a high-level overview of how the rankings are constructed and by illustrating that many community factors contribute to health outcomes. For this reason, UWPHI makes this image available for download with no restrictions on its use, other than citing the source.
Allure and Perils of Rankings
A ranking is appealing because it simplifies complex data into an easily understood measure. Because it is headline-grabbing—and appeals to people’s competitive nature and desire to do better—a ranking can generate attention toward specific issues and prompt action by community leaders, politicians, funders, and community residents.
But the simplification comes with a cost—the loss of information. This can mean that the true differences in health standings between counties can be hard to gauge. For example, the top-ranked county may be significantly healthier than the county ranked second, whereas the county ranked second could be barely different from the county ranked third. The County Health Rankings tries to overcome this issue by assigning each county to one of four quartiles, communicating that differences among counties in the same quartile are generally less important than differences between the four main clusters of counties. (There are of course exceptions, as counties at the very bottom of a quartile may be similar to those at the very top of the next.)
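One simple way to realize the quartile grouping described above is an even four-way split of the within-state ranks. This is only a sketch of the general idea, assuming a state with 12 counties and an even division; the rankings' actual display logic may differ.

```python
# Illustrative sketch: assign within-state ranks to four quartiles.
# Assumes 12 counties ranked 1 (healthiest) to 12; purely hypothetical.
ranks = list(range(1, 13))
n = len(ranks)

def quartile(rank, n):
    """Map a within-state rank to a quartile 1-4 (1 = healthiest group)."""
    # Integer arithmetic gives an even split: ranks 1-3 -> 1, 4-6 -> 2, etc.
    return ((rank - 1) * 4) // n + 1

for r in ranks:
    print(r, quartile(r, n))
```

Grouping this way communicates that a one-place difference within a quartile is usually less meaningful than the gap between quartile clusters.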
Because the rankings cannot, by design, tell the complete story, people are encouraged to use the rankings as a starting point only. Rankings, for example, are relative, not absolute, and are thus not necessarily a reliable way to measure progress. A county’s ranking reflects not only its own performance, but also that of every other county in a state relative to it. If one county’s health improves at the same rate as every other county in the state, its rank will stay the same, masking the real progress the county is making. In addition, place-based rankings can be unstable for areas with smaller populations, meaning that some variation in ranking from year to year can be anticipated due to the lower reliability of estimates when numbers are small. In 2014, the team added a new tool to help communities measure progress using specific metrics, such as those on which the rankings are based, or measures from other data sources that better lend themselves to tracking over time.
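The point that relative ranks can mask absolute progress is easy to demonstrate numerically. In this toy example (all values invented), every county's smoking rate improves by the same amount between two years, yet the within-state ranks are identical.

```python
# Toy illustration: uniform improvement leaves relative ranks unchanged.
# County names and rates are invented for this example.
year1 = {"Adams": 0.22, "Baker": 0.18, "Clark": 0.25}
year2 = {c: v - 0.02 for c, v in year1.items()}  # every county improves

def rank(values):
    """Rank counties 1..n from lowest (best) rate to highest."""
    ordered = sorted(values, key=values.get)
    return {c: i for i, c in enumerate(ordered, start=1)}

# Ranks are identical across years despite real, measurable progress.
print(rank(year1) == rank(year2))
```

This is why tracking the underlying measures over time, rather than the ranks alone, gives a truer picture of progress.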
Another potential danger of rankings is that counties that rank highly within their state may not feel the imperative to improve. To offset possible complacency, for most measures the tool reports a “Top Performers” value: the level that only the top 10 percent of counties in the nation meet or exceed. Few counties are at or above this value across all measures, so this helps communities realize that even highly ranked counties have room to improve.
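The Top Performers benchmark is, in essence, a 90th-percentile value. Here is a minimal sketch under an assumed, simple even-split definition of the cutoff; the rankings' actual percentile calculation may differ.

```python
# Hypothetical sketch of a "Top Performers" cutoff: the value that only
# the best-performing 10 percent of counties meet or exceed.
def top_performer_value(values, higher_is_better=True):
    """Return the cutoff separating the top 10 percent of county values."""
    ordered = sorted(values, reverse=higher_is_better)
    cutoff_index = max(0, len(ordered) // 10 - 1)  # last slot in top decile
    return ordered[cutoff_index]

# With county values 1..100 and higher being better, the top 10 percent
# are the counties at or above the returned cutoff.
print(top_performer_value(list(range(1, 101))))
```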
On the flip side, counties ranked low can feel like “losers.” Our experience in Wisconsin showed that a common first reaction to low rankings is a mix of denial and anger. We’ve seen leaders in public health and health care sectors question the veracity of the data or feel that they were being blamed for things beyond their control. However, when provided with an explanation about the source of the data and engaged in a discussion about the many factors and stakeholders contributing to health, many community leaders reframe the results as a call to action.
Another peril is that rankings can perpetuate existing problems if decision makers choose to reward the best performers and penalize the worst. One of the challenges of producing any reporting system is that people will use data to suit a variety of purposes. We urge decision makers to use data from the county rankings to help allot resources to needier places and to recognize that improvements in health can come by investing resources in a variety of settings (i.e., not only in health care).
There is no one right way to either choose measures or combine them into a set of rankings, and we had to make several key decisions in constructing the rankings. First, we decided to rank counties within states rather than ranking all U.S. counties against one another. Because we want to spark local action, it is far more helpful for a county to see its ranking within its state than to be ranked as one among 3,143 counties in the nation. In addition, some measures are context-dependent and not comparable across state borders, making ranking among states ill-advised.
Second, we determined which measures to include in the rankings and how to weight them. We first looked for measures that are valid and reliable, available at the county level, preferably updated annually, and available at no or low cost. The five measures used to construct the health outcomes rankings (premature death, poor or fair health, physically and mentally unhealthy days, and low birth weight) are based on the most current data available that can be used to characterize the overall health of counties. Because we wanted the rankings to prompt policy and behavior change, an additional criterion was that the measures of health factors must be actionable. Although genetics clearly influence health, there is no policy change that can affect genetics; therefore, nothing in the rankings reflects this factor.
Our final guiding principle for selecting measures was that less is more. One of the purposes of the County Health Rankings is to engage people who do not traditionally consider themselves public health (or data) experts. We’ve learned from experience that too much data can be off-putting and confusing for users.
After the first year we had to decide to either leave the measures unchanged or encourage communities to explore new or additional factors. Leaving the metrics unchanged allows users to compile and track trends. However, allowing changes can offer new insights to communities when new or improved measures become available. The UWPHI team settled on a strategy of keeping the same measures for health outcomes but revising those for health factors as we identified better measures. We likened this decision to a grade point average, which provides a standard, overall metric even as the underlying courses change from term to term.
Finally, we had to decide how often to update the rankings. We ultimately decided to update the rankings annually even though some of the measures do not change significantly from year to year. Our rationale is that producing data on a regular basis facilitates widespread media attention and enables more people to hear the call to action each year.
On the whole, we attempt to make our decisions as transparent as possible and encourage discussion of the issues underlying our process. In this way, users will understand not only their rankings but also the underlying data and methods. However, we must continually balance the need for simplicity with the need for detailed explanation.
Visualizing Rankings and Underlying Data
Visualization tools help users with different levels of data skills find meaning in the data. The visualization approach used in the County Health Rankings builds on the organizational structure in Figure 2b. The County Health Rankings website relies heavily on tabular display of data. In many of the tables, users can sort data in different ways, and most tables are layered so users can delve deeper than the initial overview data display. A pull-down menu allows users to access data from prior years. Not surprisingly, fewer visits are made to the more detailed data pages on the website, but all the details and associated documentation are available for those who are interested.
Even with the layered structure, more than 70 measures for all 3,143 counties can quickly become overwhelming. Charts and maps help make the data more accessible. Graphs are useful for highlighting trends for individual measures. An interactive map draws users into the data. Maps add context well beyond what a data table provides. See, for example, the two maps in Figure 3. The health outcomes map (green) shows the location of the healthiest and least healthy Alabama counties in 2014, with the counties divided into quartiles, and a similar map (blue) shows where the counties stand on the factors that influence health. These maps show the strong association between health outcomes and the factors that determine health (lighter colored counties are the healthiest in terms of both outcomes and factors). In addition, the maps show that place matters, even in states known to be less healthy than others.
RWJF has also created add-ons, such as a Facebook application (Figure 4), that can organize data in a more visually appealing manner.
We have also wrestled with the question of how extensively we should use design elements to help users draw inferences from the data versus letting users themselves interpret the rankings data and identify concerns. We strike a balance by providing a guide that walks users through the data and features and that provides suggestions for how to interpret the data. In addition, users can turn on the “Areas to Explore” feature. This feature highlights the measures in a particular county that are significantly different from state or national averages.
As we expanded our efforts to all counties in all states, RWJF helped us think through our goals and develop a strategic communications plan to get our messages into the media. With the assistance of RWJF, its communications team, and County Health Rankings contacts in each state, we develop targeted press releases each year, including national, state, and local releases in some states. Because counties are ranked within states and not on a national basis, the County Health Rankings are best suited for state and local coverage, but national media outlets often press us to compare counties across states to create a national ranking of counties. Although declining to do so has cost us some national coverage, the strength of state and local coverage makes up for it. We have also learned that, however creatively we display data, we must discuss the data in a clear and compelling manner. We work closely with communications experts to develop messages for different audiences, focusing particularly on nontechnical audiences. This sometimes requires less focus on scientific precision and more on accessibility and comprehension.
Moving from Awareness to Action
Data alone do not spark action to improve community health. People need help determining their next steps and often want access to customized help. To respond to these needs, RWJF added the Roadmaps to Health in 2011. The roadmaps help users identify actions that can improve health. The Roadmaps to Health Action Center provides guidance and tools to support community health improvement. Users can access detailed guides that correspond to each of the steps in the outer circle of Figure 5 and information about specific steps that the entities identified in the middle of the circle can take.
Furthermore, because action to improve our nation’s health cannot be automated, the Action Center is staffed by full-time community coaches who provide guidance via e-mail or phone, in addition to in-person visits to communities that have indicated a readiness to collaborate in improving community health. The coaches work with individuals and teams in communities that are at various stages in their journey toward improved health. Coaches also teach community members how to use the rankings to raise awareness of the multiple factors that influence health, identify areas for improvement, and demonstrate how to investigate other data sources for a more detailed understanding of problems within their communities. Then, coaches engage with community members to select priority areas (based on data and other considerations), choose evidence-informed policies and programs, implement these strategies, and evaluate the success of their efforts.
Communities large and small are working to make their citizens healthier, increasingly focusing their efforts on the social and economic determinants of health. For example, the first release of the County Health Rankings in 2010 prompted rural Mason County, Washington, to focus on improving education pathways for its young people. “Mason Matters” and its partners are implementing career and college-readiness programs targeting youth in grades 4–8.
Roadmaps has also included grants to coalitions working to improve the health of people in their communities; grants to national organizations that are mobilizing local leaders and affiliates; and the RWJF Culture of Health Prize, a program to recognize communities whose promising efforts will likely lead to better health. Organizations such as United Way Worldwide and the National Association of Counties are using the rankings to inform their work.
Starting with data but now focusing on action, the County Health Rankings and Roadmaps program is helping communities create new pathways to better health. Representatives from local schools, churches, law enforcement, business, hospitals, government, nonprofit organizations, and ordinary citizens are coming together to improve health and develop innovative approaches to reduce smoking, expand access to healthy foods, increase high school graduation, develop more bike- and pedestrian-friendly neighborhoods, and much more.
Our work has taught us several key lessons. First and foremost, keen attention to communications and visualization strategies is required to effectively transform data into usable information. Many data analysts are not attuned to strategic communications planning, but this process was pivotal in helping us articulate what we wanted to accomplish through the rankings, and how to reach and motivate key audiences toward action. In addition, we learned a data tool is not a “Field of Dreams”—even if you build it, users will not automatically come. A compelling “hook” is essential to draw in users. In our case, the hook is rankings; other hooks may be appropriate in other situations.
A layered approach that allows users to choose the level of detail that best suits their needs is also critical for enabling users to navigate the data. Clear and accurate documentation is also important; users want and need to know the source of the data and how the data were collected, as well as more detailed information about time frames, sample sizes, and other details. We strive to help all our users—whether advocates, policymakers, or practitioners—unpack the rankings by offering the underlying data, additional tools, and guidance needed to move from data to action.
Additionally, the perfect must not be the enemy of the good; no perfect measures of health or its determinants exist. Instead, we strive to report the highest-quality data that we can obtain while acknowledging and reporting the limitations of our data.
Finally, compiling and presenting data in interesting ways are still insufficient to generate action; effective dissemination and customer service are essential. Developing appropriate messages and engaging the media (broadcast, print, online, social, etc.) are key components of a successful strategic communications plan. It is also important to prioritize responsiveness to media requests.
Going forward, a new Scientific Advisory Group of national experts representing key stakeholders will help guide the work over the next four years. One area for further exploration will be helping communities drill down from county-level data to more local data about neighborhoods and different subgroups of the population to identify and address disparities within counties. Ongoing improvements to the website and to the Roadmaps to Health program will ensure that communities can effectively translate available data into evidence-informed strategies to improve health and well-being.
 M. Beck, “How Healthy Is Your County? A New Data Trove Can Tell You,” Wall Street Journal, April 3, 2012, available at http://blogs.wsj.com/health/2012/04/03/how-healthy-is-your-county-a-new-data-trove-can-tell-you/?mod=WSJBlog.
 The Measuring Progress tool is available at http://www.countyhealthrankings.org/measuring-progress.
 The process of establishing weights for each component of the model was guided by historical perspective, a review of the literature on the effect of various factors on health outcomes, weights used by other rankings, our own analysis, and pragmatic issues involving communications and stakeholder engagement. See Bridget Booske et al., “Different Perspectives for Assigning Weights to Determinants of Health.” Working paper. (University of Wisconsin, Population Health Institute, 2010), available at www.countyhealthrankings.org/sites/default/files/differentPerspectivesForAssigningWeightsToDeterminantsOfHealth.pdf.