
The International City/County Management Association (ICMA) was established in 1914 amid an atmosphere of broad-scale mistrust of city government. Its founding members sought to bring to municipal management a dedication to both ethics and professionalism to help restore public trust. Although the level of corruption in local government no longer compares to that of the early 1900s, several notable modern abuses of public trust in Bell, California; Detroit; and New Orleans demonstrate that fraud and gross mismanagement still occur. Throughout the first 100 years of ICMA’s history, the organization has seen demand for accountability and transparency in local government accelerate as national media highlight mistrust and fiscal challenges.

The remarkable growth in the availability and usability of relevant data has both enabled and driven the expansion of local government performance management to respond to these challenges. The introduction of computers in the workplace in the 1980s enabled local agencies to automate their administrative records, making data much more reliable and easier to collect and analyze. For example, the ability to easily sort data and tabulate results allowed observers to see whether agency performance was improving or declining, and by how much. New data management systems for geospatial analysis and mapping, customer service, financial planning and assessment, and web use all provide the means to analyze and compare how effectively and efficiently local government is being managed.

In just the past few years, timelier data and a greater variety of it have enabled entirely new ways of conducting municipal business. From mobile devices placed in city vehicles to sensors on water meters, thermostats, and traffic signals, new forms of data collection allow local governments to access unprecedented amounts of data in near real time. For example, the City of Rancho Cucamonga, California, has implemented a city sidewalk inspection program: a simple mobile app enables city staff to inspect sidewalks and document problems using nothing more than their smartphones. Even five years ago, such a program would have been technically impossible to institute.
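
To illustrate the kind of record such a mobile workflow might capture, here is a minimal sketch in Python of a hypothetical sidewalk inspection report. The field names, the `submit_inspection` helper, and the sample values are illustrative assumptions, not details of Rancho Cucamonga’s actual system.

```python
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
import json

@dataclass
class SidewalkInspection:
    """One hypothetical inspection record captured on a staff smartphone."""
    inspector_id: str
    latitude: float        # from the phone's GPS fix
    longitude: float
    defect_type: str       # e.g., "uplift", "crack", "spalling"
    severity: int          # 1 (cosmetic) through 5 (trip hazard)
    photo_filename: str
    recorded_at: str       # ISO 8601 timestamp

def submit_inspection(record: SidewalkInspection) -> str:
    """Serialize the record for upload to a work-order system (illustrative only)."""
    return json.dumps(asdict(record))

report = SidewalkInspection(
    inspector_id="pw-017",
    latitude=34.1064, longitude=-117.5931,   # roughly Rancho Cucamonga
    defect_type="uplift", severity=4,
    photo_filename="IMG_0412.jpg",
    recorded_at=datetime.now(timezone.utc).isoformat(),
)
print(submit_inspection(report))
```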

But to improve performance, governments must create mechanisms to integrate the data into the operational processes that determine the efficiency and effectiveness of government programs. Progress in this direction accelerated in the 1990s with the publication of Reinventing Government, which galvanized the practice of “results-oriented” management in the public sector.[1] The authors introduced the concept that government managers could use data to improve operations and meet citizens’ expectations just as the private sector does.

In 1994, ICMA established the Center for Performance Measurement, now known as the Center for Performance Analytics, to advance these ideas. The center established the first national database of more than 5,000 measures used by local governments to gauge performance. One of the great advantages of this database is that it allows local officials to see how their own performance compares with that of similar local agencies elsewhere. This sort of comparative analysis, followed by reflection on or study of possible causes, can help establish useful benchmarks and a more nuanced understanding of the forces behind organization-wide performance.
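
To make the comparative idea concrete, the sketch below shows how an analyst might rank one jurisdiction’s value for a single measure against a peer group. The measure, city names, and dollar figures are invented for illustration and are not drawn from the ICMA database.

```python
# Hypothetical peer values for one performance measure:
# residential refuse collection cost per ton (dollars).
peer_costs = {
    "City A": 62.0, "City B": 75.5, "City C": 58.3,
    "City D": 81.0, "City E": 69.4, "City F": 66.1,
}
our_city = "City E"
our_cost = peer_costs[our_city]

# Percentile rank: share of peers whose cost is at or below ours
# (for this measure, a lower cost is better).
at_or_below = sum(1 for cost in peer_costs.values() if cost <= our_cost)
percentile = 100 * at_or_below / len(peer_costs)

print(f"{our_city}: ${our_cost:.2f} per ton "
      f"({percentile:.0f}th percentile of {len(peer_costs)} peers)")
```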

By the late 1990s, a number of localities wanted to improve the effectiveness of their performance management efforts. The most prominent response was Baltimore’s CitiStat program, launched in 1999. CitiStat evaluates how efficiently city departments deliver services and measures their performance in meeting mutually agreed-upon service delivery goals. Like most such efforts, the approach entails a series of departmental meetings to review updates on a set of preselected performance measures. But CitiStat is distinguished by several features that motivate all participants to give it priority attention. Most important is the active, direct involvement of the mayor and other high-level officials: department heads typically run the meetings, and all staff members in attendance know the mayor regularly reviews the results. Second, the meetings are held regularly and frequently. As a result, staff make extra efforts to generate better data and to devise and track metrics that are reliable and meaningful, and they are more careful in setting performance targets. City departments have significantly improved their performance and saved the city money in the process. The approach, generically known as “PerformanceStat,” became a national model for using data to improve government performance and has since spread to many other U.S. cities and some state agencies.
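
At its core, a PerformanceStat-style review compares reported figures with agreed-upon targets before each meeting and flags the gaps for discussion. The sketch below shows that comparison logic in Python; the departments, measures, targets, and actuals are invented for illustration.

```python
# Hypothetical measures: (department, measure, target, actual, higher_is_better)
measures = [
    ("Public Works", "Potholes filled within 48 hours (%)", 90.0, 84.0, True),
    ("Sanitation",   "Missed refuse pickups per 10k stops", 12.0,  9.5, False),
    ("Parks",        "Work orders closed on schedule (%)",  85.0, 88.0, True),
]

def meets_target(target: float, actual: float, higher_is_better: bool) -> bool:
    """Return True if the actual value satisfies the agreed-upon target."""
    return actual >= target if higher_is_better else actual <= target

print("Flagged for the next stat meeting:")
for dept, measure, target, actual, higher_is_better in measures:
    if not meets_target(target, actual, higher_is_better):
        print(f"  {dept}: {measure} - actual {actual} vs. target {target}")
```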

The initial CitiStat process had a reputation for taking a tough-minded approach when performance targets were not being met. As the model has spread, it has evolved beyond that initial focus on poor performance to include continuous improvement. To foster collaborative problem-solving instead, some local governments have adopted a “think tank” approach that enables executives and other leaders to analyze service problems and propose new solutions when they are detected.[2] The focus is on using the data to learn what is working, what is not, and why—in other words, using data to provide a sound basis for devising and adjusting strategies to truly improve results.

Increasingly, the practice of performance management is evolving beyond performance metrics. Mapping, in particular, generates potent new information. For example, Minneapolis 311 staff mapped service requests for nuisance complaints by supervisory district and realized that the district with the most complaints had received twice as many as the district with the fewest.[3] Yet both district offices had the same number of support personnel. Likewise, the more complete data now available on the demographics of a neighborhood (for example, the proportion of children versus elderly residents) can be used to adjust the types of services provided in a neighborhood park or in social service programming, thereby increasing use rates and improving outcomes.
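
The workload comparison described here comes down to aggregating service requests by district and dividing by staffing. The sketch below runs that calculation on invented complaint counts and staffing levels; the figures are not the actual Minneapolis data.

```python
from collections import Counter

# Hypothetical 311 nuisance-complaint records, each tagged with a supervisory district.
requests = ["North"] * 240 + ["South"] * 120 + ["East"] * 150
staff_per_district = {"North": 3, "South": 3, "East": 3}

complaints = Counter(requests)
for district, staff in staff_per_district.items():
    per_staffer = complaints[district] / staff
    print(f"{district}: {complaints[district]} complaints, "
          f"{per_staffer:.0f} per support staffer")
```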

In an era of “big data,” ICMA has recognized the need for greater analytic capability. Local governments require real-time data to deliver services in their communities proactively. ICMA Insights, a new performance management software platform, automates data entry and introduces significant new tools for data mining, analysis, and data visualization. More important, the new platform enhances the ability of local governments to respond to citizen demands for greater transparency. Daily, weekly, or monthly monitoring of performance trends gives managers greater ability to use predictive analytics and adjust processes before failures or shortfalls occur.
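
One simple version of acting before a failure is to fit a trend line to recent readings of a metric and project when it will cross a service-level threshold. The sketch below does this with Python’s standard library (3.10+); the daily wait-time figures and the 75-second threshold are assumptions for illustration, not ICMA Insights functionality.

```python
from statistics import linear_regression  # requires Python 3.10+

# Hypothetical daily average 311 call wait times (seconds) over two weeks.
days = list(range(14))
wait_seconds = [41, 43, 42, 45, 47, 46, 49, 51, 50, 53, 55, 54, 57, 59]

slope, intercept = linear_regression(days, wait_seconds)
threshold = 75  # illustrative service-level target: keep average waits under 75 seconds

if slope > 0:
    # Project how many days until the fitted trend crosses the threshold.
    current_fit = slope * days[-1] + intercept
    days_until_breach = (threshold - current_fit) / slope
    print(f"Wait times rising about {slope:.1f} s/day; "
          f"threshold of {threshold}s projected in roughly {days_until_breach:.0f} days.")
else:
    print("No upward trend detected.")
```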

The explosion of data available to local governments, along with increased pressure from the public for “open data” to assess government performance and program results, has already changed practice in dramatic ways. Going forward, local government employees will be much more likely to analyze rather than process data. Technological advances and software tools will continue to make data analysis easier. Better analytic tools, along with timely data, will help local governments re-engineer business processes and procedures, leading to improved service delivery, enhanced customer service, and greater transparency and accountability.

[1]   D. Osborne and T. Gaebler, Reinventing Government: How the Entrepreneurial Spirit Is Transforming the Public Sector (Menlo Park, CA: Addison-Wesley, 1992).

[2]   C. Fleming, “Technology, Data and Institutional Change in Local Government,” in Strengthening Communities with Neighborhood Data, edited by G. Thomas Kingsley, Claudia J. Coulton, and Kathryn L. S. Pettit (Washington, DC: Urban Institute Press, 2014).

[3]   C. Fleming, “Minneapolis 311 System,” Call 311: Connecting Citizens to Local Government Case Study Series (Washington, DC: International City/County Management Association, 2008).