San Francisco and Administrative Connectivity

A tool supported by the San Francisco Controller’s Office provides city residents a “barometer” for evaluating government performance.

by Will Cook
July 26, 2013


This story was originally published by Data-Smart City Solutions. It is the fourth in a series on performance measurement. Read the first, second, and third articles.

San Francisco Performs, a tool supported by the San Francisco Controller’s Office, provides city residents a “barometer” for evaluating government performance. Similar to systems like Boston About Results, the barometer is a collection of performance data framed around functional areas such as “Public Safety” and “Streets and Public Works.”

Boston and cities with similar programs publish hundreds of metrics, aggregating results at the strategic and functional levels. San Francisco has chosen to provide a smaller number of indicators in one focused document. Numbering under 50, these data points form an easily digestible view of performance at the city level. This narrower scope, paired with less diffuse reporting, demonstrates a slightly different solution to the question of how best to balance comprehensiveness and interpretability. San Francisco’s barometer loses some detail compared to larger systems, but it allows the city to release one five-page report covering numerous city functions.

San Francisco Performs does, however, still come in a few different flavors. Eight of the nine areas reported by the city exist within a quarterly PDF report published on the city’s website and through a recently released online portal. The ninth area, the city’s economic barometer, exists only as a stand-alone portal. San Francisco has used free open-source tools to develop both portals, making the system a lean, low-cost operation. However, users must still consult both of these tools to gain a complete picture of city performance. The most interesting part of San Francisco’s performance reporting effort may be this administrative heterogeneity.

The Office of the Controller further dedicates a page of its site to “related efforts,” areas beyond the two barometers where San Francisco is engaging in dialogue with the public on performance. These include a biannual city survey, reporting on 311 data, street and sidewalk maintenance reporting, park maintenance reporting, and a now-defunct ‘Stat’ program. San Francisco’s innovation office also carries out related activities. Data.sf.org offers mostly development and innovation enablement tools, but it also provides reporting in areas like city crime, offering a customizable mapping tool to show where incidents have taken place over a given period of time.

San Francisco certainly doesn’t suffer from a dearth of innovation or reporting. And the barometers offer clean overviews of city performance in multiple areas. However, it seems more difficult to pull all of this existing reporting together into one larger integrated view. The advice for other cities from Kyle Burns, San Francisco’s Performance Measurement Program Lead, is therefore unsurprising: follow an “integrated approach” from the beginning. Burns emphasizes that if systems are allowed to grow organically in parallel, the difficulty of later pulling them together is exponentially greater.

This administrative separation among reporting areas is evident in almost all cities reporting performance to the public. Scorecards based on traditional PerformanceStat programs typically exist as part of a Controller’s office or administrative department. At the same time, a whole new crop of citizen-focused tools is being developed by Chief Innovation Officers, civic innovation departments, and multi-city collaboratives such as the New Urban Mechanics franchise in Boston and Philadelphia. These new departments and organizational structures are producing valuable tools and resources, but they may also be serving to further complicate the administration of citizen-facing performance reporting.

Creating an integrated program from the outset may be ideal, but cities with already disaggregated reporting still face the challenge of how best to integrate. A number of things can be done to accelerate this process. First, the largest challenge in overcoming administrative siloing is naturally cultural, and a city’s response should be planned accordingly. Rather than giving reporting to one administrative unit and asking others to cede control, cross-functional teams and working groups can help engage all parties in advancing a single multi-party solution. The project should also clearly emphasize citizen experience and usability over internal city organizational structures and responsibilities. This will require convincing agencies that they win when their message gets to the public most effectively, not when they own the process for delivering the message. Finally, patience may be the most valuable asset. As is the case with any large system implementation, a hastily executed solution usually spells disaster. Providing multiple disaggregated views of city performance is better than providing one inaccurate view or no view at all.

The next piece in this series will examine Colorado Springs as an example of how external parties and data can be used to bolster citizen-facing performance reporting processes.