December 2006

Editor's Note: Included in this issue are two articles dealing with performance measurement that are presented in "point/counterpoint" perspectives. The counterpoint version is on p. 30.

Somewhere out there is a Public Works Director who has free time, but I have yet to meet this person. So, when the nice guy from the Budget Office stops by to say we're going to do "performance measurement," I can hardly contain my excitement. Oh good, I think, I really need something else to do.

I'm sure many of you have been in this situation, where the responsibilities of performance measurement are added to an already full plate. Most of us know where our performance falls short, and usually it is because we don't have enough resources to do better. So, the idea of spending scarce resources not only to tell us what we already know, but to make the situation worse by spending less time on the problem, makes me skeptical.

In concept, performance measurement is a tool to evaluate the effectiveness of the allocation of your resources against goals, expectations and benchmarks. More importantly, performance measurement should lead to better performance—right?

As a former Public Works Director and current Deputy City Manager, I have been struggling with this question, especially since we have initiated a strong focus on performance measurement in Palm Bay. My observation over 20 years of government service is that performance measurement is a great sound bite, but in reality it can easily become an administrative behemoth that consumes resources to produce complicated data and analysis that nobody reads. I have asked myself the question: If the person compiling and analyzing data in the Public Works Department were filling potholes instead, would that be a better investment of public resources? The key is finding the proper balance between measuring performance and actually performing.

So, how do you start? First, your organization must determine where it is going. It is impossible to measure performance against no direction. Is filling potholes more important than storm drain maintenance? What is the expected response time to fill potholes: 1 day, 10 days, 30 days, 6 months? Depending upon a community's resources and priorities, any of these response times might be acceptable. If your average response time is 10 days, is that good or bad?

Any useful performance measurement system must fit in with strategic planning efforts. If you can identify what your community wants (or doesn't want) and work towards that vision, performance measurement simply becomes a means by which to evaluate progress. If you're measuring only because somebody elsewhere in the organizational chart says you have to measure, then I would speculate that you're not getting much value out of measuring and it probably is hindering rather than helping your performance. If this is your situation—where measurement is an edict from above, but is not giving you much value—then you have to take the responsibility for evolving your department towards meaningful measurement.

Since we're all in a technical field, measuring should be easy, right? Well, you're more likely to discover that measuring for value is really more of an art. I'll give you a few examples to illustrate.

The hypothetical fleet manager believes that preventive maintenance is good for the fleet—no argument there. The manager establishes that every light vehicle should have its oil changed within a range of every 3,500 to 4,500 miles. The manager starts tracking this and discovers that most police cars are getting their oil changed every 5,000 miles, which is a clear violation of the policy. So, in order to achieve compliance, the manager shuts off gasoline service to anyone whose vehicle has gone more than 4,500 miles without an oil change. This works great for fleet, as the delinquent vehicle is now sitting at the gas pump, so it can be serviced the next day; however, it has also resulted in police officers being unable to patrol. So, in the context of fleet, this manager has improved performance. However, in the context of service to citizens, this might not be the best approach.

What about overtime? Is increasing overtime in your department an indicator of providing after-hours services to our citizens, a reflection of a managerial decision not to add more full-time employees, or simply an indicator of mismanaged time? Look at your budget metrics. Many communities measure something like percent of budget spent. But what does that really mean? If you only spend 80% of your budget are you saving taxpayers' money or not getting the job done?

In another example, a manager might find that if the department reduces the number of inspectors, then the number of failed inspections goes down. Following this logic, you could eliminate failed inspections completely if you also eliminated all of your inspectors!

Trend analysis presents similar challenges. In our community we study the number of drainage complaints. This number is influenced by many factors within our control (resources dedicated to addressing complaints, determining how to count the same person making six complaints, how to count complaints that are next door to one another) and many factors outside of our control (weather, equipment malfunction, extent of other priorities, whether or not the phone system is working, number of new homes being built in areas with poor drainage). If you give this data to 10 different Ph.D. statisticians, you will probably get 10 different observations about our drainage complaints. But if you ask our field superintendent, he/she can probably tell you where we have problems, when we have problems and why we have problems. He/she can also probably tell you what you need to do to fix the problems. So, what does an agency really gain from shifting resources from working to measuring? If it makes more sense to talk to your field people, start there for your data and analysis.

There is also a serious risk of performance measurement serving as a disincentive towards good performance. If a standard is established, efforts are focused on meeting or exceeding that standard (especially if there are negative consequences for not meeting the standard or accolades for meeting the standard). However, the more resources are placed on measured activities, the fewer resources are placed on unmeasured activities. Conceivably, as measured activity improves, non-measured activity (which is probably still important) declines. An unbalanced system can easily result. In addition, the consequences of not meeting a standard send a message too. If the consequences are too dire, resources are overallocated; if they are insignificant, an attitude of "nobody cares anyway" results. So the thoughtful management of any performance measurement system is critical to achieving the desired results.

One thing I've never seen (perhaps it exists somewhere, but I haven't found it) is an agency that measures the performance of its performance measurement investment. If you add up the man-hours spent gathering data, analyzing data, reporting data, meeting about data and otherwise measuring sometimes less than meaningful metrics, is this really the best way to spend our time? In some cases tolerating some inefficiencies might be less costly than investing huge resources in finding them. A good manager needs to be able to make these judgments, as performance measurement is just another choice where a manager can allocate resources.

The key to effective performance measurement is knowing the target and selecting reasonable, easy-to-measure data that will clearly show progress or lack of progress. The typical department is spread so thin that the modus operandi requires swift movement from one crisis to another. But having clear, straightforward focus in performance measurement—with some level of consensus between management and staff—can provide a much-needed impetus to dedicate resources toward the improvement of specific areas of service. Our system has evolved to where we are looking at our performance deficits (simple statements of what needs to be done better), devising strategies to address the deficits and selecting measures that will evaluate the effectiveness of the strategies. This is a much more effective approach than random measurement of random data because "management" has decreed "Thou Shalt Measure Performance." By targeting our performance measurement, we invest our resources in getting to results, not in measurement alone.

Our system also includes a weekly three-hour meeting with all of the department heads (and some of their support staff) to discuss select performance deficits, strategies and trends in each of the departments. This is a huge investment of resources, but the unexpected (at least for me) consequence is that our departments are now working together to solve our collective deficits. Before we invested this time, departments could and would punt their issues to other departments ("I'd do my job, but the widgets I need are stuck in purchasing."). This is no longer a viable option, as the receiving team and the punting team are on the field together.

So, while I'm still pretty skeptical of many aspects of performance measurement, I have been very encouraged by the interdepartmental synergy that has actually improved our performance. I don't believe this has anything to do with the charts, graphs, maps, and reports—it has to do with communication, cooperation and shared accountability. The Public Works Department and the Human Resources Department are both held accountable in front of their peers for making sure the Public Works Department is adequately staffed. These departments face the City Manager together. In addition, as issues get discussed, staff from any department can chime in with ideas—resulting in a bigger brain trust to solve our community's problems.

So, please don't tell me that a crew of 2.432 persons can fill 1.2783 potholes per hour, which is up from last year when we could only do 1.2781. I don't need a multi-year analysis that proves the number of drainage complaints increases after it rains. I don't need a chart showing that our dump trucks are immortal because we use them even after they are dead. What I do want to see are managers who measure the things they need to measure to improve their performance. Sounds simple? It is simple. Play the role of the taxpayer and do a qualitative evaluation of the value of what you're measuring. If Joe Taxpayer would understand it and see the benefit, then it is probably a worthwhile measurement.

If you look across the page at the performance measurement article written by Lee Feldman, you'll see the perspective of our City Manager. He graciously allows me to vent my frustrations with our performance measurement system and has encouraged me to be patient when looking for measurable results from measuring performance. Unfortunately, patience is one of my personal performance deficits, but that's the subject of a different article....

Susan M. Hann is a member of APWA's Leadership and Management Committee and chaired the committee for two years. She can be reached at (321) 952-3411 or hanns@palmbayflorida.org.

"Measuring too much is just as bad as not measuring anything." - Peter Drucker

"Never confuse motion with action." - Ernest Hemingway

"Action to be effective must be directed to clearly conceived ends." - Jawaharlal Nehru

If you want a good place to learn more about performance measurement, Fairfax County, Virginia, has some excellent resources on its website at http://www.fairfaxcounty.gov/dmb/Manages_For_Results.pdf. Their "Guide to Advanced Performance Measurement" provides a very comprehensive manual on how to get the most out of performance measurement.