At school my least favourite subject was maths. I could never quite understand the point of learning how to calculate the volume of a traffic cone or surface area of a football.

Fast forward a few years to my early days in the police, and apart from the heavy responsibility of managing the tea fund, arithmetic was not a major feature of daily activity. The main exception was a thing called ‘performance measurement’, which I had heard had something to do with graphs and charts.

As a Police Constable I was more interested in catching criminals than in distractions such as performance measurement, as I felt it had little to do with my day-to-day work. My level of interest in such things was around the ‘zero’ mark (or slightly lower). During my time as a Sergeant I had to confront my statistics demons occasionally, but found the whole performance measurement scene about as exciting as waiting to see the dentist. As an Inspector there is no escape.

The silver lining to this inescapable cloud is that I have forced myself to look long and hard at performance management, and this has helped me understand why I haven’t always felt comfortable with the way I have seen it implemented. I have seen (and been the subject of) individual performance targets such as ‘make three arrests per month’, and have also witnessed sudden and apparently inexplicable shifts in centrally-determined priorities. None of this made any sense to me beforehand, but now I understand why it is so wrong. I have also developed a genuine interest in understanding performance management and an appreciation of how effective systems operate.

Before I go off down that route, I want to reassure that I haven’t morphed into some sort of über-geek. Anyone who works with me will tell you that even now I am out there in the thick of things with the troops on a Friday night, and even occasionally catch criminals (I haven’t lost the knack of being in the wrong place at the wrong time). I am often first to volunteer for PSU duties such as football matches and major events, and I’m told I am a practical, common-sense sort of gaffer.

So where is this going and what has it got to do with those hours of boredom in double maths, Class 4 c, circa 1990?

Well I need you to stay with me guys, because I’m going to introduce you to the high-octane world of Statistical Process Control (SPC) charts, also known as Process Behaviour Charts (PBCs). I appreciate there’s only so much excitement you can handle, and I’m also aware that there’s some serious competition for stimulating internet content out there, but this might just explain something really important.

I found out about SPC charts a couple of years ago, and it dawned on me that the reason talk of performance measurement used to induce symptoms of pathological boredom / indifference is that so many police managers do it wrong.

How many times have you been shown a graph and been told that we have a burglary problem, or an ASB problem, or a vehicle crime problem? Pretty common huh? What about a week later, or a month later, or two months later? A pint says it’s something else that’s now the problem. Have any of you witnessed the panic caused when someone realises that crime is higher this month than it was last month? Or that detections are lower than at this time last year? It’s a shame really, because this is needless worry, as this type of comparison has absolutely no scientific basis whatsoever, plus the figures are likely to stabilise naturally.

This phenomenon is known as the ‘pinball effect’ or ‘knee-jerking’, and has been experienced in police forces up and down the country. It results in resources and effort being pinged around from one priority to another, in response to these apparent ‘problems’ that have been identified through performance data, charts and analytical endeavour. The real problem is that some managers simply do not understand how to interpret performance data or SPC charts, and are oblivious to the damage their actions cause. Let me explain:

This is an SPC chart (Chart 1). It probably looks familiar. The red lines at the top and bottom are the upper and lower control limits, and the blue line is the mid-point between these. There is a formula for calculating how these limits are set, but I won’t bore you with that here. All you need to know is that as long as the statistical activity is occurring within the upper and lower control limits, the system is stable.

Data varies naturally, so this week’s figure might be slightly lower than last week, or a bit higher than the week before that. This is normal. Unless a data point falls outside one of the limits (or in certain circumstances is part of a clear trend of several consecutive points heading towards one of them), then relax. This is normal statistical variation.
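For the curious (this bit isn’t in the original lesson): on an individuals chart, the kind Wheeler’s book deals with, the limits are commonly set at the mean plus or minus 2.66 times the average moving range. A quick Python sketch with made-up weekly figures:

```python
# Illustrative weekly crime counts (made-up data)
data = [52, 48, 55, 50, 47, 53, 49, 51, 46, 54]

# Mean of the data: the centre line on the chart
mean = sum(data) / len(data)

# Average moving range: mean absolute difference between consecutive weeks
moving_ranges = [abs(b - a) for a, b in zip(data, data[1:])]
avg_mr = sum(moving_ranges) / len(moving_ranges)

# Wheeler's constant 2.66 converts the average moving range
# into the natural process limits of an individuals (XmR) chart
upper_limit = mean + 2.66 * avg_mr
lower_limit = mean - 2.66 * avg_mr

print(f"centre line: {mean:.1f}")   # 50.5
print(f"upper limit: {upper_limit:.1f}")  # 63.5
print(f"lower limit: {lower_limit:.1f}")  # 37.5
```

Any week falling between those two limits is just the system doing what it does; nothing to react to.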

Taking this quick lesson into account, the only data point that needs looking at more deeply on Chart 1 is the one at Week 3, as it falls below the lower control limit. A data point that falls outside of either of the control limits is known as a ‘signal’. If, for example, Chart 1 plotted crime rates, a police manager would want to know what happened at Week 3. It may be that there was a policing operation in that area that reduced crime significantly, or perhaps it was -15 degrees with heavy snow.

If the chart was plotting detections, then again, there would probably be a reason for the drop at Week 3. Perhaps the force’s crime inputting system was down, or some staff in the detections inputting department were absent. Whatever caused the drop, it was righted within a week and there was no long-term effect, so it’s not worth worrying about too much. The important thing is to recognise that there was a definite signal at Week 3, and by asking the right questions about what might have caused it, we may discover something about the system that could be improved.

Clearly then, the other data points don’t need worrying about, or reacting to. If, however, there had also been similar signals at Weeks 4, 5 and 6, then that indicates a definite systemic problem that requires attention. Other methods of comparing data are unlikely to be able to differentiate between normal statistical variation and signals, which is why SPC charts are such a useful tool: they provide the evidence that acts as the basis to look deeper where necessary.
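To make the idea of a ‘signal’ concrete, here is a small sketch (mine, not part of the original post) that flags any week falling outside the natural process limits of an individuals (XmR) chart, taking the limits as the mean plus or minus 2.66 times the average moving range:

```python
def find_signals(data):
    """Return the week numbers (1-based) that fall outside the
    natural process limits of an individuals (XmR) chart."""
    mean = sum(data) / len(data)
    # Average moving range: mean absolute difference between consecutive weeks
    mrs = [abs(b - a) for a, b in zip(data, data[1:])]
    avg_mr = sum(mrs) / len(mrs)
    ucl = mean + 2.66 * avg_mr  # upper control limit
    lcl = mean - 2.66 * avg_mr  # lower control limit
    return [week for week, x in enumerate(data, start=1) if x > ucl or x < lcl]

# Made-up weekly figures: stable apart from a sharp drop at Week 3
weekly_figures = [50, 51, 4, 50, 49, 50, 51, 49, 50, 51, 50, 49]
print(find_signals(weekly_figures))  # → [3]
```

Only Week 3 is flagged; every other point is normal variation and needs no reaction. A run of flagged weeks (say Weeks 4, 5 and 6 as well) would point to a systemic problem worth investigating.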

Now for the next part of the lesson:

Chart 2

Let’s say the above chart relates to robbery rates. The statistical activity is definitely under control, and by looking at a series of data points rather than just comparing figures with ‘what happened last week’ or ‘compared to this time last year’, we can see that there has been a steady decline in the robbery rate. Unfortunately (and no word of a lie) I have been present when, following consistent long-term reductions, an isolated and marginal increase (as at Week 10) has resulted in panic that ‘robberies are going up!’

Of course one robbery is one too many, but the way to tackle robberies is through intelligence-led policing and robust offender management, not by knee-jerking to negligible variation on a controlled SPC chart. Furthermore, if additional analysis revealed that the robberies at Week 10 occurred in different locations, at different days and times, and with no similarity whatsoever, then it is foolish to react or redeploy resources purely because there were a couple more offences than the previous week.

This is because the data point at Week 10 is neither a signal, nor part of a series. By maintaining whatever policing activity appears to have been reducing robberies recently it is entirely likely that the robbery rate would continue to decline without additional intervention.

Naturally this also works in reverse – in the event that there were fewer robberies this week compared to last week (or this time last year), it would be ridiculous to rest on our organisational laurels on this basis. The reality may be that there has been a long-term increase in robberies, with only a recent apparent respite. It is therefore wrong to relax based on the recent ‘respite’, especially as it is likely to be normal statistical variation rather than a genuine respite, much less a signal. The advantage of SPC charts is that they provide police managers with the opportunity to hold their nerve and see which way things develop, rather than jumping in unnecessarily.

Right, enough of the maths lesson, and back to practical application. If a policing area bases its resourcing, deployment and tasking decisions on a non-event such as the one at Week 10 above, it absolutely guarantees inefficiency and wasted effort, as the reaction is not based on any science or logic whatsoever. Worse still, it is entirely probable that the area Commander, Sector Inspector or Beat Sergeant will be held to account, despite this being a non-signal event.

This will result in the inevitable demand for ‘plans’ to be produced detailing what action will be taken to deal with the perceived robbery problem. The plans and any ‘results’ will subsequently be reported back centrally. This takes time and effort and is of questionable value. Resources will be diverted to something that is not a problem (in systems terms), which of course means that officers have to stop doing something they were doing elsewhere. This other activity may well have been necessary to manage genuine concerns in other areas. Of course, this increases the risk that the other concerns become a genuine problem as they are no longer controlled, and this cycle repeats itself. (Introduce arbitrary numerical targets into this mix and it gets really messy, but I won’t go there for the purposes of this blog – read my targets article if this side of things interests you).

Performance management systems are an important tool for providing information that helps managers understand the system. They enable us to identify anomalies, ask questions, and seek ways of improving the system, thereby aiming to provide a better service to the public. The first step to effective performance management is to do it properly. This means intelligently interpreting what the data mean and never relying on comparisons with just one other data point.

If the information contained within SPC charts is understood and applied intelligently it can be a powerful antidote to the wasteful knee-jerk response of reacting to isolated data and normal statistical variation. This will prevent wasteful deployments and the creation of baseless priorities, and ensure that the right things are prioritised instead. It should also stop the bizarre situation where so many conflicting ‘priorities’ are mandated that everything becomes a priority (or, to put it another way, nothing does), and it is impossible to see the wood for the trees.

Of course, SPC charts are not the answer to everything. Effective performance management relies on considering a range of information, backed up by sound operational understanding and a holistic systemic approach. Nevertheless, SPC charts have an important place in a proportionate and meaningful performance management system: when the information they contain is interpreted properly, it provides a solid basis for understanding the system and preventing the horrible consequences of knee-jerking.

I have never found myself in a situation where I needed to work out the volume of that traffic cone, but can definitely say that maths came in useful in later life in the form of SPC charts.

So to summarise, today’s lessons:

Performance measurement, and SPC charts in particular, are useful tools if applied properly.

Insp, Can you reproduce some longer term examples of these charts? It would be good to see some (anonymous) charts over a year. For a non-force type like me, these SPC charts are new.

Can you explain the lower limits? My guess is the public will want to see the burglary (or any other issue) minimum to be zero. I will read the post again to rethink. If you can send a link to public SPC charts, great. Mike

Hi Mike,
The lower limit is determined mathematically as opposed to an aspirational target – I’m sure we would all like to see zero crime if this was possible. I highly recommend the book “Wheeler, D.J. (2000) Understanding Variation: The Key to Managing Chaos. (2nd Ed.) Knoxville: SPC Press”, which explains how the limits are generated etc. Alternatively put ‘SPC charts’ into a search engine. As for public links to charts, there’s plenty out there – it’s not just something used by the police. Hope this helps.
Simon

Really glad to see a clear understanding and explanation of management information. It’s an area I’ve worked in and you are right – too many managers get overly excited or worried about little data blips.

Another great post. I find your take on performance management really helpful (and validating of my own views). Do you fancy being my ‘performance management mentor’? I don’t work in the force, but your insights translate really well into the world of local govt performance management too.

Another interesting and informative post Simon. Thank you. I sort of feel this is where we would have been had the Govt and its performance culture not got in the way. There is a strong argument to leave us alone to get on with what we know.

I like John. His passion can make him a bit messianic at times, but he has helped to provide a sound theoretical framework for those of us who just know that there is a huge disconnect between the utopian fantasy that the High Heid Yins spin and sharp reality, and who need ways to challenge that gap. His “All targets create sub-optimal performance” is stark but true. His mantra that most of performance management is merely teaching people to do the wrong things better is equally sound. I also love his unrelenting attack on ‘toolheads’.

I’ve not come across Wheeler, but two you might be interested in – simoncaulkin.com who is wonderfully withering about much of public sector management, especially the false idol of ‘managing costs’ – in his view doomed only to increase costs – when we should be managing value in order to reduce costs; and Gerd Gigerenzer whose writings on statistics and risk are both accessible and informative. I’ve found both v useful in trying to link real cause to true effect.

Thanks for the reading recommendations. I’ve read some of Simon Caulkin’s articles – ‘withering’ is a good way to describe his perfectly-aimed observations, but I’ve not heard of Gigerenzer before. I’ll definitely look up his work. Barry Loveday is another good writer when it comes to the perverse effects of numerical targets.