It is tough to feel cheerful these days about the workings of American government; with the soaring debt and political gridlock, the news from the public sector has been pretty dire. But if you want to see a tiny glimmer of cheer, take a look at a saga that has been unfolding in New York over the issue of “yellow grease”. Yes, you read that right. While politicians in Washington squabble about grandiose fiscal plans, an experiment is quietly under way in the Big Apple about how to handle the hazards posed by the waste from restaurants’ fryers (after they have cooked chips, burgers, prawn rolls and so on).

And though this so-called yellow grease problem is not something that Financial Times readers probably wish to ponder – least of all over breakfast – the tale is oddly inspirational. For it shows how cash-strapped governments can make resources go further in these challenging times, even with gritty, real-world problems.

The issue at stake revolves around how governments use data. In recent years, New York (like most public entities) has been collecting reams of information about its citizens. However, these data were split between 60 different departments, and nobody tried to co-ordinate them.

Restaurants were a case in point. City laws stipulate that New York’s 24,000 restaurants are supposed to dispose of yellow grease via licensed waste disposal companies. But many restaurants don’t want to pay disposal costs and fear dealing with organised crime (watch The Sopranos to understand why). So instead they simply tip their grease down a manhole, usually in the dead of night.

Unsurprisingly, this creates foul problems, such as blocked drains, polluted water or even fires. But in the past the city found it almost impossible to catch the grease dumpers, since there are thousands of manholes and restaurants – and the city can only afford a few inspectors. And though the city’s environmental department has dutifully tracked grease dumps for years, nobody ever tried to match this data with information on waste licences, restaurant inspections, kitchen fires – or even manholes.

But this is where computing power – or big data – comes in. Last year, two senior officials at City Hall, John Feinblatt and Michael Flowers, gathered half a dozen young data scientists together and asked them to comb through the city’s 60 different data sources. By mining the data in a co-ordinated way, they hoped to spot correlations and patterns that could predict how people might act.

Combining these databases was nightmarishly hard; in the matter of yellow grease, for example, information on pollution, manholes, restaurants’ licences and waste companies was held in different (incompatible) files. But when the geeks finally created a single, correlated “heat map”, the results were striking: suddenly they could spot which restaurants were likely to be dumping grease, and the inspectors’ success in catching offenders soared. So much so that the city is now targeting the dumpers for a recycling drive. “Instead of going into the restaurants and saying pay us a $25,000 fine, we are giving them a list of registered biodiesel companies and saying stop pouring your money down the drain, knock it off,” explains Flowers. “It’s a win-win!”
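The matching exercise can be sketched in miniature. The toy below is a hypothetical illustration, not the city’s actual system: the manhole IDs, field names and scoring rule are all invented. It simply joins grease-dump sightings to a restaurant permit file by nearest manhole, and ranks restaurants as more suspicious when dumps cluster near them and no licensed waste carter is on file.

```python
from collections import Counter

# Hypothetical sample records; the city's real feeds (sewer logs,
# waste-carter licences, restaurant permits) are not public in this form.
dump_reports = [  # grease-dump sightings, one per incident
    {"manhole": "MH-101"}, {"manhole": "MH-101"}, {"manhole": "MH-205"},
]
restaurants = [  # permit file, keyed to the nearest manhole
    {"name": "A", "nearest_manhole": "MH-101", "has_waste_carter": False},
    {"name": "B", "nearest_manhole": "MH-101", "has_waste_carter": True},
    {"name": "C", "nearest_manhole": "MH-205", "has_waste_carter": False},
    {"name": "D", "nearest_manhole": "MH-330", "has_waste_carter": False},
]

def rank_suspects(dump_reports, restaurants):
    """Score each restaurant: dumps seen near it, weighted up when it
    has no licensed waste carter (its grease must be going somewhere)."""
    dumps_per_manhole = Counter(r["manhole"] for r in dump_reports)
    scores = {}
    for r in restaurants:
        score = dumps_per_manhole.get(r["nearest_manhole"], 0)
        if not r["has_waste_carter"]:
            score *= 2  # no legal disposal route on file: more suspicious
        scores[r["name"]] = score
    # most suspicious first
    return sorted(scores, key=scores.get, reverse=True)

print(rank_suspects(dump_reports, restaurants))
```

The point of the sketch is the join itself: once the incompatible files share a common key (here, the manhole), a few lines of code can turn scattered records into a ranked inspection list.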

Can this be copied elsewhere?

New York is trying: the Feinblatt and Flowers team is using similar techniques to catch cigarette smugglers, prevent illegal prescriptions of oxycodone, slash ambulance response times and improve fire inspections. Inspectors used to find illegally subdivided buildings on only 13 per cent of their inspections, for example; but with data-crunching, the hit rate is now 70 to 80 per cent.

“This is all about finding ways to target resources much more effectively, without extra cost,” says Feinblatt.

Other cities are experimenting too. Chicago, for example, is now using predictive data to overhaul its police department. “Using this process, governments [can] reduce costs, improve outcomes and enhance government satisfaction,” says Stephen Goldsmith, the former mayor of Indianapolis and Daniel Paul Professor of the Practice of Government at the Harvard Kennedy School.

But there is a big catch. To make predictive data work, governments need to employ creative data scientists. More important still, geeks must be empowered to break down departmental silos. And while that might sound like common sense, in practice it is extremely hard, since many bureaucracies are resistant to this collaboration (particularly when jobs are being cut). Indeed, the geeks in New York only managed to do their experiments – in the face of initial opposition – because Michael Bloomberg, the mayor, has explicitly made “silo busting” a cornerstone of his tenure. And, sadly, Bloomberg is an outlier in that respect.

The story of yellow grease, in other words, shows what technology can sometimes achieve; but it also underlines why the “human factor” often makes it hard to reap the real benefits. Either way, it is something to ponder when you eat in a New York restaurant; or stumble on a manhole that (hopefully) is now free from that offending grease.