Planning Alerts: first fruits

[I won’t go into the details, but suffice it to say our internal deadline got squeezed by the combination of a fast-growing website, the usual issues of large datasets, and that tricky business of finding and managing coders who can program in Ruby, get data, and scrape tricky websites really well.]

But I’m pleased to say we’re now well on our way not just to resurrecting PlanningAlerts in a sustainable, scalable way, but to a whole lot more too.

Where we’re heading: an open database of UK planning applications

First, let’s talk about the end goal. From the beginning, while we wanted to get PlanningAlerts working again – the simplicity of being able to put in your postcode and email address and get alerts about nearby planning applications is both useful and compelling – we also knew that if the service was going to be sustainable, and serve the needs of the wider community, we’d need to do a whole lot more.

Particularly with the significant changes in planning laws and regulations being brought in over the next few years, it’s important that everybody – individuals, community groups, NGOs, other websites, even councils – has good and open access not just to the planning applications in their area, but to those in the surrounding areas too.

In short, we wanted to create the UK’s first open database of planning applications, free for reuse by all.

That meant not just finding out when there was a planning application, and where (though that’s really useful), but also capturing all the other data, and keeping that information updated as the planning application went through its various stages (the original PlanningAlerts just scraped the information once, when it was found on the website, and even then pretty much only got the address and the description).

Of course, were local authorities to publish the information as open data, for example through an API, this would be easy. As it is, with a couple of exceptions, it means an awful lot of scraping, and some pretty clever scraping too, not to mention upgrading the servers and making OpenlyLocal more scalable.

Where we’ve got to

Still, we’ve pretty much overcome these issues, and now have hundreds of scrapers working, pulling information into OpenlyLocal from well over a hundred councils – and well over half a million planning applications in the database.

There are still some things to be sorted out – some of the council websites seem to shut down for a few hours overnight, meaning they appear to be broken when we visit them, others change URLs without redirecting to the new ones, and still others are just, well, flaky. But we’ve now got to a stage where we can start opening up the data we have, for people to play around with, find issues with, and start to use.

For a start, each planning application has its own permanent URL, and the information is also available as JSON or XML.
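As a rough illustration, a consumer might fetch a planning application’s JSON representation and pick out the fields it needs. The field names below are assumptions for the sake of the sketch, not OpenlyLocal’s documented schema, and the sample is parsed from an inline string rather than fetched over the network:

```ruby
require 'json'

# Hypothetical JSON for a single planning application; the exact field
# names ("uid", "address", etc.) are assumptions, not the documented API.
sample = <<~JSON
  {
    "planning_application": {
      "uid": "11/00123/FUL",
      "address": "1 High Street, Lichfield",
      "description": "Two storey rear extension",
      "status": "Pending",
      "lat": 52.681,
      "lng": -1.827
    }
  }
JSON

app = JSON.parse(sample)["planning_application"]
puts format("%s: %s (%s)", app["uid"], app["description"], app["address"])
```

In a real consumer, the same parsing would follow a `Net::HTTP.get` against the application’s permanent URL with a `.json` extension (or an XML fetch, if that’s preferred).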

There’s also a page for each council, showing the latest planning applications, and the information here is available via the API too.

There’s also a GeoRSS feed for each council, allowing you to keep up to date with the latest planning applications in your area. It also means you can easily create maps or widgets showing a council’s latest applications.
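A GeoRSS feed is just RSS with a geographic extension, so consuming it needs only an XML parser. This is a sketch against a made-up sample feed – the element layout follows the common `georss:point` convention, but the actual feed may differ in detail:

```ruby
require 'rexml/document'

# Hypothetical GeoRSS sample; a real consumer would fetch the council's
# feed URL instead of using this inline string.
sample_feed = <<~XML
  <?xml version="1.0"?>
  <rss version="2.0" xmlns:georss="http://www.georss.org/georss">
    <channel>
      <title>Planning applications</title>
      <item>
        <title>Two storey rear extension</title>
        <link>http://openlylocal.com/planning_applications/123</link>
        <georss:point>52.681 -1.827</georss:point>
      </item>
    </channel>
  </rss>
XML

doc = REXML::Document.new(sample_feed)
items = doc.get_elements("//item").map do |item|
  # georss:point holds "lat lng" separated by a space
  lat, lng = item.elements["georss:point"].text.split.map(&:to_f)
  { title: item.elements["title"].text,
    link:  item.elements["link"].text,
    lat:   lat, lng: lng }
end

items.each { |i| puts "#{i[:title]} @ #{i[:lat]},#{i[:lng]}" }
```

Each item’s `link` points at the application’s permanent URL, which is how a widget or map pin can link back to the full record.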

Finally, Andrew Speakman, who’d coincidentally been doing some great stuff in this area, has joined the team as Planning editor, to help coordinate efforts and liaise with the community (more on this below).

What’s next

The next main task is to reinstate the original PlanningAlert functionality. That’s our focus now, and we’re about halfway there (and aiming to get the first alerts going out in the next 2-3 weeks).

We’ve also got several more councils and planning application systems to add, and this should bring the number of councils we’ve got on the system to between 150 and 200. This will be an ongoing process, over the next couple of months. There’ll also be some much-overdue design work on OpenlyLocal so that the increased amount of information on there is presented to the user in a more intuitive way – please feel free to contact us if you’re a UX person/designer and want to help out.

We also need to improve the database backend. We’ve been using MySQL exclusively since the start, but MySQL isn’t great at spatial (i.e. geographic) searches, restricting the sort of functionality we can offer. We expect to sort this in a month or so, probably moving to PostGIS, and after that we can start to add more features, finer grained searches, and start to look at making the whole thing sustainable by offering premium services.
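To make the motivation concrete, this is the kind of radius search PostGIS makes straightforward and MySQL’s spatial support (at the time) did not. The table and column names here are assumptions for illustration, not OpenlyLocal’s actual schema; `ST_DWithin`, `ST_MakePoint` and `ST_Distance` are standard PostGIS functions:

```ruby
# Hypothetical schema: a planning_applications table with a "location"
# geometry column in WGS84 (SRID 4326). Finds applications within 800
# metres of a lng/lat point, nearest first.
radius_query = <<~SQL
  SELECT uid, address, description
  FROM planning_applications
  WHERE ST_DWithin(
          location::geography,
          ST_SetSRID(ST_MakePoint(-1.827, 52.681), 4326)::geography,
          800)
  ORDER BY ST_Distance(
          location::geography,
          ST_SetSRID(ST_MakePoint(-1.827, 52.681), 4326)::geography);
SQL

puts radius_query
```

Casting to `geography` makes the distance arguments metres rather than degrees, which is exactly the “applications near this postcode” query the alerts feature needs.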

We’ll be working too on liaising with councils who want to offer their applications via an API – as the ever pioneering Lichfield council already does – or a nightly data dump. This not only does the right thing in opening up data for all to use, but also means we don’t have to scrape their websites. Lichfield, for example, uses the Idox system, and the web interface for this (which is what you see when you look at a planning application on Lichfield’s website) spreads the application details over 8 different web pages, but the API makes this available on a single URL, reducing the work the server has to do.

Finally, we’re going to be announcing a bounty scheme for the scraper/developer community to write scrapers for those areas that don’t use one of the standard systems. Andrew will be coordinating this, and will be blogging about this sometime in the next week or so (and you can contact him at planning at openlylocal dot com). We’ll also be tweeting progress at @planningalert.

13 Responses

Is there any value in listing the local authorities you are able to scrape/acquire the data from? Highlighting those who do make data available, compared to those who don’t, is a good driver. The way that OpenlyLocal rated LAs’ openness really worked, and there was many a presentation given by compliant LAs in front of non-compliant ones, which I am sure had some effect – even if it just creates a conversation between them as to ‘how did you do that?’

Chris… you’ll need to speak to Mike (below) from Astun on that! Our feed is comprehensive though, and covers all (valid) applications for the last 90 days. We have been through a few iterations of SQL to ensure that nothing is missed from the feed and that it does indeed represent a true picture of planning apps.

James
That’s good to know, and should mean that the GeoRSS feed can be used in this case (although we’d need a way to get from the RSS items to the complete entries). This could be done by using the URL of the entries in the link element for the items. By the way, that URL appeared to be an HTML representation rather than an RSS feed, unless I misunderstood. Which is entirely possible 😉

We’re using the GeoRSS feed for Lichfield for the basic existence of a planning application, and then using their XML API to get the details. However, GeoRSS is not great, because it only shows you the most recent XX entries, and depending on how many planning applications there are a day, and what order the GeoRSS feed is in (date received, date validated, date updated, date visible to the public), new items might not appear on it, or might get missed by someone using it as a primary source for the data. It would be better to be able to get all the applications between a range of dates as XML or JSON.
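The rollover problem described above – the feed only holds the most recent entries, so a consumer has to poll and remember what it has already seen, and can still miss items that drop off between polls – can be sketched as a simple dedup loop. This is an illustration of the failure mode, not anyone’s actual implementation:

```ruby
require 'set'

# Track application links we've already processed across polls of the
# feed; anything not yet seen is treated as new. If an item appears and
# rolls off the feed entirely between two polls, it is silently lost -
# which is why a date-range API would be a better primary source.
seen = Set.new
new_items = []

poll = lambda do |links|
  links.each do |link|
    next if seen.include?(link)
    seen << link
    new_items << link
  end
end

# Two simulated polls of a feed that only shows the latest two entries:
poll.call(["/planning_applications/1", "/planning_applications/2"])
poll.call(["/planning_applications/2", "/planning_applications/3"])

puts new_items
```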


You could put them on a single page to be scraped, or make them available via an API. Or it might be just easier to email them to planning at openlylocal dot com 😉

Not yet, although it would be great if we (the community) could standardise the terms, and then formalise that into a vocabulary. We’ve come up with a list of core terms, based on many of the councils we’ve looked at so far, and Andrew will be blogging about that soon. I’m guessing you’ve done some standardisation too. Would be great to get your take on it.


