With the introduction over, the competition began, fuelled by the usual ScraperWiki promise of pizza and beer, plus Amazon vouchers for the winners, who would be chosen by our three judges: Sarah Hartley, editor of Guardian Local; Linda Broughton, head of nti Leeds; and Richard Horsman, associate principal lecturer at the Leeds Trinity Centre for Journalism.

Five groups formed around different areas of interest, all with a Leeds focus. Brownfield Research, by Greg Brant, Rebecca Whittington, Jon Eland and Tom Mortimer-Jones, set out to uncover the past, present and planned future of brownfield sites using scrapes of planning applications and change-of-use applications, combined with web chat and related documents. It also aimed to include each site's history of industrial disease, accidents and contamination.

Find Me, by software developer Marcus Houlden (@mhoulden), was a geolocation web application that displays the user's current location, address and postcode, with links to the nearest bus stops. He also started adding Yorkshire Water roadworks data.

The Leeds Pulse team scraped LiveJournal data to produce a web application, built on Django, showing negative and positive blogging attitudes across Leeds, drawing on 8,500 blog posts. It categorised posts containing “love”, “like” or “good” as positive, and those containing “hate”, “bad” or “meh” as negative. The judges certainly weren’t ‘meh’ about it, and chose it as the runner-up.
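The keyword approach described above can be sketched in a few lines of Python. This is purely illustrative, assuming the simplest possible scheme (count keyword hits and compare); the team's actual scoring code isn't shown in the post:

```python
# Minimal sketch of keyword-based sentiment tagging, as described in the
# write-up. The word lists come from the post; the scoring rule is assumed.
POSITIVE = {"love", "like", "good"}
NEGATIVE = {"hate", "bad", "meh"}

def classify_post(text):
    """Label a blog post positive, negative or neutral by keyword counts."""
    words = [w.strip(".,!?") for w in text.lower().split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

print(classify_post("I love living in Leeds, the food is good!"))  # positive
print(classify_post("Meh. The traffic is bad today."))             # negative
```

Run over a scrape of posts, a classifier like this gives each one a label that can then be aggregated by area, which is roughly what a "pulse" map needs.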

Leeds Uncut, however, scooped the overall prize (team pictured above). Suzanne McTaggart, Amna Kaleem (@amnakaleem), Nick Crossland (@ncrossland) and Michael Brunton-Spall (@bruntonspall), with some help from developer Martin Dunschen, created a map of the eight constituencies in Leeds to highlight how each is being affected by spending cuts and redundancies.

They also looked at job vacancies in each of the constituencies, to identify whether the creation of new jobs is offsetting the doom and gloom caused by spending cuts and job losses. Different shades of colour in the form of an “economic health thermometer” gave a visually effective overview of which constituencies are suffering the most and least.
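A "thermometer" shading like the one described can be sketched as a simple mapping from a constituency's score to a colour. The function and scale below are hypothetical, just to show the idea of turning a number into a shade between red (suffering) and green (healthy):

```python
# Hypothetical sketch of an "economic health thermometer": map a score in
# [lo, hi] (e.g. net jobs created minus jobs lost, normalised) to a hex
# colour running from red (worst) to green (best). Scale is illustrative.
def thermometer_colour(score, lo=-1.0, hi=1.0):
    """Return a hex colour for a score, clamped to [lo, hi]."""
    t = max(0.0, min(1.0, (score - lo) / (hi - lo)))  # 0 = red, 1 = green
    red = int(255 * (1 - t))
    green = int(255 * t)
    return "#%02x%02x00" % (red, green)

print(thermometer_colour(-1.0))  # "#ff0000" (deep red: heavy losses)
print(thermometer_colour(1.0))   # "#00ff00" (green: jobs growth)
```

Each constituency's shade could then be painted onto the map, giving the at-a-glance overview the judges praised.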

The data for the project was gathered from job websites, news websites, the Guardian’s Cutswatch page and the Office for National Statistics, which publishes monthly figures on how many people are claiming unemployment benefit/jobseeker's allowance, giving an indication of the number of new redundancies.

The three judges … were unanimous in deciding that the worthy winners had successfully collated trusted data and compiled an easy-to-use map visualisation.

£250 worth of Amazon.co.uk vouchers will be split among the winners and runners-up. An extra prize for the best scraper work, chosen by ScraperWiki’s Julian Todd, went to Matt Jones, who will continue to maintain the planning-data scraper.

With thanks to all our sponsors and helpers mentioned above, and additionally Leeds Trinity’s Catherine O’Connor and Imran Ali.

Twitter conversation was via the #hhhleeds tag, and see below for a visualisation of some of the geotagged tweets (courtesy of remote onlooker Tony Hirst, @psychemedia):

More links to be added as we spot them, and photographs are coming… Please email judith at scraperwiki.com with more material, or leave links in the comment section below. I’d especially like to add links to scrapers and data sets, so people can see how the projects were built.

Want to get involved? We’re still on tour! If you’d like to sponsor an event please get in touch with aine@scraperwiki.com.