For our first venture to Scotland, where better to be than BBC Scotland! We had 8 teams of hacks and hackers digging around the Scottish data beat. For this very special occasion the ScraperWiki digger has donned tartan! With this special digger, fire incidents, planning applications, publicly-owned property and gifts councillors received have been mined. Here’s a word from our own Francis Irving:

Central Scotland Fire Service puts a lot of data on its website but, as usual, not in a very useful form. Only 60 incidents are shown on the site, but if you dig down there are over 15,000 buried records. In one day’s work, Fire Bugs scraped the records and decided to look at malicious false alarms. Luckily for them, the language and structure of the records were consistent. They found that 3.5% of all calls were malicious false alarms. They even made a tree map on ScraperWiki using Protovis. Fire Bugs have clearly opened up a huge amount of potential with this data.
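Because the records used consistent wording, a tally like the team’s is a one-liner once the data is scraped. A minimal sketch (the records and field names here are hypothetical, not the real scraped schema):

```python
# Hypothetical incident records of the kind scraped from the fire service site.
incidents = [
    {"id": 1, "type": "Malicious False Alarm"},
    {"id": 2, "type": "Dwelling Fire"},
    {"id": 3, "type": "Good Intent False Alarm"},
    {"id": 4, "type": "Malicious False Alarm"},
]

# Count incidents whose type matches the consistent wording, then take the share.
malicious = sum(1 for i in incidents if i["type"] == "Malicious False Alarm")
share = 100.0 * malicious / len(incidents)
print(f"{share:.1f}% of calls were malicious false alarms")
```

With the full 15,000+ records the same count gave the team their 3.5% figure.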

Edinburgh Planning App Map – This is Edinburgh’s first automated map of local planning applications! This is a popular theme for our hack days and on ScraperWiki in general. Open Australia are using ScraperWiki for their planning alerts.

As Michael MacLeod pointed out, people don’t know how to use the local council website. You can’t just type in your postcode to find applications near you. There’s a map online but it’s truly awful! The team scraped the site and made a map which updates every day, rather than just every week like the council’s. Michael used this new tool to take a closer look at his beat and found a planning application for urban paintball. What he duly noted was that the Facebook page was trying to be secretive about the location! Using the map, he found it was going to be right behind a block of flats. He will be talking to residents!

Hide by the Clyde – This project creates a map that lets the user compare exam results in different areas and correlate them with measures of social deprivation.

Public Buildings for Sale – a tool to show all publicly-owned property that is for sale or rent. This will be a Scottish sister to ScraperWiki’s brownfield sites map. The project aims to answer the question: how much public land is being sold without our knowledge?

In just one day the team managed to build a prototype for a “Mind how you go!” BBC Scotland site. They used road traffic accident reports based on 2005–2009 data to create a form that showed how likely you were to survive your journey depending on your age, sex and where you’re going! They built a spreadsheet from even more data so that the site had even more potential to go beyond the records. They also scraped Google searches of reported road traffic accidents and mapped the reports from BBC Scotland from 2010.

BME Scotland – This project aims to find out the effects of the recession on education. Is education a route to the ghetto? It aims to compare BME educational achievement with unemployment statistics to find out which areas of Scotland are economic no-go areas.

The lesson learnt here was that sometimes there’s not enough data to go around. What they do know is that the African population in Scotland is doing exceedingly well in education. However, it also has a relatively high level of unemployment. The result was a call for better data collection, as none of the information fitted together in a way that would help answer the question: why?

As it turned out, the council website is easy to scrape. The structure of the site is consistent and clean. ScraperWiki likes this! And so here is the scraper. As James pointed out, the data needs to be double-checked for misspelled entries, etc. But the preliminary data shows that Lothian Buses gave the most gifts and Phil Wheeler received the most gifts.
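The double-checking James mentions is mostly about normalising names before counting, so that a misspelt or differently-cased donor doesn’t split one tally into two. A minimal sketch with made-up rows (the real register’s fields will differ):

```python
from collections import Counter

# Hypothetical gift-register rows of the kind scraped from the council site.
gifts = [
    {"donor": "Lothian Buses", "recipient": "Phil Wheeler"},
    {"donor": "Lothian Buses", "recipient": "Another Councillor"},
    {"donor": "lothian buses", "recipient": "Phil Wheeler"},  # case variant
]

# Normalise donor names (trim whitespace, lowercase) before counting,
# so spelling variants collapse into one entry.
by_donor = Counter(g["donor"].strip().lower() for g in gifts)
by_recipient = Counter(g["recipient"] for g in gifts)

print(by_donor.most_common(1))      # most frequent donor after normalisation
print(by_recipient.most_common(1))  # most frequent recipient
```

Running the same counts over the real scraped register is what surfaced Lothian Buses and Phil Wheeler at the top.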

Magners Cider – This project aims to scrape the Magners League rugby scores. The team was a hack/hacker pair: Paul McNally (again, we love eager hackers!) and Tony Sinclair of BBC Scotland (who had to keep up the day job and so was not around for a picture). Apparently, a graphics operator had to type the information from the site by hand into the graphics system to produce the league tables you see on screen. Since the graphics software can read spreadsheets, Tony thought, “Why not automate the process by scraping?” And that is what they did. So the scores have gone from ScraperWiki to TV!
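The bridge from scraper to graphics system is just a spreadsheet-friendly file: scrape the table, write it out as CSV, and point the graphics software at it. A minimal sketch (the rows and column names are hypothetical, not the team’s actual scraper output):

```python
import csv
import io

# Hypothetical scraped league-table rows.
table = [
    {"team": "Munster", "played": 10, "points": 38},
    {"team": "Leinster", "played": 10, "points": 36},
]

# Write a CSV the graphics software can open as a spreadsheet.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["team", "played", "points"])
writer.writeheader()
writer.writerows(table)
csv_text = buf.getvalue()
print(csv_text)
```

In practice the scraper would refresh this file after each round of fixtures, replacing the operator’s hand-typing entirely.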