
Author: Ross

I recently had a few days off and managed to sort out the growing collection of photographs accumulating on my hard drive. The collection is almost 150GB, with 52,000+ image and video files spanning 10 years. I have used a variety of photo management tools over the years, including the Canon software that came with the camera, F-Spot, gThumb, iPhoto and digiKam (the tool of choice). The resulting mess of nested folders and sub-folders demanded some TLC. Thankfully I had a couple of backups on different disks as well as two live working copies, so I was safe in case I messed up.

Enter exiftool, a command-line tool to manage all aspects of your photo metadata.

I copied my collection to a scratch processing space year by year and processed them in chunks using a single line of exiftool wizardry:
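The original one-liner hasn't survived in this archive, so here is a sketch reconstructed from the description below; the /photos/in and /photos/out paths are placeholders, and the echo keeps it a dry run:

```shell
# Reconstructed sketch (placeholder paths):
#  -r                       recurse through the input directory
#  "-FileName<CreateDate"   move each file to a name built from its creation date
#  -d                       the date format used to build that name:
#     %Y_%m                 the YEAR_MONTH sub-folder
#     %Y-%m-%d_%H-%M-%S     the date-time prefix
#     %%f.%%e               the original filename and extension
CMD='exiftool -r "-FileName<CreateDate" -d "/photos/out/%Y_%m/%Y-%m-%d_%H-%M-%S_%%f.%%e" /photos/in'
echo "$CMD"   # dry run: prints the command rather than moving anything
```

Drop the variable and echo, and point it at real paths, to actually move the files.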

This command recurses (-r) through the input directory finding all supported image and video files. It moves the files to the output folder, creating a YEAR_MONTH sub-folder (%Y_%m) using the original creation date of the file to be moved. The creation date and time (%Y-%m-%d_%H-%M-%S) is prefixed to the original filename (%%f.%%e). For each year of photos I end up with 12 folders (2005_01, 2005_02, etc.) containing all the nicely sorted photos.

Exiftool also reports errors and any files it is unable to process; these remain in the input folders after processing, making it simple to check through them manually. I also had some success with the remnants using the last-modified date.
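For those leftovers, the same pattern works with the file-system timestamp instead of the EXIF creation date; again a sketch with placeholder paths and an echo dry run:

```shell
# Fall back to the file's modified date for files with no usable CreateDate
CMD='exiftool -r "-FileName<FileModifyDate" -d "/photos/out/%Y_%m/%Y-%m-%d_%H-%M-%S_%%f.%%e" /photos/in'
echo "$CMD"   # dry run
```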

Recently, I have been installing and configuring GeoNetwork (a spatial metadata catalogue) for use at work. It can use a GeoServer installation to display maps within the user interface. By default it uses map services from across the internet, but I wanted to display our Ordnance Survey mapping in the viewer. GeoNetwork has a config-gui.xml file for configuring the user interface, and there are two sections where the mapping can be configured.

I’m using my local GeoServer to serve EPSG:27700 (British National Grid) tiles, and I have linked that in to my GeoNetwork.
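A rough sketch of what that linkage can look like in config-gui.xml; the element and attribute names follow GeoNetwork 2.x conventions, and the URL, layer name, extent and resolutions are all illustrative values, not my actual configuration:

```xml
<!-- Sketch only: adapt names and values to your GeoNetwork version -->
<mapViewer options="{projection: 'EPSG:27700',
                     maxExtent: new OpenLayers.Bounds(0, 0, 700000, 1300000),
                     resolutions: [1400, 700, 350, 175, 87.5, 43.75]}">
  <layers>
    <layer server="http://localhost:8080/geoserver/wms"
           tocName="OS Base Map"
           params="{layers: 'os:basemap', format: 'image/png'}"
           options="{isBaseLayer: true}"/>
  </layers>
</mapViewer>
```

The second section (for the search map) takes the same layer entries.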

A couple of things to note here: you can have as many layers as you want, and you can set the display scales. You also need to set the WMS resolutions so that the map can be rendered at the appropriate scales: use the layer options to set the resolutions, and adjust the scales array to match. I got my resolutions and scales from GeoServer, where I have an EPSG:27700 gridset defined.

I recently had to refresh a live database with updated tables from a staging database, and then keep it updated on a daily basis. As it is a regular update and the source and destination tables won’t change, I generated a text file listing the layers to process and the tables to write. Like this:

list.txt
srcTable1, destTable1
srcTable2, destTable2
...

The first column is the list of layers in the staging database to process. This is the %G variable in the shell script. The second column is the new table to write, the %H variable.

The initial load read the layers from the staging database and created them in the live database. I set the progress flag so I could see it was doing something (this can be dropped), and set the geometry column and output schema.
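The loop described above used %G and %H in a Windows-style FOR loop; here is a rough bash equivalent as a sketch. The connection strings, schema and geometry-column names are assumptions, and the echo keeps it a dry run:

```shell
# Recreate the sample list.txt from the post
cat > list.txt <<'EOF'
srcTable1, destTable1
srcTable2, destTable2
EOF

# For each pair, copy the staging layer (SRC) into the live database as DEST
while IFS=, read -r SRC DEST; do
  SRC=$(echo "$SRC" | tr -d ' ')
  DEST=$(echo "$DEST" | tr -d ' ')
  # echo keeps this a dry run; remove it to copy the layers for real
  echo ogr2ogr -f PostgreSQL PG:"dbname=live" PG:"dbname=staging" "$SRC" \
    -nln "$DEST" -progress -lco GEOMETRY_NAME=geom -lco SCHEMA=public
done < list.txt
```

For the daily refresh the same loop can run with -overwrite to replace the destination tables.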

This can be translated as follows: for all the text files in the directory, read them in as NMEA format and write them out as GPX files in the gpx folder. The modifier strips the .txt extension from the filename and writes the output using the input name with a .gpx extension.
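That one-liner can be sketched like this; gpsbabel as the converter is an assumption, the original used a Windows FOR-loop modifier to strip the extension, and the echo keeps it a dry run:

```shell
mkdir -p gpx
for f in *.txt; do
  [ -e "$f" ] || continue            # skip cleanly if no .txt files are present
  # ${f%.txt} strips the extension, like the modifier in the original command
  echo gpsbabel -i nmea -f "$f" -o gpx -F "gpx/${f%.txt}.gpx"
done
```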

Together with the good folks at thinkWhere I have been organising the first Scottish QGIS user group meeting. It is happening on 19th March 2014 in Stirling at the Stirling Management Centre. Doors open at 9:30 with a 10:00 start. Registration is through Eventbrite (https://www.eventbrite.co.uk/e/scottish-qgis-user-group-meeting-tickets-10222881915) and there are 50 places available on a first come, first served basis.

The agenda will be published a bit closer to the time once speakers have been finalised. If you would like to present let me know as it would be good to have a mix of input to the day. There are both 20 minute and “lightning talk” 5-10 minute slots available.

A big thanks to thinkWhere for hosting this first QGIS event in Scotland and QGISUK for enthusiasm and passion.

Phew! What a whirlwind the last few weeks have been. Actually, the last eight months have been hectic. Mom, Dad and Great Gran arrive tomorrow for three weeks’ holiday. K turns three next week with a butterfly party and the whole family. D turned five last month with an Octonauts party, and Amy celebrated the last of her thirties. Kevin hit the early-mid thirties. E has started smiling and laughing, and also sleeping for more than three hours at a time. D has learned to use a computer mouse and to find LEGO video clips online.

There is a new climbing frame in the garden and the apple trees are overloaded with a bountiful crop this year. The garden shed needs a new roof and there is much to be done inside as well: new dishwasher, new TV, new coffee table, some painting, new carpet and much furniture shuffling to accommodate new bits and pieces.

Work is exciting, with new projects using open-source GIS applications and further development of the web GIS. Just need to get over the tiredness and sleep more…

I had a request for some “spider diagrams” showing the connections between service centres and their customers, and was given some sample data of about 140,000 records.

The data contained a customer ID, customer coordinates and a service centre ID. Using another table of service centres I was able to add, for each record, the service centre coordinates (eastings and northings on the British National Grid, EPSG:27700).
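The join-and-draw step can be sketched in two statements: copy the centre coordinates onto each customer row, then build one line per customer with PostGIS's ST_MakeLine. Table and column names are assumptions; the SQL is printed here rather than run against a database:

```shell
# Hypothetical PostGIS sketch of the "spider" construction (not the original SQL)
SQL=$(cat <<'EOF'
-- copy the service centre geometry onto each customer record
UPDATE customers c
SET centre_geom = s.geom
FROM service_centres s
WHERE s.centre_id = c.centre_id;

-- one straight line per customer, from its centre to the customer (EPSG:27700)
SELECT c.customer_id,
       ST_MakeLine(c.centre_geom, c.geom) AS spider
FROM customers c;
EOF
)
echo "$SQL"
```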

I have been using pgRouting for some accessibility analysis to various facilities on the network and experimenting with different ways of making the process faster.

My initial network had 28,000 edges, and solving a catchment area problem from one location to all other nodes on the network was taking 40 minutes on a 2.93GHz quad-core processor with 4GB RAM (Windows 7, PostgreSQL 9.2, PostGIS 2.0.3 and pgRouting 1.0.7). I put the query into a looping function that processed the facilities in order, but any more than 4 and the machine would run out of memory, as the complete solution is stored in RAM until the loop finishes.
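For context, a single facility's catchment query looks something like this; the sketch uses the pgRouting 1.x driving_distance signature, the table and column names and the node id and cost limit are assumptions, and the SQL is printed rather than executed:

```shell
# Hypothetical per-facility catchment query (pgRouting 1.x style, not the original)
SQL=$(cat <<'EOF'
SELECT *
FROM driving_distance(
  'SELECT gid AS id, source, target, cost FROM edges',
  1234,    -- node id nearest the facility (assumed)
  30.0,    -- maximum cost, i.e. the catchment limit (assumed units)
  false,   -- directed
  false    -- has_reverse_cost
);
EOF
)
echo "$SQL"
```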

First step: reduce the number of edges in the network to 23,000 and the number of nodes to 17,000 by removing pedestrian walkways, alleys, and private and restricted roads. Now the query is solved in about 12-14 minutes using about 200MB RAM per facility.

I am in the process of rendering a series of map tiles from the OS OpenData products using the gdal2tiles.py script (and an updated version that uses all cores on the machine to speed things up). The different raster products are rendered at different scales and then displayed in simple LeafletJS and OpenLayers demonstration applications.
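A typical invocation looks like the following sketch; the input file, output folder and zoom range are placeholders, and the command is printed rather than run:

```shell
# Dry run of a typical gdal2tiles.py call (placeholder paths and zoom levels);
# -p sets the tiling profile, -z the zoom range to render
CMD='gdal2tiles.py -p mercator -z 6-14 os_opendata.tif tiles/'
echo "$CMD"
```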