<p>Sparkgeo | Telling stories using maps and the internet</p>
<h1>Vector Tile Server using PostGIS</h1>
<p>2018-07-26 | James Banting | <a href="https://www.sparkgeo.com/blog/vector-tile-server-using-postgis/">https://www.sparkgeo.com/blog/vector-tile-server-using-postgis/</a></p>
<p>We recently held a hackathon at Sparkgeo where we split into a couple of teams and turned an idea we had been toying with, but never actually had the time to implement, into a working product. Joe Burkinshaw, Parmvir Thind, and I were team MVT-MVP <em>aka</em> Map-Zuckerberg.</p>
<p>Our colleague and CTO, Dustin Sampson, was musing about creating vector tiles on the fly, directly from PostGIS. We thought this was a perfect task for a hackathon, so we went about implementing it using a few different technologies. Individually, we each have solid experience with a variety of technologies, but we wanted to get out of our element and try tech that we haven't worked with in depth. So each team member picked an area (devops, data transformation, database administration, etc.) where we did not have a lot of experience and set about changing that.</p>
<p>In the initial discussions, we decided that we wanted to use PostGIS, MapboxGL and AWS Lambda/API Gateway as the main components of our tileserver. This would be more of a backend-focused project, so we didn't get too detailed with cartographic style or data presentation. Function over form for this project.</p>
<h2><strong>Data!!</strong></h2>
<p>Joe Burkinshaw was able to grab some OSM data and load it into PostgreSQL. Using the PostGIS extension and the relatively new ST_AsMVT function, he set up a query that returns an MVT tile for given {x} {y} {z} URL parameters.</p>
<p>Initially, we set up a PostGIS instance on AWS RDS. It was super easy to get up and running; however, we ran into an issue because a protobuf C library was not included with the instance. This library is essential for PostGIS's vector tile creation function, ST_AsMVT.</p>
<p>This is a known issue and it may change in the future. Check for any progress here:</p>
<ul>
<li><a href="https://forums.aws.amazon.com/thread.jspa?threadID=277371">https://forums.aws.amazon.com/thread.jspa?threadID=277371</a></li>
<li><a href="https://dba.stackexchange.com/questions/202049/libprotobuf-c-support-on-amazon-rds">https://dba.stackexchange.com/questions/202049/libprotobuf-c-support-on-amazon-rds</a></li>
</ul>
<p>Instead, we got the PostgreSQL instance up and running on EC2 and managed it from there. The steps are documented below.</p>
<ol start="1">
<li>Get an AWS EC2 instance up and running with Ubuntu Server 16.04 as the OS: <a href="https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/launching-instance.html">https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/launching-instance.html</a></li>
<li>Connect to the EC2 instance with SSH. If you have not done this before, see the following steps.<ol>
<li>Install AWS Command Line Interface (the bundle): <a href="https://docs.aws.amazon.com/cli/latest/userguide/installing.html">https://docs.aws.amazon.com/cli/latest/userguide/installing.html</a></li>
<li>Configure AWS CLI: <a href="https://docs.aws.amazon.com/cli/latest/userguide/cli-chap-getting-started.html">https://docs.aws.amazon.com/cli/latest/userguide/cli-chap-getting-started.html</a></li>
<li>Connect to EC2 instance with SSH: <a href="https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/AccessingInstancesLinux.html">https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/AccessingInstancesLinux.html</a></li>
</ol></li>
<li>A quick way to get PostGIS up and running is with a Docker container.<ol>
<li>Install Docker on the EC2 instance. <a href="https://www.digitalocean.com/community/tutorials/how-to-install-and-use-docker-on-ubuntu-16-04">This guide</a> is useful.</li>
</ol></li>
<li>Create a Docker container with PostGIS (based on something from <a href="https://alexurquhart.com/post/set-up-postgis-with-docker/">this guide</a>):</li>
</ol>
<pre>docker volume create pg_data
docker run --name=postgis -d -e POSTGRES_USER=postgres -e POSTGRES_PASS=&lt;password&gt; -e POSTGRES_DBNAME=mvt_mvp -e ALLOW_IP_RANGE=0.0.0.0/0 -p 5432:5432 -v pg_data:/var/lib/postgresql --restart=always kartoza/postgis:9.6-2.4</pre>
<p>Great, we have our database set up and ready for use.</p>
<p>Next step is to add some data for testing. We grabbed OSM building data for BC from <a href="https://download.geofabrik.de/">Geofabrik</a> and used a command line tool to add the data to PostGIS. We ended up using <a href="https://github.com/omniscale/imposm3">imposm3</a> to achieve this, but other tools like osm2pgsql and ogr2ogr will also work with this source and destination. Now that we have our data set up, we need to provide a means of serving that data as vector tiles. These steps are outlined below.</p>
<ol>
<li>Add the <a href="https://www.mapbox.com/help/postgis-manual/#add-our-postgis-vector-tile-utility">Mapbox PostGIS vector tile utility</a> to PostGIS. This provides a function called 'TileBBox', which is required to determine the extent of a tile from its x and y coordinates and zoom level.</li>
<li>Create a new function in PostGIS (PgAdmin is useful for this). Note that the data is returned as a byte array - this is important for the AWS Lambda side of things:</li>
</ol>
<pre data-syntaxhighlighter-params="brush: java; gutter: false; theme: Confluence" data-theme="Confluence">CREATE OR REPLACE FUNCTION mvt_mvp_buildings(x integer, y integer, zoom integer, out mvt bytea) RETURNS bytea
AS $$
SELECT ST_AsMVT(q, 'buildings', 4096, 'geom')
FROM (
    SELECT
        id, name, type,
        ST_AsMVTGeom(
            osm_buildings.geometry,
            TileBBox(zoom, x, y),
            4096,
            256,
            false
        ) geom
    FROM osm_buildings
    WHERE osm_buildings.geometry &amp;&amp; TileBBox(zoom, x, y)
    AND ST_Intersects(geometry, TileBBox(zoom, x, y))
) q;
$$ LANGUAGE sql;</pre>
<p>And just like that, you have a function that returns vector tiles. This is great: we have the backbone of our tileserver!</p>
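<p>If you are curious what TileBBox is actually computing, it is the standard web-mercator tile arithmetic. Here is an equivalent sketch in JavaScript (illustrative only; in the database itself, use the Mapbox utility):</p>

```javascript
// Compute the EPSG:3857 (web mercator) extent of tile (zoom, x, y),
// mirroring what the TileBBox SQL function returns.
function tileBBox(zoom, x, y) {
  var max = 20037508.342789244;             // half the web-mercator world width
  var size = (2 * max) / Math.pow(2, zoom); // width/height of one tile at this zoom
  return {
    xmin: -max + x * size,
    ymin: max - (y + 1) * size,
    xmax: -max + (x + 1) * size,
    ymax: max - y * size
  };
}

// tileBBox(0, 0, 0) covers the whole world; each extra zoom level halves
// a tile's width and height, so tiles quarter in area per zoom.
```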
<p>Unfortunately, direct database requests are not the easiest to manage when building a URL, and setting up a full-blown tileserver can be overkill if your project only requires one layer. We need a slimmer way to serve our data.</p>
<h2><strong>Front end</strong></h2>
<p>While Joe Burkinshaw was crushing the DB admin side of things, Parmvir Thind set up a map, using Angular 4 and MapboxGL, for us to view our vector tiles in. He's a whiz at the front end and got that up in no time. Now we can show our beautiful vector tiles!</p>
<h2><strong>Enter the Lambda tile server and API Gateway.</strong></h2>
<p>This process isn't new. In fact, <a href="https://hi.stamen.com/stamen-aws-lambda-tiler-blog-post-76fc1138a145">Stamen</a> did a post about it a couple of years ago. In their project, they were serving out raster tiles and using Lambda to do the styling. The beauty of vector tiles is that the client can control the styling and pretty much all facets of geospatial data manipulation. Lambda simply acts as a way to set up a very simple geospatial API for the vector tiles. The cool thing with the Lambda API is that it can include basic authentication as well; it's extremely customizable. Stamen's article gives a great technical overview of building a Lambda function for geospatial processing, and there are a bunch of others as well: <a href="http://www.perrygeo.com/running-python-with-compiled-code-on-aws-lambda.html">Matt Perry's</a> tutorial and <a href="https://blog.mapbox.com/combining-the-power-of-aws-lambda-and-rasterio-8ffd3648c348">Vincent Sarago's</a> tutorial, to name a few.</p>
<p>After Parmvir Thind got the front end up and running, he jumped over and helped me set up a Lambda function using Python 3.5. We set up our connections to Joe Burkinshaw's database and didn't have any issues; we could make queries and see data coming back. Perfect.</p>
<p class="auto-cursor-target">Not so perfect was the fact that we could not get the MVT binary through Lambda and API Gateway without it being converted to a string. Every response through API Gateway came back with the byte literal prefix 'b' attached. Our tiles were not being served as MVT but as string literals. The mapping libraries (we tried all the big ones) could not make heads or tails of this data. Super annoying.</p>
<h2><strong>Wha-Wha</strong></h2>
<p class="auto-cursor-target">The final morning of the hackathon rolled around and we could only present that we almost made it. We had a game plan and a working vector tile creator; what we didn't have was enough time to implement the serving part of it.</p>
<p class="auto-cursor-target">The hackathon concluded and we went back to working on helping clients implement solutions to their problems.&nbsp;</p>
<p>....<br /> ....<br /> ....</p>
<p class="auto-cursor-target">This project kept nagging at us in the following weeks, so we decided to try a different route. Instead of Python we would use Node.js. Parmvir Thind is great with Node, and so he set about writing the program we would need.</p>
<h2><strong>Setting up a Lambda Function with Node.js</strong></h2>
<p>We needed a way to talk to the Postgres database. Lo and behold, there's a package for that.</p>
<p><strong>package.json</strong></p>
<pre data-syntaxhighlighter-params="brush: js; gutter: false; first-line: 1; theme: Confluence" data-theme="Confluence">{
  "name": "node_postgres",
  "version": "1.0.0",
  "description": "node postgres api",
  "main": "index.js",
  "scripts": {
    "test": "exit 1",
    "deploy": "--zip-file fileb://Lambda-Deployment.zip",
    "predeploy": "zip -r Lambda-Deployment.zip * -x *.zip *.log"
  },
  "keywords": [
    "postgres"
  ],
  "author": "Peter Hanssens",
  "license": "ISC",
  "dependencies": {
    "pg": "^6.1.2"
  }
}</pre>
<p>Now that we have the ability to talk to Postgres, let's see if we can get JavaScript to return the binary data. After adding the code below alongside the package.json file, we ran <code>npm install</code> and <code>npm run predeploy</code> to create a .zip file of the Lambda function. This zip file is what gets uploaded to Lambda. We used Node.js 8 as our runtime.</p>
<p>The function grabs the {x} {y} {z} parameters from API Gateway and uses them to query the database. The callback returns a base-64 encoded string containing the vector tile; this is how a Lambda proxy integration hands binary data back to API Gateway.</p>
<p><strong>index.js</strong></p>
<pre data-syntaxhighlighter-params="brush: js; gutter: true; first-line: 1; theme: Confluence" data-theme="Confluence">var pg = require('pg');

exports.handler = function (event, context, callback) {
  var conn = `postgres://&lt;username&gt;:&lt;password&gt;@&lt;ip&gt;/&lt;db&gt;`;
  var client = new pg.Client(conn);
  client.connect();

  // The {z} path parameter arrives with an extension (e.g. "12.mvt"), so strip it.
  var z = event.pathParameters.z.split('.');
  // Parameterized query: never concatenate URL input straight into SQL.
  var query = client.query('SELECT mvt_mvp_buildings($1, $2, $3);',
    [event.pathParameters.x, event.pathParameters.y, z[0]]);

  query.on('row', function (row, result) {
    result.addRow(row);
  });

  query.on('end', function (result) {
    // pg returns the bytea column as a Buffer; base64-encode it for API Gateway.
    var b = Buffer.from(result.rows[0].mvt_mvp_buildings);
    var s = b.toString('base64');
    callback(null, {
      statusCode: 200,
      headers: { 'Content-Type': 'application/octet-stream', 'Access-Control-Allow-Origin': '*' },
      body: s,
      isBase64Encoded: true
    });
    client.end();
  });
};</pre>
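<p>The base-64 step is worth isolating, since it is exactly what tripped up the Python attempt. A minimal sketch of the round trip (the bytes here are illustrative, not a real tile):</p>

```javascript
// Lambda side: raw MVT bytes (a pg bytea comes back as a Buffer) are
// base64-encoded into the response body...
var tileBytes = Buffer.from([0x1a, 0x03]);
var body = tileBytes.toString('base64'); // "GgM="

// ...and because the response sets isBase64Encoded: true, API Gateway
// decodes the body back to the original bytes before returning it, so
// the map client receives a binary MVT rather than a string literal.
var decoded = Buffer.from(body, 'base64');
console.log(decoded.equals(tileBytes)); // true
```

Note that API Gateway only performs this decoding for responses whose content type is registered as a binary media type on the API; that configuration detail is easy to miss.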
<p>We pointed the frontend to our API gateway running the lambda function and...</p>
<h2><strong>Success!!!</strong></h2>
<p>We have vector tiles being served through Lambda and ST_AsMVT!</p>
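<p>For reference, consuming the endpoint on the MapboxGL side looks roughly like this. The tile URL is a placeholder for the deployed API Gateway stage, and the path segment order follows the Lambda handler, where the final {z} parameter carries the .mvt extension:</p>

```javascript
// A vector source pointing at the Lambda-backed tile endpoint. The
// 'source-layer' below must match the layer name given to ST_AsMVT
// ('buildings' in our database function).
var buildingSource = {
  type: 'vector',
  tiles: ['https://example.execute-api.us-west-2.amazonaws.com/prod/{x}/{y}/{z}.mvt']
};

var buildingLayer = {
  id: 'buildings-fill',
  type: 'fill',
  source: 'buildings',
  'source-layer': 'buildings',
  paint: { 'fill-color': '#888888', 'fill-opacity': 0.6 }
};

// In the app:
//   map.addSource('buildings', buildingSource);
//   map.addLayer(buildingLayer);
```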
<p>This was a fun experiment for all of us, as we were exposed to different development areas which we normally do not work in. Our tile server has not been stress tested and is a proof of concept only.</p>
<p>That being said, we had a lot of fun working on this project and hope that you are able to find some value out of our trials and tribulations.</p>
<p><img src="https://s3-us-west-2.amazonaws.com/sparkgeo-blog-posts/MVT/Hackathon_MVT.gif" alt="MVT_MVP" width="680" height="350" /></p>
<hr />
<h1>MERMAID</h1>
<p>2018-06-08 | Dustin Sampson | <a href="https://www.sparkgeo.com/blog/mermaid/">https://www.sparkgeo.com/blog/mermaid/</a></p>
<p style="margin-bottom: 0;"><img src="/static/media/uploads/image.jpg" alt="" width="500" height="284" /></p>
<p style="font-size: 0.8em; margin-top: 0;">Photo credit: James Morgan / WWF.</p>
<p>Today is World Oceans Day (<a href="http://www.worldoceansday.org/">http://www.worldoceansday.org/</a>), a day when we are encouraged to think more deeply about our seas, oceans and the creatures and habitats that reside under the waves. Really, a single day isn&rsquo;t nearly enough; we should be thinking about our impact on our oceans a lot more than that.</p>
<div>Luckily, there are teams of dedicated scientists doing just that. At Sparkgeo we have been lucky enough to work with one such team. Today, the Wildlife Conservation Society, in collaboration with the World Wildlife Fund, launched MERMAID, a Marine Ecological Research Management AID. Together we have built a platform to support the capture of marine data in remote maritime environments.</div>
<div>&nbsp;</div>
<div><iframe src="https://www.youtube.com/embed/Wf45lJ4FMpU" width="533" height="400"></iframe></div>
<div>
<div>&nbsp;</div>
<div>We&rsquo;ve been lucky enough to work with the amazing WCS team for the last four years on a variety of impactful environmental projects.</div>
<div>&nbsp;</div>
<div>
<div>Check out MERMAID:</div>
<div><a href="https://datamermaid.org/">https://datamermaid.org/</a></div>
<div>&nbsp;</div>
</div>
<div>Find out more about the MERMAID project:</div>
<div><a href="https://medium.com/wcs-marine-conservation-program/technology-can-help-pick-up-the-pace-to-protect-coral-reefs-4d12608cd4fc">https://medium.com/wcs-marine-conservation-program/technology-can-help-pick-up-the-pace-to-protect-coral-reefs-4d12608cd4fc</a></div>
<div>&nbsp;</div>
</div>
<div>
<div>Find out more about the&nbsp;Wildlife Conservation Society:</div>
<div><a href="http://wcs.org">http://wcs.org</a></div>
<div>&nbsp;</div>
<div>Find out more about the World Wildlife Fund:</div>
</div>
<div><a href="https://www.worldwildlife.org/">https://www.worldwildlife.org/</a></div>
<hr />
<h1>COG - huh! What is it good for?</h1>
<p>2018-06-04 | James Banting | <a href="https://www.sparkgeo.com/blog/cog-huh-what-is-it-good-for/">https://www.sparkgeo.com/blog/cog-huh-what-is-it-good-for/</a></p>
<p>Absolutely&hellip; well, a lot actually.</p>
<p>Cloud Optimized GeoTIFFs (COGs) are GeoTIFF files, like satellite imagery, that have been processed in a manner that makes them easy to consume for internet processing applications (read: cloud). It is the brainchild of some very smart people at very smart organizations (Amazon, Planet, Mapbox, ESRI, USGS and the landsat-pds mailing list) and is supported by major processing software such as ESRI, Rasterio, GeoServer, QGIS and GDAL.</p>
<p>COGs are regular old GeoTIFF files of the kind we find throughout the GIS and Remote Sensing industry. The difference is that the file is optimized by pre-processing the original image into tiles and overviews, similar to how pyramids work when viewing GeoTIFFs in GIS software. It&rsquo;s different from traditional mapping tile servers, since those convert raster data into scaled RGB values, effectively losing the spectral (number of bands) and radiometric (bit depth of data) resolution. COGs preserve that data and allow us to perform analysis as if we had downloaded the raw dataset.</p>
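<p>The mechanics enabling this are plain HTTP range requests: because a COG's tiles and overviews sit at predictable offsets inside the file, a client can fetch just the byte ranges it needs. A hedged sketch in Node (the URL is a placeholder; in practice a reader such as GDAL's /vsicurl/ or Rasterio does the TIFF bookkeeping for you):</p>

```javascript
// Build an HTTP Range header for a byte window.
function rangeHeader(offset, length) {
  return 'bytes=' + offset + '-' + (offset + length - 1);
}

// Fetch only that window of the remote file. A COG reader first grabs
// the TIFF header and tile index this way, then only the tile ranges
// that intersect the area of interest - never the whole image.
async function readBytes(url, offset, length) {
  var res = await fetch(url, { headers: { Range: rangeHeader(offset, length) } });
  return Buffer.from(await res.arrayBuffer()); // HTTP 206 Partial Content
}

// e.g. readBytes('https://example.com/scene.tif', 0, 16384) reads only the
// first 16 KB of a potentially multi-gigabyte scene.
```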
<p>COG is not a new format; rather, it is a new implementation of a relatively old format. It is a roadmap for how best to serve raster data over the internet. It saves us from downloading an entire image if we are only interested in one small area. The caveat, and this is highlighted in @geo_will&rsquo;s <a href="/blog/image-be-gone/" target="_blank">post</a>, is that we still need to know which image we want data from. That is left as an exercise for image providers: to make their catalogs searchable using Areas Of Interest (AOI) instead of unique IDs. However, once we have that data, processing is much easier. I see this as a great first step into an imageless world - pixel paradise.</p>
<p>Satellite image providers DigitalGlobe and Planet already support COGs for accessing their data archives. This is great news for us image consumers if we want to run our applications at web scale. We need access to more imagery in order to provide more and better insight into our ever changing world. That access should come in the form of COG. From image chips for training Machine Learning models to temporal analysis of AOIs, COG provides the best avenue for accessing a diverse range of satellite images quickly and efficiently.</p>
<p>This is an exciting time for remote sensing professionals. We no longer have to tote around multiple external hard drives of our image catalog or perform massive data transfers to do analysis. We are moving towards ready access to imagery to help answer the questions we have.</p>
<p>I, for one, welcome our new COG overlords.</p>
<p>References:<br /><a href="http://www.cogeo.org/" target="_blank">COG website</a><br /><a href="https://trac.osgeo.org/gdal/wiki/CloudOptimizedGeoTIFF" target="_blank">GDAL COG</a></p>
<hr />
<h1>Image Be Gone</h1>
<p>2017-10-19 | Will Cadell | <a href="https://www.sparkgeo.com/blog/image-be-gone/">https://www.sparkgeo.com/blog/image-be-gone/</a></p>
<blockquote class="twitter-tweet" data-lang="en">
<p lang="en" dir="ltr">The image is to modern remote sensing what assembly language is to your laptop. Important, but you shouldn&rsquo;t (need to) interact with it.</p>
&mdash; Will Cadell (@geo_will) <a href="https://twitter.com/geo_will/status/918158506942898176?ref_src=twsrc%5Etfw">October 11, 2017</a></blockquote>
<p>&nbsp;</p>
<p>Images have been the currency of remote sensing since we started looking down at our Earth with sensors. Whether we were looking at stereo pairs for classifying stands of forest, or using Landsat images for measuring our changing landscape, the image has been an infrastructural necessity. Indeed, how on Earth would one do remote sensing without an image?</p>
<p>A typical work-flow would be to order images from a government person on the phone and wait for their delivery for manual interpretation, or you might hit up an FTP site and wait a few hours while images downloaded. This was all quite reasonable because the digital form of images has always been "big" in terms of file size and thus bandwidth. Due to this, imagery has always taken a fair bit of management and, of course, disk space. I am sure most self-identifying GIS* people reading this will have a (large) number of external hard drives laying around with archived projects consisting, for the most part, of images, or derivatives of images, or derivatives of derivatives of images.</p>
<p>The fact that the image, as a data construct, is painfully inadequate is just irrelevant. Images are a reality, like timezones or taxes. They are simply a necessary transaction to access what you really want: the pixels, the data they hold, and the insight they might reveal.</p>
<p>Hard-copy is one thing, but even assuming that our images are in fact digital, we still have enormous difficulty moving them around. We've built mythologies around download times and file sizes: the valiant analyst crunching data late into the night; the abilities of some to build more optimal network appliances for quicker access. This battle was constantly being fought against a panorama of higher and higher resolution imagery, with ballooning file sizes. Moore's law was helping, but in the end bandwidth simply hasn't kept up. Indeed, now we have cloud computing, which solves much of this problem, except that the data must still be moved from one cloud to another, and so often we are moving pixels we don't even care about.</p>
<p>File size isn't the only problem with the image; another is that an image has an arbitrary size. An image has at least four geospatially relevant resolutions: pixel, spatial, temporal, and spectral resolutions all play their part. Spatial, temporal and spectral resolutions are all characteristics of the platform capturing the data. A satellite or aircraft has a particular sensor and flies at a particular height. These features determine a platform's fitness for purpose.</p>
<p>The pixel resolution of an image, however, is completely arbitrary in the face of the image's purpose. Sure, there is typically a relationship between the pixel width of a sensor and an image's width. If a sensor is push-broom, then the image length is typically determined by the &ldquo;convenience of data transfer&rdquo;; satellite bandwidth is limited, so data is sent down in chunks. If the satellite sensor is a more traditional camera-based system, then length, like width, is determined by the sensor's dimensional characteristics. But my point is that this characteristic of an image does not support its fitness for purpose; it simply defines an arbitrary area of capture for the convenience of data capture and transfer.</p>
<p>But once that image is on the ground, why do we insist on it still being an image? The fact that an image has pixel x and y characteristics is largely due to the nature of the sensor itself, and not reflective of the environment we seek to measure.</p>
<p>That an image has a pixel resolution isn't really the issue. It is that we have to consider an image having an edge at all, or indeed that an image exists at all.</p>
<p>So, the issue is more that we have had to consider an area that is not of our own design, rather than our actual area of interest. The reason we have had to do that is because the pixels we want to look at belong to a construct of somewhat arbitrary size. The image is entirely irrelevant to our work-flow. The fact is we care about the pixels; we care about the reflectance values or sensory measurement of a particular piece of our planet; we care about location.</p>
<p>I feel deeply lucky and humbled to be part of our present geospatial age. Every few weeks we see new advancements in technology; whether it's Machine Learning here or micro-sats there, we have been witnessing the renaissance of remote sensing. This domain of dusty old aerospace boffins is emerging as both an increasingly incredible source of information about our planet and one of the "hottest" sectors for technology investment.</p>
<p>One of modern remote sensing's quiet macro trends is the incremental erosion of the image as the transactional currency of remote sensing, and I am rejoicing!</p>
<p>With both DigitalGlobe's GBDX platform and Planet's API moving towards an image-less society, soon it will be possible to simply request the pixels we want based on a snippet of Well Known Text, rather than having to access a strip or image: accessing the 300 pixels I want rather than the 30GBs of image I don't want. As we iterate over these ideas, we will see an evolution of images becoming imagery, and imagery becoming a pervasive globe of multi-temporal, multi-spatial, multi-spectral pixels from which we, as analysts, will be able to pick and choose the wavelengths of interest, the dates of interest, and the spatial resolution of interest, all based on a location.</p>
<p>Say goodbye to the days of picking images from catalogs.</p>
<p>Say goodbye to having to download four massive images because your AOI happens to be in just the wrong place.</p>
<p>Say hello to creating data products based on location.</p>
<p>&nbsp;</p>
<p>&nbsp;</p>
<p>*But (Oh my word!) what is GIS!?</p>
<hr />
<h1>Disruption in the Geospatial Age (pt2)</h1>
<p>2017-03-08 | Will Cadell | <a href="https://www.sparkgeo.com/blog/disruption-in-the-geospatial-age-pt2/">https://www.sparkgeo.com/blog/disruption-in-the-geospatial-age-pt2/</a></p>
<p>Continuing from <a href="/blog/disruption-in-the-geospatial-age-pt1/">Disruption in the Geospatial Age pt 1</a>, where we talked about classic disruption business models and in particular referenced Boundless, Esri and Mapbox, I wanted to highlight another example describing how Planet and DigitalGlobe have been reacting to a highly complex technology &amp; consumer environment.</p>
<p><strong>DigitalGlobe and Planet</strong></p>
<p>DigitalGlobe and Planet are another interesting example of potential geospatial disruption. DigitalGlobe (DG) is a hugely successful satellite imaging company. Largely supported through significant US government contracts to supply high-resolution imagery, DG provides the world with beautiful 30cm resolution imagery. This is a very high end product, and as a result somewhat expensive and highly sought after.</p>
<p>Planet has been launching a constellation of Doves, which are much smaller but more numerous satellites than any of DG&rsquo;s flagship WorldView series. Indeed, when the cost of a WorldView is measured in hundreds of millions and Planet&rsquo;s Doves are estimated to be in the hundreds of thousands each, there is a significant cost differential, even with the different expected lifecycles. In essence, Planet&rsquo;s Doves are consumable. By design they fall out of orbit more frequently and at first provided a significantly lower resolution, lower image quality product. But in no uncertain terms, this has been an intentionally disruptive play by Planet. Indeed, this is underlined by their planned ability to provide &ldquo;every day repeat coverage of every where on earth&rdquo;. That&rsquo;s a very big statement in the somewhat dusty remote sensing world. However compelling, it&rsquo;s not absolutely clear if this product has a market yet.</p>
<p>Observing the sparring between DG and Planet over Twitter, however, it is clear that each company has been seeking to differentiate: Planet talking about their ability to monitor, and DigitalGlobe highlighting their ability to provide much higher resolution imagery, focusing on that &lsquo;high end&rsquo; market. Planet is &ldquo;finding things&rdquo;, and DG is &ldquo;taking a closer look&rdquo;.</p>
<p>This week Planet launched another 88 Doves, and they continue to innovate on their sensor, eking out better performance with each launch. If DG had done nothing in response, it would seem that they could easily be subject to a classic disruption.</p>
<p>However, DG has been building an overt response in the form of GBDX, or their &ldquo;Geospatial Big Data&rdquo; platform. Instead of focusing on the pixels, DG wants us to look at the information we can derive from those pixels. They are making a big bet on imagery-derived analytics and it is absolutely what they should be doing. By leveraging their present advantage of a longer timespan image catalog, they are providing the tools for developers to build knowledge products. They are building the future of remote sensing, which will be less about a particular image or sensor, but instead be more about locations and the stack of information holding pixels available for that place.&nbsp;</p>
<p>Clearly, Planet is moving towards a platform approach too; with the proposed daily delivery schedule, there is no real option. But DG's evident interest and investment in this area to date indicates an overt interest in engaging with non-traditional remote sensing customers.</p>
<p>Evidently, disruption is happening all around us and continues to be a pervasive force in technology. At Sparkgeo we work with all the above companies and wish them all very well. We have always thought of ourselves as a geospatial Switzerland. This position allows us to provide an honest level of critical observation of our industry. We are deeply motivated by excellence in geospatial and, as a (vocal) part of the geospatial community, feel very lucky to have such a wealth of innovation happening around us.</p>
<p>Hooray for you!</p>
<hr />
<h1>Disruption in the Geospatial Age (pt1)</h1>
<p>2017-03-08 | Will Cadell | <a href="https://www.sparkgeo.com/blog/disruption-in-the-geospatial-age-pt1/">https://www.sparkgeo.com/blog/disruption-in-the-geospatial-age-pt1/</a></p>
<p>How grand I am to throw around terms like 'the geospatial age' as if I&rsquo;m some kind of luminary. It sounds wonderful that we might be in some kind of 'age', but really, what on Earth does that mean? Well, what I mean is that we as a technology community are closer to geospatial technology than we have ever been. Indeed, billions of us literally hold it in our hands much of our waking lives. As such, we are now seeing an enormous development in the expectation and exploitation of geography as a toolkit. Sometimes this is wonderful to witness and sometimes terrible, but nevertheless we see maps and coordinates everywhere, powering everything from restaurant reviews to rocket ships. Yes, our age is overtly geospatial.</p>
<p>Disruption, when regarded within the classic <a href="https://hbr.org/2015/12/what-is-disruptive-innovation">Harvard Business Review purview</a>, is when a smaller company outcompetes a bigger market incumbent by introducing a lower cost product into a previously underserved market, using that toehold to innovate, gradually pushing the incumbent up the margin triangle and forcing them to serve only progressively &ldquo;higher end&rdquo; customers. Thus, their growth slows and eventually they are replaced entirely.</p>
<p>Typically, disruption is hard to respond to. Initially that underserved market is seen as low margin and indeed uninteresting to the incumbent. In some cases businesses are even happy to give away that segment of the market because of the low profits involved. But over time as the disruptive force grows in market adoption and indeed competence, their initially low cost product becomes more generally attractive and they can become dominant. Leaving the previous incumbent to wonder what they did wrong.</p>
<p>There is a difference between disruption and sustaining innovation. For instance, Uber is a sustaining (potentially revolutionary) innovation of the traditional taxi market, not so much a <em>disruption</em>. You still have a driver in a vehicle; they are just being hailed differently, with better technology. The experience is arguably better, but ultimately Uber is a similar product offering at a comparable price. However, Uber is a potentially very disruptive force for the automotive industry, providing a real and much lower cost alternative to vehicle ownership.</p>
<p>This pattern has been witnessed <a href="https://techcrunch.com/2013/02/16/the-truth-about-disruption/">numerous times in Silicon Valley</a> <a href="https://en.wikipedia.org/wiki/Disruptive_innovation">and beyond</a>. But I want to specifically look at two active examples of seemingly disruptive forces in our geospatial world. I should also point out that I am not party to any of these organizations&rsquo; strategies. If I were, I would not be writing about them here. What is contained in this article is based entirely on a close observation of our geospatial age.</p>
<p><strong>ESRI and Boundless</strong></p>
<p><a href="/admin/blog/blogpost/58/esri.com">ESRI</a> has owned desktop and enterprise geospatial for literally decades. Indeed, even within the last few weeks they have been winning prizes for innovation from fast company magazine (*ref). This truly amazing company has been building geospatial technology for 40 years and in many people&rsquo;s minds have defined the term GIS (*).</p>
<p>ESRI&rsquo;s technology is typically seen as expensive. It is a thorough analytic suite, extensive in its reach, but that comes at a price. Luckily, most of our government agencies have been happy to swallow the expense to enjoy ESRI&rsquo;s robust geospatial product offering. Interestingly however, whilst dominating in government, agency and large enterprise environments, ESRI has had difficulty breaking into the technology market. &nbsp;Now, technology has always been an &ldquo;odd duck&rdquo; when compared to more structured organizations like state and federal government agencies, but with that oddness comes a rich pay off. (<a href="https://www.bcg.com/documents/file109372.pdf">$1.6 Trillion in the US if you believe the Boston Consulting Group</a>,&nbsp;and that was written before the automotive industry fell in love with connected vehicles and maps)</p>
<p>ESRI is a private company, so we can only really guess, but anecdotally with annual revenues of over $900 million, they have been enormously successful. But, given their difficulty in accessing the technology market and the manner in which the technology market is seeping into most other markets, growth will become (or already is) a problem for ESRI. There are simply a finite number of agencies in our world.</p>
<p>Google did make a foray into the geospatial world with their MapsEngine and EarthEngine products as extensions to the widely-adopted Google Maps product. However, with MapsEngine&rsquo;s deprecation and EarthEngine&rsquo;s relegation to more academic environments, it seems as if Google is stepping away from any sort of enterprise geospatial offering. This is underlined by Google&rsquo;s recent offloading of the TerraBella imaging small-sat constellation to Planet. In many ways, it seems as if ESRI won this tussle. But here lies the rub: consumer geospatial has emerged as an overtly different market from enterprise geospatial. So, in terms of Google&rsquo;s exit, were they pushed, or did they jump?</p>
<p>A consistent bugbear for ESRI has been the open source geospatial community. A community of technologists willing to build and share technology for free is a frustrating entity for a proprietary software company to deal with. This community has been, and continues to be, vocal, but the barriers to an individual&rsquo;s entry have been high. In reality, up until quite recently, if one had wanted an open source alternative to ESRI one would be left cobbling together a disparate series of tools and libraries and likely writing some glue code. In effect, one would have to be a competent computer scientist. However, that pendulum has begun to swing back.</p>
<p>Open source tools have always been competent in and of themselves, and some have even become the reference technology standard. But typically they have been beset with poor user interfaces, poor cohesion and the lack of a holistic message. This has been problematic for ESRI, but seemingly far from disastrous, given the need large organizations and enterprises have for single-source software solutions and enterprise support.</p>
<p>Indeed, perhaps here is the reason for ESRI&rsquo;s lack of penetration into the technology space as yet. With the technology sector&rsquo;s desire to own intellectual property and willingness to build, manage and further invest in their own technical differentiators, the idea of buying licenses for software that is critical to a funded startup&rsquo;s valuation seems unlikely.</p>
<p><a href="/admin/blog/blogpost/58/boundlessgeo.com">Boundless</a> geospatial has been around in various guises for some time. From Open plans to Opengeo to now Boundless this organization has been trying to make money from open source geospatial technology since the early 2000&rsquo;s. This is not an obvious path and indeed in many ways led them to being a largely services driven company for some time. Those services have typically included the implementation of alternative open source geospatial solutions (alternative to ESRI, that is). Of late however, Boundless has been focusing more on product, and that product comes in a combination of community and support.</p>
<p>Open source geospatial tools are a much lower cost product offering than ESRI&rsquo;s, and with Boundless&rsquo; willingness to provide an optional, robust enterprise support package on top of a curated selection of those open source tools, we start to see the inception of a disruptive product offering.</p>
<p>This offering is attractive to the tech sector because, with many open source tools, if the licensing is satisfactory then the technology company in question can make alterations to the code base of the tool. In doing so they will retain internal IP, avoid paying future license fees and avoid the risk of unwanted core technology changes. Indeed, they will also be largely immune to having their systems dependent on a third-party technology or a complex pricing strategy. All with the addition of a person to call or email in the event of a question.</p>
<p>Technology in our case is the largely underserved market, but the prize for Boundless is enterprise geospatial.&nbsp;</p>
<p>For these reasons ESRI must be looking at companies like Boundless and indeed Carto with a wary eye and considering how they can serve these underserved markets better, or risk wholesale disruption.</p>
<p>The options open to ESRI are numerous, but a key entry point must be the technology sector, where they have been losing mindshare. The likely best option here is the provisioning of their ArcGIS Online product as a more basic Platform as a Service (PaaS), which could be an elastic geospatial web platform. This idea of <a href="https://blogs.esri.com/esri/esri-insider/2015/12/08/localization-and-location-as-a-service-laas/">Location As A Service has been posited</a> before from within ESRI's own ranks. This would reduce the barrier to entry significantly. Few companies complain about the equivalent (non-geospatial) Amazon Web Services owning IP around elastic infrastructure. Instead, AWS provides a scalability safety net and the opportunity to grow with the technology company.</p>
<p>Some might argue that the construct of the &ldquo;ESRI credit&rdquo; in lieu of dollar pricing is a confusing barrier, somewhat equivalent to a casino issuing chips rather than using cash at their tables. But beyond that implementation detail, making geospatial web infrastructure easy seems like a solid strategy. Indeed, ArcGIS Online can largely provide this service now, but pricing, monitoring and the messaging are not startup friendly yet.</p>
<p>Interestingly, this product is akin to what <a href="https://www.mapbox.com">Mapbox</a> has been building for the last 3 years; indeed, it's also what <a href="https://www.crunchbase.com/organization/simplegeo#/entity">SimpleGeo</a> tried to do ages ago!</p>
<p>Please continue on to <a href="/blog/disruption-in-the-geospatial-age-pt2/">part 2</a>,&nbsp;where we take a little look at the remote sensing industry.</p>Woods Hole Research Center &amp; the Carbon Source2017-01-05T22:23:58+00:00Will Cadell/blog/author/will/https://www.sparkgeo.com/blog/woods-hole-research-center-the-carbon-source/<p dir="ltr"><img style="margin: 5px;" src="/static/media/uploads/website-WHRC-logo-with-rank.png" alt="" width="348" height="71" /></p>
<p dir="ltr"><strong>WOODS HOLE RESEARCH CENTER &amp; SPARKGEO&nbsp;</strong><strong>COLLABORATION ON THE CARBON SOURCE</strong></p>
<p dir="ltr">The Sparkgeo team received an early Christmas gift. We are ludicrously excited to be working with the <a href="http://whrc.org/">Woods Hole Research Center</a> on the creation of their Carbon Source experience.</p>
<p dir="ltr">At Sparkgeo we try to spend a good proportion of our time working with nonprofits. Organisations like the Woods Hole Research Center (WHRC) are genuinely trying to save the world, working relentlessly to prevent large scale carbon emissions. We are flattered and humbled to help them with their mission to mitigate severe climate change in our lifetime.</p>
<p dir="ltr">We will be working on a new project, which for now is called &ldquo;The Carbon Source.&rdquo; The project will endeavour to get WHRC&rsquo;s renowned climate science into the hands of those making decisions about our collective futures from national leaders, to local land managers to households. We all need to take ownership over our world, and what better way than getting a better picture of what the science is telling us.</p>
<p dir="ltr">Sparkgeo will be building map based visualizations to help the WHRC tell their stories of the Arctic, the Amazon, forest restoration and emissions across the world. In addition, we will be sharing deeper descriptions of the research being undertaken and providing access to raw data products for professional and citizen scientists to study further. The ability for our web map analytics tool, Maptiks, to measure user engagement on these web maps and ultimately provide quantitative visibility into the success of the project was a key factor in our selection.</p>
<p dir="ltr">Again, we are very excited to be working with the world class Woods Hole Research Center team and we look forward to keeping our followers up to date on our progress.</p>
<p dir="ltr">&nbsp;<img src="/static/media/uploads/IMG_5727.png" alt="" width="600" height="426" /></p>
<p dir="ltr">Trying to do our bit to keep our world beautiful!</p>Facebook, Here and another nail in Bing Maps&#39; coffin?2016-08-18T19:52:13+00:00Will Cadell/blog/author/will/https://www.sparkgeo.com/blog/facebook-here-and-another-nail-in-bing-maps-coffin/<p>I noticed a small change in the maps provided by facebook today. In the bottom left corner they say <a href="https://company.here.com/here/">Here</a>, not <a href="https://www.microsoft.com/maps/v8control.aspx">Bing</a>. <a href="https://www.facebook.com/sparkgeo/about/?tab=overview">Go take a look</a>.</p>
<p><img src="/static/media/uploads/facebook_here.jpg" alt="" width="600" height="406" /></p>
<p>This is interesting for a few reasons. Firstly, we should remember that <a href="http://mashable.com/2015/06/29/uber-bing-maps/#rF4Iy3kBKZqQ">Uber bought a significant amount of the Bing Maps team&nbsp;</a>&nbsp;about a year ago. After that, it seemed that the biggest 'thing' that Bing had going for it was its linkage to Facebook. In fact, there has been a fair bit of <a href="http://www.adweek.com/socialtimes/facebooks-places-location-service-is-using-maps-from-bing/245458">press</a> <a href="http://thenextweb.com/socialmedia/2010/08/19/facebook-solidifies-its-microsoft-partnership-bing-maps-being-used-in-places-app/#gref">on</a> <a href="http://www.zdnet.com/article/microsoft-facebook-are-collaboration-pals-will-the-new-bing-deliver-market-share-returns/">why</a>&nbsp;Facebook was still using <a href="https://www.quora.com/Why-does-Facebook-use-Bing-maps-instead-of-Google-maps">Bing over Google</a>, or any other service.</p>
<p>It was pointed out to me by&nbsp;<a class="account-group js-account-group js-action-profile js-user-profile-link js-nav" href="https://twitter.com/muthukumarceg" data-user-id="1140206832">Muthukumar Kumar</a>&nbsp;that Facebook signed the initial <a href="http://www.theverge.com/2015/5/4/8548399/nokia-here-maps-shows-up-on-facebook-instagram-messenger">Here deal in 2015</a>, but I don't think the move on the web has been live for that long (or I am particularly unobservant). Indeed, it seems that the <a href="https://techcrunch.com/2015/05/04/facebook-and-nokia-quietly-ink-deal-for-here-to-power-maps-on-mobile-instagram-and-messenger/">initial deal was on secondary apps like Messenger</a>&nbsp;etc. This move to the full platform obviously indicates a level of satisfaction with the flexibility and performance of the Here service.</p>
<p>One can only imagine that the dollar value of this deal is significant. For Here this is obviously an amazing opportunity, likely accompanied by a deep sigh of relief. Indeed, since <a href="http://fortune.com/2015/07/21/nokia-maps-here-german-cars/">Nokia sold them</a> <a href="http://www.forbes.com/sites/sarwantsingh/2015/08/05/here-acquisition-by-the-germans-opens-innovation-on-the-cards/#536f60844210">to a consortium of German companies</a>, Audi, BMW and Daimler, in December 2015, it seems like the troubled company has been executing very well in the automotive / automated vehicle space. Evidently, they are also showing expertise in the general basemap dissemination business too. I have heard it said, by those with more experience than me, that "selling map tiles is a race to the bottom", but it's clear that everybody in the web mapping game would like to have their sticker on Facebook's maps. Additionally, this indicates Here's capacity to provide enormous scale, which will be necessary when delivering maps to a multiplicity of vehicles.&nbsp;</p>
<p>On the other hand, even though the Bing Maps platform has announced <a href="http://blogs.bing.com/maps/August-2016/Bing-Maps-V8-July-2016-Update">a recent update</a>,&nbsp;this deal feels like <a href="https://www.microsoft.com/mappoint/en-us/home.aspx">one more nail</a> in the coffin of Microsoft's mapping services.&nbsp;</p>
<p><strong>EDIT (August 19th, 2016)</strong> I was contacted by <a href="http://www.twitter.com/rbrundritt">Ricky Brundritt</a>&nbsp;(Senior Program Manager on the Bing Maps customer advisory team) and he was able to provide some Microsoft perspective. Firstly, he noted that in terms of the Facebook / Here deal, Bing already uses Here as a source for its mapping fabric, so the choice likely had little to do with technical capability.</p>
<p>In terms of the Uber deal, Ricky said that the staff who moved were previously focused on the collection and processing of aerial imagery. Bing Maps has always licensed imagery from other data providers, so this line of business is unaffected, and the change has allowed Bing to focus on the delivery of enterprise spatial intelligence services.</p>
<p>Bing Maps' focus on spatial intelligence is intriguing; to date, hard spatial intelligence has been the domain of specialist map developers and libraries like <a href="http://turfjs.org/">turf.js</a>,&nbsp;and baking these analysis tools <a href="http://www.bing.com/api/maps/sdk/mapcontrol/isdk#binaryOperations+JS">directly into a map library</a> does serve to differentiate Bing. That said, I am sure Mapbox's response would be their <a href="https://www.mapbox.com/blog/series-b/">lego block philosophy</a>.</p>
<p>Interesting times indeed.</p>Git for the world2016-07-28T19:38:58+00:00Will Cadell/blog/author/will/https://www.sparkgeo.com/blog/git-for-the-world/<p>I had another good long discussion with&nbsp;<a href="https://twitter.com/bmann">@bmann</a>&nbsp;this morning. We talked about automated vehicles, augmented reality and the implications for the geospatial industry. Which, FYI, will be profound.</p>
<p>A few minutes later he tweets this:</p>
<p><a href="http://twitter.com/bmann/status/758724373549199360" target="_blank"><img src="/static/media/uploads/boris_geogit.png" alt="" width="504" height="160" /></a></p>
<p>That guy... Anyway.</p>
<p><a href="https://github.com/locationtech/geogig">Geo Git</a> is far from being a "new thing". The concept being that of applying the <a href="https://en.wikipedia.org/wiki/Git_(software)">git code versioning process</a> to geospatial data has been realised as&nbsp;<a href="http://geogig.org/">Geo Gig</a>&nbsp;and been sheperded by <a href="http://boundlessgeo.com/whitepaper/new-approach-working-geospatial-data-part-1/">Boundless</a>&nbsp;to the point it is at now. I am not going to comment on the state of the project. I honestly don't know. But I will comment on the concept. This idea will become a key supporting structure of our future augmented world.</p>
<p>Take a moment and consider. After Pokemon, there will be more augmented applications, connected vehicles are a great example. There have been <a href="http://www.newsday.com/long-island/suffolk/pokemon-go-poses-tragic-real-world-risks-officials-warn-1.12036071" target="_blank">tragic Pokemon events</a>, but considering the <a href="http://mashable.com/2016/07/25/pokemon-go-daily-active-users-slipping/#V.2l7La6P8qF">number of users</a>&nbsp;(peaking at 25 million daily active users), the number of problems appears to be small. Obviously every tragic event is hugely distressing to all concerned, but imagine if the augmented application was driving your vehicle at 100kph (yeah, that's in Canadian). Lots could go wrong, much faster.</p>
<p>Augmented experiences need to be based on real-life data. They need to know where the streets are, where the medians are, where the stop lines are. They will need to know a great deal more than that. Now, of course the car is also a sensor, but the base data needs to exist prior to live capture; in the car example, we have stopping distances to consider. Detecting a stop line 10 meters ahead when travelling at 100kph is a bad scene. Data points like that need to be known in advance of the first vehicle detecting the stop line. I have talked about <a href="/blog/pokemon-automated-cars-augmented-reality-and-the-rise-of-the-geo-boffins/" target="_blank">data's influence on AR recently</a>; here I want to consider further the temporal nature of geospatial data.&nbsp;</p>
<p>Data, my good friends, does not remain stationary. Like the world around us, data changes. It's a simple thing to consider: data is a representation of our habitat, our habitat is always changing so data representing it should always change too.</p>
<p>Conceptually the solution might also appear simple. We just apply any changes to our base datasets as we go. Hell we could even pull in new snapshots of a data product.&nbsp;Well yes, and no. The thing about data is that it has personality;&nbsp;<a href="https://www.linkedin.com/pulse/data-opinionated-will-cadell?trk=pulse_spock-articles" target="_blank">Open data in particular is opinionated</a>&nbsp;and <a href="/blog/disparate-data-technology-fiefdoms-and-65-pictures-of-your-cat/" target="_blank">messy</a>;&nbsp;data is very human, somewhat reflective of the organisation that created it. This means that we are left with a literal Tower of Babel. Not only are there many different shapes and sizes of data products, but each product can also be of a significant size and somewhat dirty in and of itself. Indeed, we now have alternative sources like hyper-temporal remote sensing to further support this live firehose of change.</p>
<p>As such, geospatial data management goes from being expert to insane mode very quickly. This is precisely the reason that Google Maps etc will never be done and that <a href="https://newsroom.uber.com/mapping-ubers-future/" target="_blank">Uber continues to invest in geospatial</a>.</p>
<p>We need to be looking hard at technologies like GeoGig again. If we can apply a&nbsp;<a href="https://en.wikipedia.org/wiki/Git_(software)" target="_blank">git style</a>&nbsp;methodology to our geospatial data, we will be very much closer to some kind of reasonable, live, vector mapping fabric. Again, the sticking point will be adoption. Custodians of those data products, which are becoming so valuable, will themselves have to see value in the adoption of a common schema and a common API.</p>
<p>A common schema because generalisation must and will happen. A road in one US county is much the same as a road in another US county, except that, if one were just to look at the data, they would appear to be quite different.</p>
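<p>To make the schema problem concrete, here is a hedged sketch (the attribute names are invented for illustration, not taken from any real county's data model) of translating two organisations' differing road records into one common vocabulary:</p>
<pre>
# Two hypothetical publishing organisations, each with its own road schema,
# mapped (local attribute -> common attribute) into a shared vocabulary.
COUNTY_A_MAP = {"RD_NAME": "name", "SURF_TYPE": "surface", "LANES": "lanes"}
COUNTY_B_MAP = {"StreetName": "name", "Surface": "surface", "NumLanes": "lanes"}

def to_common_schema(record, mapping):
    # Keep only the attributes we can translate; everything else is opinion.
    return {common: record[local]
            for local, common in mapping.items() if local in record}

a = to_common_schema({"RD_NAME": "5th Ave", "LANES": 2}, COUNTY_A_MAP)
b = to_common_schema({"StreetName": "Main St", "Surface": "paved"}, COUNTY_B_MAP)
print(a)  # {'name': '5th Ave', 'lanes': 2}
print(b)  # {'name': 'Main St', 'surface': 'paved'}
</pre>
<p>Under a shared schema, every new publisher costs one more mapping table rather than one more bespoke pipeline.</p>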
<p>An API because data updates need to trigger external events. Unless we are to see a literal doubling of web traffic caused by bots testing for new public data products, the custodians of those modern open data products will need to have an API of change: a trigger indicating to the community that there is new or changed data to harvest. We absolutely need to move away from the idea of "quarterly releases" and towards the idea of a persistently live fabric where small updates might be published daily or hourly. &nbsp;</p>
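<p>One minimal shape such an "API of change" could take, sketched under my own assumptions (the class and field names are illustrative, not an existing standard): an append-only feed of change events that consumers harvest from a cursor, instead of re-downloading the whole product on a schedule.</p>
<pre>
import time

class ChangeFeed:
    """An append-only log of dataset-change events, polled by cursor."""
    def __init__(self):
        self.events = []

    def publish(self, dataset, change_type, feature_ids):
        event = {
            "seq": len(self.events) + 1,      # dense, 1-based sequence number
            "dataset": dataset,
            "type": change_type,              # "add" / "modify" / "delete"
            "features": feature_ids,
            "published_at": time.time(),
        }
        self.events.append(event)
        return event

    def since(self, cursor):
        # seq numbers are dense and 1-based, so slicing by cursor works.
        return self.events[cursor:]

feed = ChangeFeed()
feed.publish("roads", "modify", ["road-2"])
feed.publish("roads", "add", ["road-3"])
# A consumer that last harvested seq 1 only fetches the new event:
print([e["seq"] for e in feed.since(1)])  # [2]
</pre>
<p>Whether the cursor is polled or pushed over a webhook, the point is the same: small, frequent, addressable updates rather than quarterly snapshots.</p>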
<p>So, our augmented future will change how we look at open geospatial data. Indeed, we will have to start thinking very hard about how temporal data management can be reasonably achieved.</p>
<p>Before you say it can't be done, pause. You know it will be, it's just a matter of time. Everything is impossible, right up to the day someone does it. &nbsp;</p>Pokemon, Automated Cars, Augmented Reality and the Rise of the Geo Boffins2016-07-14T20:18:02+00:00Will Cadell/blog/author/will/https://www.sparkgeo.com/blog/pokemon-automated-cars-augmented-reality-and-the-rise-of-the-geo-boffins/<p>Firstly, I blame <a href="https://twitter.com/bmann">@bmann</a> for this post:</p>
<p><img src="/static/media/uploads/boris.png" alt="" width="400" height="586" /></p>
<p>The crux is simple.&nbsp;</p>
<p>GIS was the domain of map <a href="https://en.wikipedia.org/wiki/Boffin">boffins</a>, then we got smart phones, and it became cool. Then we got open data, and suddenly GIS became a context for the rest of our augmented existence. (Hello <a href="http://www.7wdata.be/futurism/new-artificial-intelligence-beats-tactical-experts-in-combat-simulation/">SkyNet</a>, goodbye world, the end).&nbsp;</p>
<h2>Consumer Geospatial</h2>
<p>Geospatial technology through the early 2000s had become a backroom activity. Web forums were beset with questions like &ldquo;<a href="https://mangomap.com/what-is-gis" target="_blank">What is GIS?</a>&rdquo; and &ldquo;<a href="http://gis.stackexchange.com/questions/653/how-do-i-explain-what-gis-is-for-the-11-year-old-kid">How</a> <a href="https://blogs.esri.com/esri/esritrainingmatters/2013/04/30/the-what-is-gis-challenge/">to</a> <a href="http://gis.stackexchange.com/questions/21459/how-do-you-explain-what-gis-is">describe</a> <a href="https://www.reddit.com/r/gis/comments/19fy7g/rgis_how_do_you_explain_your_job_to_someone_who/">GIS</a> <a href="https://justthegistofit.wordpress.com/2009/08/09/how-to-explain-gis-to-your-grandmother/">at</a> <a href="https://www.quora.com/How-do-you-explain-GIS-to-someone-who-has-never-heard-of-it">dinner</a> <a href="https://www.gislounge.com/what-is-gis/">parties</a>&rdquo;. People who did GIS knew it was cool and sometimes really hard, but the industry itself didn&rsquo;t know what it was, or what it stood for.</p>
<p>In reality there was very little &ldquo;consumer geospatial&rdquo; beyond <a href="https://www.mapquest.com">MapQuest</a> and road maps. Then came <a href="http://maps.google.com">Google Maps</a>, and we all know that story. With the technology enablers of better browsers, better compute on the web, and better devices we have seen the rise of consumer geospatial applications. Apps like <a href="http://life360.com">Life360</a>, <a href="http://www.nextdoor.com">Nextdoor</a> and <a href="http://www.yelp.com">Yelp</a> have brought maps and mapping tech into the hands of everybody else. I use these apps, because they are overtly <em>not</em> navigational, the geospatial parts are intrinsic, but contextual. That&rsquo;s important. These are not &lsquo;maps&rsquo; but purposeful experiences using high quality maps to support their business goals. <a href="http://uber.com">Uber</a> would be another great example; we&rsquo;ll come back to them.</p>
<p>In concert with this rise in geospatial, we have seen an increasing appetite for openness and transparency in government. This has resulted in a great deal of data available to the technology-aware amongst us. You have to understand though, the data is open, but it might not be good.</p>
<p>The greatest misnomer in technology is that data will solve everything. Especially when we mix in a bit of machine learning magic. What we rarely talk about is that often data is wrong, dirty, or at the very least <a href="/blog/data-is-opinionated/">opinionated</a>.</p>
<p>Now, the furious tempest that is the mid-2010s technology scene has jumped on these two concepts and has been jamming them together, resulting in numerous different techno-creations. This digital love-in has produced the job title &ldquo;data scientist&rdquo; and some amazing visualization work, but more so a good understanding of how hard and big &lsquo;data&rsquo; is.&nbsp;</p>
<h2>Augmented Reality</h2>
<p>Now, I&rsquo;m getting to the point, Pok&eacute;mon Go (I know, what a band-wagoner, again it&rsquo;s <a href="https://twitter.com/bmann">@bmann</a>&rsquo;s fault).</p>
<p>Pok&eacute;mon Go uses a map, and somewhat aging augmented reality techniques extensively. It is an enormously awesome example of what consumer geospatial can deliver. The map is there, it&rsquo;s essential, but it is really only context to the activity.</p>
<p>You know what else fits that description? Your connected car next year, and your autonomous car in five years. Better than that, we will have persistent augmented experiences. When I plan a route on my computer it will be delivered to my car and mobile device, because navigation doesn&rsquo;t stop at the edges of one particular device&rsquo;s abilities; it stops when I reach my destination.&nbsp;</p>
<p>Apple made a small announcement at <a href="http://mobilesyrup.com/2016/06/13/apple-announces-maps-is-now-open-to-developers-at-wwdc-2016/">WWDC 2016 that their Maps API</a> would be more available to developers. A small announcement, a big network effect: persistent augmented reality.</p>
<p>Your augmented future will not be linked to any particular device but instead all your devices. Siri or whomever will be a consistent companion and she will point you at all the Pok&eacute;mon you could ever want. But given our data quality issues, she might be wrong, or there&nbsp;<a href="http://www.cbc.ca/news/canada/british-columbia/google-maps-glitch-park-fort-george-provincial-park-trailhead-1.3623990">might be a river in the way</a>.</p>
<p>For us geospatial people, this will likely be a large piece of our reality. Because every piece of what I just talked about involves a ludicrous amount of geospatial effort.&nbsp;</p>
<h2>The Tower of Open Babel</h2>
<p>Firstly, open data. It is open, but it is not free. The cost comes in the form of the effort in using it. Organizations across our blue Earth publish data. For simplicity, let us only consider road data. None of these published road networks are &ldquo;good&rdquo;, mainly because there is rarely a definition of &ldquo;goodness&rdquo;. Additionally, there are always exceptions to the case, which allow &ldquo;non-good&rdquo; roads to slip through. When I talk about <a href="/blog/data-is-opinionated/">data having human characteristics</a>, this is what I mean. Even better, one organization publishing road data does not do it in the same manner as their neighboring organizations. Not only do we not know what &ldquo;good&rdquo; is locally, we also have no idea what it is globally. So we are multiplying our efforts in a non-linear manner by the number of organizations we need to engage to aggregate the data we need.</p>
<p>It would not be unreasonable to compare open data to the<a href="https://en.wikipedia.org/wiki/Tower_of_Babel"> Tower of Babel</a>. We&rsquo;ve gotten a little too close to some kind of data utopia and found that a higher power has seen fit to ensure we all speak different languages. Indeed different formats, encoding, attributes, spatial reference systems&hellip; Thwarting our efforts to reach our own knowledge singularity.</p>
<p>It is left to parties who care about putting data together globally to determine how to account for &lsquo;goodness&rsquo; and manipulate each organization&rsquo;s data accordingly. So data from each organization gets manipulated into a common form for ingestion.</p>
<p>But hold on, data also changes. There is a 4<sup>th</sup> dimension to our Earth and we need to take that into account. Change happens over time, so unless we want our autonomous vehicles to drive into that new median, we can&rsquo;t miss the road updates. We had better test for and consume all changes regularly.&nbsp;</p>
<p>So far the problem for roads is&nbsp;</p>
<pre>org&rsquo;s road data * every org publishing roads * time == a ludicrous amount of geospatial work&nbsp;</pre>
<p>That is just for roads, the list of other sources of interest is extensive. &nbsp;Don&rsquo;t forget also that data is opinionated so we might want more than one source for any particular place. Google has been doing this for a decade, Apple for years now, Uber is now in on the action too, and none admit to being close to finishing the job, because like the rest of the web, geospatial will never be "done".</p>
<p>Uber is particularly interesting given that, at first blush, their map is a supportive technology. Indeed, the consumer experience is still largely Google Maps. Yet, they <a href="http://www.forbes.com/sites/amitchowdhry/2015/03/04/uber-acquires-decarta/#7bbfa5574de5">consume</a> <a href="https://www.uber.com/careers/list/?keywords=maps">geospatial</a> <a href="http://www.theverge.com/2015/6/29/8863687/uber-acquires-mapping-data-tech-and-talent-from-microsoft-bing">talent</a> at a startling pace. Evidently, to Uber, understanding the Earth's transportation networks (and perhaps more) is hugely important. Presently they augment their drivers' realities; in the future one would expect them to augment, frankly, those of most of our connected society.</p>
<p>When I look at Pok&eacute;mon I see a few things. Firstly, I see what Augmented Reality has always been promising, wrapped up in a fun game. Secondly, I see a growth in consumer geospatial. Thirdly, I see a public that is unwilling to accept cartographic / data errors inside their consumer geospatial experiences. Finally, I see a bunch of geo boffins saying they make apps like Pok&eacute;mon Go at dinner parties. At last their friends will get it, and it will be cool.&nbsp;</p>Data is opinionated2016-07-13T23:24:29+00:00Will Cadell/blog/author/will/https://www.sparkgeo.com/blog/data-is-opinionated/<p>Geospatial data in particular is opinionated. Now, opinion isn&rsquo;t always a great thing. Opinion doesn&rsquo;t mean you know better, it just means you have a particular view of the world and feel comfortable in maintaining that perspective.</p>
<p>We live in an age where strong opinion seems to be celebrated; whether that opinion is founded in fact or fiction is largely irrelevant. Perhaps, this context has led us to a place where we are comfortable just seeking single opinions on a subject. Obviously as geospatial professionals this habit is dangerous.</p>
<p>My initial assertion that data has opinion is an extension of the idea that data comes from somewhere. Whether that is an imagery source or the crowd or a survey. Typically there is a single source and we stick to it. We loosely consider &lsquo;error&rsquo;, but do we consider opinion?</p>
<p>Opinion needn&rsquo;t be erroneous, but instead it indicates a perspective from which a particular data point is created. The danger of perspective is that not every viewpoint will capture the entire reality.</p>
<p><strong>Image Opinion</strong></p>
<p>In terms of imagery, we could consider the spectral, spatial and temporal resolutions of a particular sensor as facets of its opinion. That sensor will be limited in its perspective of the Earth by those features of its design. In reality, should we, as pursuers of the Earth&rsquo;s representation, only consider one opinion of our Earth? Is the opinion we choose to listen to always correct? Is it the&nbsp;<em>best</em>&nbsp;opinion? Are we reaffirming our own confirmation bias by not looking beyond the specifications of our chosen image source?</p>
<p>In practical terms, I am talking about our ability to leverage multi-sensor remote sensing. Multi-sensor anything is hard, really hard. We have to consider different orbits &amp; altitudes, different times of the day, different atmospheric correction, different resolutions. But in those problems lies&nbsp;the opportunity. More opinions lead to a better understanding of an&nbsp;issue. Additionally,&nbsp;<a href="/blog/the-great-space-race-pt1/" rel="nofollow" target="_blank">we have constellations and constellations</a>&nbsp;of interesting Earth observation satellites being launched. We now have the ability to draw real imagery conclusions about things happening today from the clouds of opinion in our orbit. We just need to start doing it.</p>
<p><strong>Vector Opinion</strong></p>
<p>Imagery is largely a matter of the interpretation of differing perspectives. With vector data, opinion is even more pertinent. The interesting thing about vector data is its far greater ability to be just plain&nbsp;<em>wrong</em>. Consider the humble road network. Is one source of this data ever truly correct? Are we willing to bet our autonomous vehicle future on a single source of road data?</p>
<p>No, that would be crazy talk.</p>
<p>Sure, who wants lots of different lines telling us the same thing? Our software friends will throw up their arms and say "redundancy!" But what if we find that the data doesn&rsquo;t agree?&nbsp;Who is right? What is rightness? This speaks to the&nbsp;<a href="/blog/disparate-data-technology-fiefdoms-and-65-pictures-of-your-cat/" rel="nofollow" target="_blank">Tower of Open Babel</a>&nbsp;we data people have created. I warn our&nbsp;clients that&nbsp;when we are considering vector sources we should think about those sources&nbsp;as extensions of the characters who created them. The cast of characters and their view of the world invariably has a huge impact on what that&nbsp;data source is capable of telling us.</p>
<p>Indeed, opinion in vector data is so marked we should always be wary of any analysis comprising single sources of data on a particular subject.</p>
<p>I implore you, observe how opinionated your data is, and consider seeking a 2nd opinion.</p>Apple Maps API2016-04-20T20:13:41+00:00Will Cadell/blog/author/will/https://www.sparkgeo.com/blog/apple-maps-api/<p>How Exciting!</p>
<p>I'm not sure if this was a deliberate Easter egg in Apple's <a href="https://developer.apple.com/wwdc/attending/" target="_blank">WWDC</a> website (I have a sneaking suspicion it might be), but we've just had a glimpse of an Apple Maps web API. Traditionally, Apple Maps has been available only in native iOS apps. It seems that might be about to change. As spotted by <a href="http://fruitymaps.com/" target="_blank">Tim Broddin</a>&nbsp;and blogged about by <a href="http://googlemapsmania.blogspot.ca/" target="_blank">Google Maps Mania</a>, there is now a version of Apple Maps available for browsers. Here is my first, rather poor attempt at an Apple Map:</p>
<p><a href="http://snippets.sparkgeo.com/applemaps/"><img style="vertical-align: middle;" src="/static/media/uploads/applemap.png" alt="" width="600" height="586" /></a></p>
<p>The API seems much like other JavaScript mapping APIs, except with a few interesting in-built features, such as geolocation. The cartography feels very much like the Apple Maps product, but the main mapping fabric still seems to be largely tile based, though I wonder if there is some "vector-tile" like behaviour hiding in there. In general the API is intuitive enough to make the basics happen without any docs at all. Hopefully this provides some insight as to what to expect at the WWDC for mapping.</p>
<p>Or maybe the <a href="/">Sparkgeo</a> map geeks are just getting all excited over nothing!</p>
<p>&nbsp;</p>Maptiks Map Analytics Demo2016-04-12T18:57:13+00:00Will Cadell/blog/author/will/https://www.sparkgeo.com/blog/maptiks-map-analytics-demo/<p>As we all know, a demo is worth at least a thousand words. Earlier this week I put together this client-side demo to help our clients get a feel for the kind of things we can do with <a href="https://maptiks.com">Maptiks</a>. I wanted to share it with all our Sparkgeo friends too. Welcome to the future of web maps.</p>
<p><iframe src="https://snippets.sparkgeo.com/maptiks/" frameborder="0" width="100%" height="1250"></iframe></p>
<p>Of course, Maptiks is doing a great deal more than just this, but I hope to get your creative juices flowing with this one quick demo. Have fun!</p>The great space race pt.22016-03-29T22:08:09+00:00Will Cadell/blog/author/will/https://www.sparkgeo.com/blog/the-great-space-race-pt2/<h1>Pt.2 Image is not everything</h1>
<p>This is the 2nd part in a series on the commercial satellite imagery sector. It's a wild ride, <a href="/blog/the-great-space-race-pt1/">but you can catch part 1 here</a>.</p>
<p>Some might argue that no one ever wanted the image; it is more of a by-product, a means to an end. Imagery consumers want to know something about the image, perhaps a piece of data the image contains. That might be a discernible change or it might be the number of cars in a parking lot, but once that data is extracted the image becomes redundant.</p>
<p>Additionally, if you are collecting regular images, frequently downloading 200GB files becomes onerous, and storing them is a painful cost.</p>
<p>Imagery As A Service is becoming a key piece of every satellite provider's offering. We are well beyond the simple idea of downloading an image from an FTP site. <a href="https://www.digitalglobe.com" target="_blank">DigitalGlobe</a> have been building out their Geospatial Big Data platform, GBDX, for the last two years in an attempt to make the interrogation of their enormous back catalog more reasonable. <a href="http://www.astrodigital.com">Astro Digital</a> made their announcement recently about <a href="http://techcrunch.com/2016/02/24/astro-digital-releases-platform-for-anyone-to-analyze-satellite-imagery/%20" target="_blank">a new image API</a>; <a href="https://www.urthecast.com" target="_blank">Urthecast</a>&nbsp;and <a href="https://www.planet.com/" target="_blank">Planet Labs</a> also provide data almost exclusively via image services.</p>
<p>This recent move towards a platform approach to remote sensing products provides opportunity for diversification beyond the present addiction to government contracts. In fact it is clear that the consumer remote sensing product nut is far from cracked, by anyone.</p>
<h1>Industry Performance</h1>
<p>It is not just DigitalGlobe and Planet Labs in this sector. The commercial satellite space has a number of other very interesting companies emerging. One of those is Urthecast, the Canadian player in the game that went almost straight to the Toronto Stock Exchange.&nbsp;Other players include Astro Digital, <a href="http://www.blacksky.com/" target="_blank">Black Sky</a> and <a href="http://www.herasys.com/" target="_blank">Hera Systems</a>. These are just the private sector companies and don&rsquo;t include the numerous national satellite programs which are currently orbiting our planet (<a href="http://landsat.gsfc.nasa.gov/" target="_blank">Landsat</a>, <a href="http://www.esa.int/Our_Activities/Observing_the_Earth/Copernicus/Overview4" target="_blank">Sentinel</a>, <a href="http://modis.gsfc.nasa.gov/" target="_blank">MODIS</a>, <a href="http://www.asc-csa.gc.ca/eng/satellites/radarsat/" target="_blank">RADARSAT</a>...).</p>
<p>Satellite companies have not escaped the present market hostility, however. In fact, publicly traded stock prices of DigitalGlobe and Urthecast have declined steadily over the last six months. Obviously there are numerous external reasons to be cited for this decline. A <a href="http://www.fool.com/investing/general/2016/02/26/downgrades-3-reasons-digitalglobe-could-be-a-value.aspx" target="_blank">common thread amongst the analysts</a>&nbsp;is that a heavy reliance on government contracts for revenue is seen as a negative. Though, it&rsquo;s clear that DigitalGlobe is generating significant ($100s of millions) revenue from this sector.</p>
<p><img src="/static/media/uploads/dg_uc_perf.png" alt="" width="600" height="357" />&nbsp;</p>
<p>Indeed, DigitalGlobe's recent announcement of a joint venture with <a href="http://www.taqnia.com/2014/EN/indexen.html" target="_blank">TAQNIA</a> and <a href="http://www.kacst.edu.sa/en/Pages/default.aspx" target="_blank">KACST</a> suggests further consolidation around the defense / security markets. In review, this deal will provide DigitalGlobe with access to 50% of the image production over the Middle East and 100% of production over the rest of the world, from a constellation of 6 small satellites. Initially, it was suggested that this would give DigitalGlobe access to a potential (and unprecedented) 40 daily repeats of the same location if desired. This would imply a targeting, off-nadir capability in the small satellites, which is unusual given the typically small fuel / power storage capabilities for positioning in smaller units.</p>
<p>Apart from adding to the image catalog of DigitalGlobe&rsquo;s Geospatial Big Data efforts, this announcement is a clear message to small satellite companies that they will soon provide high repeat as well as high resolution. Beyond that, they intend to combine these two products in some manner to generate a greater combined value. Initially the announcement suggested that small sat constellations were of &ldquo;limited value&rdquo;. Those references were subsequently removed. Nevertheless this move is a clear shot at Planet Labs' ownership of the "high repeat market".</p>
<p>Urthecast on the other hand has tried hard to grab some consumer market share, <a href="https://blog.urthecast.com/updates/urthecast-and-pepsi-partner-to-create-an-epic-film-adventure-for-the-new-pepsi-challenge/" target="_blank">having worked on the latest Pepsi challenges</a>. Within the last year (2015)&nbsp;<a href="https://blog.urthecast.com/updates/urthecast-to-acquire-deimos-satellites-and-earth-imaging-operations/" target="_blank">Urthecast has also bought the Spanish company Deimos Imaging</a>, providing them with significant imaging capacity, a deal named by Euroconsult as <a href="https://blog.urthecast.com/updates/urthecasts-acquisition-of-deimos-named-strategic-transaction-of-the-year/" target="_blank">the strategic acquisition of the year</a>. Seemingly, however, they are unable to delink their stock price from the price of oil, a common feature of Canadian stocks. This has driven their valuations lower than one would think reasonable for a company with so much development and engineering activity. Indeed, Urthecast have announced the development of a constellation of 8&ndash;16 imagery / SAR satellites, providing an interesting and somewhat differentiated sensor mix; SAR is complex to interpret, but has the compelling ability to see through cloud.</p>
<p>Industry performance of the other, as yet private, players is less clear to determine. Interestingly it is thought that a key part of the <a href="http://blackbridge.com/planet/" target="_blank">Planet Labs acquisition of BlackBridge</a> was their &ldquo;purchasing&rdquo; access to immediate revenue as well as an active constellation with back catalog. In one acquisition, Planet&rsquo;s business challenge switched from data acquisition to platform development. Creating a revenue channel in the face of a more difficult investment climate in effect deepens their war chest and lengthens their runway to achieve their revenue targets.</p>
<p>It should be noted that with rumored launches Planet is due to achieve their stated mission of a daily repeat of everywhere on Earth this year (2016). This will be a product we have never seen before. High repeat satellite remote sensing products will be completely dependent on the development of new delivery mechanisms and image services. The platform becomes a key part of the product.</p>
<p>Planet, Digital Globe and Urthecast aren&rsquo;t the only companies with planned launches. Astro Digital, Black Sky &amp; Hera Systems are also looking at launches in 2016 &ndash; 2019.</p>
<p>The skies are getting busy.</p>
<h1>The Game Is Afoot</h1>
<p>There is little doubt that the real winner in this space race will be the consumer of image data. With this much image data soon to be available commercially, how will we the consumers be looking at the Earth?</p>
<p>There is little doubt that the Holy Grail for the image acquisition companies is the development of compelling consumer tools. The lack of a &ldquo;killer app&rdquo; for consumer remote sensing is the biggest risk to every satellite company. It is likely that the first company to truly solve this product / business problem will suddenly develop a significant edge.</p>
<p>It is clear that increased satellite remote sensing competition has fuelled the industry&rsquo;s growth. We as the ever-hungry consumers of data will only benefit from this situation.</p>
<p>What is less clear is how the companies delivering an ever-increasing excellence of image product and regularity of image capture will take those products to the consumer. That will surely indicate a step change in the adoption of remote sensing technology.</p>
<p>&nbsp;</p>The great space race pt.12016-03-29T19:35:07+00:00Will Cadell/blog/author/will/https://www.sparkgeo.com/blog/the-great-space-race-pt1/<h1>What just happened to remote sensing?</h1>
<p><a href="https://terrabella.google.com/" target="_blank">TerraBella</a> just arrived. Google&rsquo;s rebranded (previously Skybox) remote sensing constellation is still in very active development with planned launches over the next few years. Their focus is on platform and information delivery rather than delivery of imagery. That&rsquo;s amazing, not because of the enormity of the problem, but because they are only one of a growing number of companies proposing to do much the same thing.</p>
<p>Last week (24th Feb 2016), we had 2 announcements from <a href="https://digitalglobe.com" target="_blank">DigitalGlobe</a> and one from <a href="https://astrodigital.com" target="_blank">Astro Digital</a>. We've&nbsp;been hearing&nbsp;about <a href="http://investor.digitalglobe.com/phoenix.zhtml?c=70788&amp;p=RssLanding&amp;cat=news&amp;id=2141289" target="_blank">new constellations of small satellites</a>, <a href="http://www.digitalglobeblog.com/2016/02/22/helping-facebook-connect-the-world-with-deep-learning/" target="_blank">partnerships with Facebook</a>, and <a href="https://medium.com/@astrodigital/creating-fast-map-search-with-vector-tiles-c31e3b58b50f#.j50vu6wiw" target="_blank">new image analysis APIs</a>. What&rsquo;s more, this week's announcements are just the latest in a litany of happenings in the satellite remote sensing industry over the past few years.</p>
<p>It is safe to say that until recently the satellite remote sensing industry had been somewhat beaten up.&nbsp;The idea of&nbsp;<em>remote sensing</em>&nbsp;had always promised so much, but classification accuracy, expense and access to timely imagery always ended up letting down the consumer. In essence the use of remote sensing imagery was (and still is, frankly) really hard. The dream of being able to look at anything at any time from space was proving to be only ever a dream, or indeed a dystopian nightmare, depending on your point of view.</p>
<p>At that time remote sensing had proven to be a retrospective, academic activity. It could only ever indicate changes, which had typically&nbsp;happened some time ago. Interpretation and classification were largely human driven tasks. Examples of remote sensing truly making a decisive difference in any private sector application (beyond cartographic map production) were few and far between.</p>
<p>What has happened in the last 10 years then, to drive the recent explosion in activity in satellite remote sensing? Though it may seem like an explosion, nothing ever happens that fast&nbsp;and&nbsp;it wasn't just one thing.&nbsp;</p>
<h1>The Product</h1>
<p>Simply put there are three features of remotely sensed imagery:</p>
<h4>Spectral Resolution.</h4>
<p>Spectral resolution really describes what wavelengths the satellite is capable of receiving. Different wavelengths are good for looking at different things and are therefore useful for different purposes. To confuse matters here, there is a breed of sensors called <a href="https://en.wikipedia.org/wiki/Synthetic_aperture_radar" target="_blank">SAR</a>, an active, RADAR based technology. An <em>active</em> sensor is one which emits its own signal and subsequently interprets the returned signal rather than <em>passive</em> optical sensors which in essence detect light from the sun that has bounced off the Earth. A key advantage of the SAR technology is an ability to see through clouds.</p>
<h4>Spatial Resolution.</h4>
<p>This determines the size of the things you can see from space. This is usually what is meant when &ldquo;resolution&rdquo; or &ldquo;pixel size&rdquo; are talked about. DigitalGlobe has the market-leading platform presently, the WorldView series, which creates simply beautiful 30cm (approx. 1 foot) resolution images.&nbsp; At 30cm you can easily see cars and often discern people, though not necessarily what they are doing.&nbsp;</p>
<h4>Temporal Resolution.</h4>
<p>This really describes how quickly an image of a particular place can be captured multiple times. This is often referred to as &ldquo;repeat time&rdquo;. It is worth noting here that repeat can be gained by having lots of satellites pointing straight down, or by having fewer satellites each supporting a targeting capability i.e. they can point their camera.</p>
<p>There are really two facets to temporal resolution. Firstly, the demand might be for an image now, in the event of a natural disaster for instance. Typically in this scenario the targeting capability of a higher spatial resolution satellite is likely attractive. Secondly, there might be demand for regularly updating imagery in the event of a monitoring project. In terms of monitoring constellations, the idea of "revisit" describes a constellation's ability to capture exactly the same point on Earth over and over again. At this point you might consider that the key product is "change", not pixels.</p>
<p>In the end, the update cycle of the user's application determines the need for different temporal resolutions: do I need a new image every day, or every year?</p>
<p>The spectral mix really determines the <em>product</em>; that&rsquo;s the &ldquo;what&rdquo; of imagery. The spatial resolution determines the quality of the product, and the repeat time is a multiplier.</p>
<p>From these three characteristics a number of product categories can be identified.&nbsp;</p>
<div align="center">
<table border="1" frame="void" cellspacing="0.5" cellpadding="0.5" align="left">
<tbody>
<tr>
<td style="width: 115px;" valign="top">
<p><strong><em>Look at a thing</em></strong><strong><em>&hellip;</em></strong></p>
</td>
<td valign="top">
<p><strong><em>Choose a sensor that can see your thing&hellip;</em></strong></p>
</td>
<td valign="top">
<p><strong><em>Differentiators</em></strong></p>
</td>
</tr>
<tr>
<td style="width: 115px;" valign="top">
<p><strong>ONCE</strong></p>
</td>
<td valign="top">
<p>&nbsp;&hellip; and get an image you can afford.</p>
</td>
<td valign="top">
<p>Spectral, Cost, Spatial</p>
</td>
</tr>
<tr>
<td style="width: 115px;" valign="top">
<p><strong>NOW</strong></p>
</td>
<td valign="top">
<p>&nbsp;&hellip; and get the latest image, perhaps tasking a particular platform if you can afford it.</p>
</td>
<td valign="top">
<p>Spectral, Spatial, Temporal</p>
</td>
</tr>
<tr>
<td style="width: 115px;" valign="top">
<p><strong>LOTS</strong></p>
</td>
<td valign="top">
<p>&nbsp;&hellip;&nbsp; and get images over and over.</p>
</td>
<td valign="top">
<p>Spectral, Temporal</p>
</td>
</tr>
</tbody>
</table>
</div>
<p>&nbsp;</p>
<p>By tweaking these three categories, the commercial satellite imagery companies are creating new products. The central idea of the <em>remote sensing product</em> is in fact changing quickly. Consider: is the product of remote sensing an image, or is it quantifiable information about our Earth?</p>
<p>The definition of the product, and how that product can be attractive to a wider paying clientele will be where future remote sensing battles are fought.</p>
<h1>Who is Buying?</h1>
<p>Unsurprisingly, the prime consumers of remotely sensed data are governments. Whether the application is military or environmental protection, organizations with a mandate to observe or monitor features on the ground are keen to access timely imagery. Academia has a great interest in studying the ground but typically is less willing to invest large sums of money in a better understanding of the Earth. That said, there is a strong scientific bias in most major satellite providers bringing down the costs to quite a manageable level for research. Bringing down the costs, however, doesn&rsquo;t help the satellite company&rsquo;s bottom line.&nbsp;</p>
<p>Increased access to compute (via <a href="https://aws.amazon.com/" target="_blank">Amazon Web Services</a>, or <a href="https://cloud.google.com/compute/" target="_blank">Google Cloud</a> etc) is making a difference in our understanding of the <em>remote sensing product</em>. Companies like <a href="http://orbitalinsight.com/" target="_blank">Orbital Insight</a> and <a href="https://spaceknow.com/" target="_blank">SpaceKnow</a> are now able to leverage large amounts of high-resolution data to start answering hard questions.</p>
<p><strong>Questions like:</strong></p>
<p><em>&gt; How are my retail competitors doing?</em></p>
<p>By counting cars in parking lots, it&rsquo;s possible to project financial performance.</p>
<p><em>&gt; How much crude does China have?</em></p>
<p>By looking at the angle and length of shadows from crude storage facilities across China, it is possible to estimate reserves.</p>
<p><em>&gt; How much forest is being harvested in Chile?</em></p>
<p>By looking at change in land cover in high production forest area, it is possible to determine how productive a particular forest operation is.</p>
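<p>The crude-storage question above rests on simple trigonometry: the floating roof of a storage tank sits lower as the tank empties, so the shadow cast onto the roof by the tank wall grows. A minimal sketch of the idea, with all function names and figures purely illustrative:</p>

```javascript
// Illustrative sketch only: estimate how full a floating-roof tank is
// from the shadow its wall casts on the roof. Given the sun's elevation
// angle, the shadow length tells us how far below the rim the roof sits.
function roofDepthMetres(shadowLengthM, sunElevationDeg) {
  // depth = shadow length * tan(sun elevation)
  return shadowLengthM * Math.tan((sunElevationDeg * Math.PI) / 180);
}

function fillFraction(shadowLengthM, sunElevationDeg, tankHeightM) {
  var depth = roofDepthMetres(shadowLengthM, sunElevationDeg);
  // Clamp to [0, 1]: a roof 5 m down a 20 m tank means the tank is 75% full
  return Math.max(0, Math.min(1, 1 - depth / tankHeightM));
}

// e.g. a 5 m interior shadow under a 45-degree sun on a 20 m tank
console.log(fillFraction(5, 45, 20)); // → 0.75 (roof is 5 m down)
```

<p>Repeat the measurement across every tank farm in a country and the per-tank estimate aggregates into a national reserves figure, which is the compelling part.</p>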
<p>In microcosm these questions are interesting remote sensing use cases, but when applied to entire Nations they become hugely compelling and indeed valuable commodities. Governments are interested, but the private sector is also paying.</p>
<h1>The Philosophy&nbsp;</h1>
<p>Satellite production has traditionally been philosophically close to aerospace manufacture. In essence this is a reasonable reaction to the huge capital investment in building large, precision spacecraft. Great care is taken in all aspects of engineering the unit, from the design of the optics, to the material selection, to the software development. Every phase is typically designed, tested and retested, then tested again, pre-launch. When launching a billion dollar device on the top of a rocket, it seems reasonable to do one&rsquo;s engineering due diligence.</p>
<p>However, this philosophy does have the distinct disadvantage of vastly increasing the barrier to market entry. Frankly there are not many investors interested in taking that kind of risk and not too many companies those investors would trust with this level of execution. Thus it takes time to launch a new satellite, let alone a whole constellation.&nbsp;</p>
<p>DigitalGlobe's 4 year satellite production schedule:&nbsp;</p>
<p><a href="http://investor.digitalglobe.com/phoenix.zhtml?c=70788&amp;p=irol-newsArticle&amp;ID=1464706" target="_blank">WorldView-3 announcement</a>: 08/30/2010&nbsp;<strong>&gt;&gt;</strong>&nbsp;<a href="http://investor.digitalglobe.com/phoenix.zhtml?c=70788&amp;p=irol-newsArticle&amp;ID=1958312" target="_blank">WorldView-3 launch</a>: 08/13/2014</p>
<p>Enter <a href="https://planet.com" target="_blank">Planet Labs</a>. The Planet team have been incredibly successful in disrupting this aerospace model. By changing the rules associated with commercial satellite remote sensing they have been able to turn a 4-year development cycle for new satellites into a 3-month cycle. Those rule changes include adopting the cubesat standard (see below) and thinking about the satellites as consumables: lower-orbit vehicles which can be launched from the International Space Station but will have a shorter lifespan. This means they can be very agile and aggressive in the addition of new technologies inside their (tiny) satellites. Planet Labs was founded in 2010 and <a href="https://www.planet.com/press/" target="_blank">has launched 113 satellites to date</a>.</p>
<p>Now, let&rsquo;s not ignore that there are some basic optical challenges in the development of such a small optical remote sensing platform, reaching reasonable spatial resolution with such a restricted aperture and focal length being just two of them. But what they lack in platform size, they can make up for in iteration cycles and the number of satellites in their constellation. With each iteration cycle and launch they are adding more experimental units. Planet is taking the Lean methodology to new heights. They feel that they can reach their stated mission of global daily repeats of anywhere on Earth this year, 2016.</p>
<h1>Small but perfectly formed?</h1>
<p>The&nbsp;<a href="https://en.wikipedia.org/wiki/CubeSat">cubesat</a>&nbsp;started life in 1999 under the careful watch of Cal Poly &amp; Stanford. The basic idea is that a satellite of this form is made of "a number of" 10 x 10 x 11.35 cms cubic units. With a common form launching becomes less onerous. Now, not all modern satellites are cubesats, far from it, but this form factor has popularized the idea of small satellites. Indeed one of the key features of the modern space sector is a much easier launch schedule, given the greater number of private and public sector space delivery companies. Organizations like <a href="http://www.spacex.com/" target="_blank">SpaceX</a> who are actively building reusable rockets are significantly reducing the barriers to launching satellites increasing the ability of companies like Planet Labs to launch lots of little satellites.</p>
<h1>The Rise of Consumer Geospatial</h1>
<p>In concert with the development of the small sat phenomenon there has been a step change in the geospatial business. This <strong>was</strong> an industry where GIS (Geographic Information Systems) people would sit in basements drawing geographic features for subsequent analysis or mapping.</p>
<p>Then came the smart phone.</p>
<p>Quite suddenly, the geospatial industry has grown into a significant force in the web sector. Understanding where things are and how to get to them is a key value proposition for numerous web and native apps. The basement-dwelling GIS person has been pulled, squinting, into the daylight, and as a result we are seeing an explosion in geospatial capacity on the web. We have companies like <a href="https://mapbox.com">Mapbox</a> and <a href="https://cartodb.com" target="_blank">CartoDB</a> focusing on geospatial web services, forcing the older industry players such as <a href="https://esri.com">Esri</a> to raise their web game.</p>
<p>This change is having a cascading effect on the consumer satellite providers. It turns out that people don&rsquo;t want to buy images anymore.&nbsp;</p>
<p><strong><a href="/blog/the-great-space-race-pt2/">Continue to pt. 2 in this series </a>to find out why new satellite imagery products are not even images anymore<a href="/blog/the-great-space-race-pt2/">.</a></strong></p>PostGIS, CartoDB Bounding Box Trick2015-06-09T03:25:05+00:00Will Cadell/blog/author/will/https://www.sparkgeo.com/blog/postgis-cartodb-bounding-box-trick/<p>When pulling out <a href="http://www.cartodb.com" target="_blank">CartoDB</a> or <a href="http://postgis.org/" target="_blank">PostGIS</a> data into a web mapping environment like <a href="http://leafletjs.com/" target="_blank">LeaftletJS</a> or <a href="https://developers.google.com/maps/documentation/javascript/" target="_blank">Google Maps</a>, it is handy to know the bounding box of the data in question. The bounding box is a convenient way to ensure your data is fully visible to the user, no matter what platform they are using or how responsive the viewport is.</p>
<p>A quick way to extract the bounding box from a PostGIS data layer is:</p>
<pre>SELECT st_astext(st_envelope(st_union(the_geom))) FROM &lt;&lt; table &gt;&gt;</pre>
<p>As noted in <a href="/blog/update-with-cartodb-via-postgis/" target="_blank">this blog post</a>, it is possible to do lots of pure PostGISy things directly in the CartoDB UI; this trick above is one of them. Just add the above snippet into the 'SQL' tab, and you will be presented with the same result that pgAdmin III or psql (the more typical PostGIS interfaces) would provide: a string which looks like:</p>
<pre>[[-122.899495685641,53.9576281860941],[-122.899495685641,53.9727081860345],[-122.859095858743,53.9727081860345],[-122.859095858743,53.9576281860941],[-122.899495685641,53.9576281860941]]</pre>
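<p>Strictly speaking, <code>ST_AsText</code> returns WKT of the form <code>POLYGON((x1 y1, x2 y2, ...))</code>; the nested-array form above is that WKT reshaped for JavaScript. A quick sketch of the conversion (the helper name is my own):</p>

```javascript
// Hypothetical helper: convert a WKT POLYGON envelope string (the raw
// ST_AsText output) into the [[x, y], ...] nested-array form shown above.
function wktEnvelopeToArray(wkt) {
  // Grab the coordinate list between the innermost parentheses
  var coords = wkt.match(/\(\(([^)]+)\)\)/)[1];
  return coords.split(",").map(function (pair) {
    var parts = pair.trim().split(/\s+/);
    return [parseFloat(parts[0]), parseFloat(parts[1])];
  });
}

console.log(wktEnvelopeToArray(
  "POLYGON((-122.9 53.95,-122.9 53.97,-122.86 53.97,-122.86 53.95,-122.9 53.95))"
));
// → [[-122.9,53.95],[-122.9,53.97],[-122.86,53.97],[-122.86,53.95],[-122.9,53.95]]
```

<p>Note that this preserves the [x, y] ordering of the WKT, which is exactly what trips people up when feeding it to lat/lng-first APIs.</p>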
<p>Now, note that this coordinate string is in the form:</p>
<pre>[[x1,y1],[x2,y2][x3,y3],[x4,y4],[x1,y1]]</pre>
<p>You will see that the first coordinate is repeated; this closes the bounding box. Also, the individual vertex coordinates are in the form of [x,y] NOT [lat,long].</p>
<p>You could do the mind-bending trick of figuring out the appropriate coordinate order to determine a Lat-Long bounding box. Or you could let the computer figure it out. Below is a quick way to convert the above string into a Leaflet or Google Maps bounding box.</p>
<h4>LeafletJS</h4>
<pre>var envelope = [[-122.899495685641,53.9576281860941],[-122.899495685641,53.9727081860345],[-122.859095858743,53.9727081860345],[-122.859095858743,53.9576281860941],[-122.899495685641,53.9576281860941]];<br /> <br />var dataBounds = L.latLngBounds([L.latLng(envelope[0][1], envelope[0][0])]);<br />for (var i = 1; i &lt; envelope.length; i++) {<br />&nbsp; dataBounds.extend(L.latLng(envelope[i][1], envelope[i][0]));<br />}<br /><br />map.fitBounds(dataBounds);</pre>
<h4>Google Maps&nbsp;&nbsp;</h4>
<pre>var envelope = [[-122.899495685641,53.9576281860941],[-122.899495685641,53.9727081860345],[-122.859095858743,53.9727081860345],[-122.859095858743,53.9576281860941],[-122.899495685641,53.9576281860941]];<br /> <br />var dataBounds = new google.maps.LatLngBounds({lat: envelope[0][1], lng: envelope[0][0]});<br />for (var i = 1; i &lt; envelope.length; i++) {<br />&nbsp; dataBounds.extend({lat: envelope[i][1], lng: envelope[i][0]});<br />}<br /><br />map.fitBounds(dataBounds);</pre>
<p>Isn't it interesting how similar these code snippets are! Now, you could pump the dataBounds object out to a console.log() and then copy the calculated result back into your map code...</p>
<p><img style="display: block; margin-left: auto; margin-right: auto;" src="/static/media/uploads/bounds.png" alt="" width="500" height="172" /></p>
<p>... alternatively, you could leave the calculation in place. It will then run every time the map renders, which is not optimal, but it is convenient.</p>UPDATE With CartoDB Via PostGIS2015-06-08T16:08:47+00:00Will Cadell/blog/author/will/https://www.sparkgeo.com/blog/update-with-cartodb-via-postgis/<p>I had made the assumption that <a href="http://www.CartoDB.com" target="_blank">CartoDB</a> was all about data delivery. Much like the deprecated&nbsp;<a href="https://developers.google.com/maps-engine/" target="_blank">Google Maps Engine</a>, I had assumed CartoDB's value was really in fast data rendering and distribution.</p>
<p>But, there is more.</p>
<p>There is more, primarily because CartoDB uses PostGIS. This means you can use much of PostGIS's analysis capability from the CartoDB user interface. Here is an example: In a <a href="/blog/a-trail-map-for-summer-and-winter/" target="_blank">trail map we recently built</a>, I wanted to be able to indicate that some trails were one-way. We decided to do this using an arrow. Here is the CartoCSS to show the line's directionality:</p>
<pre>#otway_singletrack_clean_1::blaze {<br />&nbsp; line-width: 12;<br />&nbsp; line-color:#F4EFE0;<br />&nbsp; line-opacity: 0.9;<br />&nbsp; line-join: round;<br />&nbsp; line-cap: round;<br />}<br />#otway_singletrack_clean_1{<br />&nbsp; [difficulty="black"] {<br />&nbsp; &nbsp; [one_way = true]{<br />&nbsp; &nbsp; &nbsp; marker-line-width: 0;<br />&nbsp; &nbsp; &nbsp; marker-opacity: 0.9;<br />&nbsp; &nbsp; &nbsp; marker-type:arrow; <br />&nbsp; &nbsp; &nbsp; marker-placement:line;<br />&nbsp; &nbsp; &nbsp; marker-line-color: #000000;<br />&nbsp; &nbsp; &nbsp; marker-fill: #000000;<br />&nbsp; &nbsp; }<br />&nbsp; &nbsp; line-width: 2.5;<br />&nbsp; &nbsp; line-opacity: 0.9;<br />&nbsp; &nbsp; line-color: #000000;<br />&nbsp; }<br />&nbsp; [difficulty="blue"] {<br />&nbsp; &nbsp; [one_way = true]{<br />&nbsp; &nbsp; &nbsp; marker-line-width: 0;<br />&nbsp; &nbsp; &nbsp; marker-opacity: 0.9;<br />&nbsp; &nbsp; &nbsp; marker-type:arrow; <br />&nbsp; &nbsp; &nbsp; marker-placement:line;<br />&nbsp; &nbsp; &nbsp; marker-line-color: #1f78b4;<br />&nbsp; &nbsp; &nbsp; marker-fill: #1F78B4;<br />&nbsp; &nbsp; }<br />&nbsp; &nbsp; line-width: 2.5;<br />&nbsp; &nbsp; line-opacity: 0.9;<br />&nbsp; &nbsp; line-color: #1F78B4;<br />&nbsp; }<br />&nbsp; [difficulty="double black"] {<br />&nbsp; &nbsp; [one_way = true]{<br />&nbsp; &nbsp; &nbsp; marker-line-width: 0;<br />&nbsp; &nbsp; &nbsp; marker-opacity: 0.9;<br />&nbsp; &nbsp; &nbsp; marker-type:arrow; <br />&nbsp; &nbsp; &nbsp; marker-placement:line;<br />&nbsp; &nbsp; &nbsp; marker-line-color: #000000;<br />&nbsp; &nbsp; &nbsp; marker-fill: #000000;<br />&nbsp; &nbsp; }<br />&nbsp; &nbsp; line-width: 2.5;<br />&nbsp; &nbsp; line-opacity: 0.9;<br />&nbsp; &nbsp; line-color: #000000;<br />&nbsp; &nbsp; line-dasharray: 10, 4;<br />&nbsp; }<br />&nbsp; [difficulty="green"] {<br />&nbsp; &nbsp; [one_way = true]{<br />&nbsp; &nbsp; &nbsp; marker-line-width: 0;<br />&nbsp; &nbsp; &nbsp; marker-opacity: 0.9;<br />&nbsp; 
&nbsp; &nbsp; marker-type:arrow; <br />&nbsp; &nbsp; &nbsp; marker-placement:line;<br />&nbsp; &nbsp; &nbsp; marker-line-color: #33a02c;<br />&nbsp; &nbsp; &nbsp; marker-fill: #33a02c;<br />&nbsp; &nbsp; }<br />&nbsp; &nbsp; line-width: 2.5;<br />&nbsp; &nbsp; line-opacity: 0.9;<br />&nbsp; &nbsp; line-color: #33A02C;<br />&nbsp; }<br />&nbsp; ::labels {<br />&nbsp; &nbsp; text-name: [name];<br />&nbsp; &nbsp; text-face-name: 'Open Sans Bold';<br />&nbsp; &nbsp; text-size: 10;<br />&nbsp; &nbsp; text-label-position-tolerance: 10;<br />&nbsp; &nbsp; text-fill: #575757;<br />&nbsp; &nbsp; text-halo-fill: #FFF;<br />&nbsp; &nbsp; text-halo-radius: 1;<br />&nbsp; &nbsp; text-dy: 0;<br />&nbsp; &nbsp; text-allow-overlap: true;<br />&nbsp; &nbsp; text-placement: line;<br />&nbsp; &nbsp; text-placement-type: simple;<br />&nbsp; }<br />}</pre>
<p>This relies on the order of the vertices of each line, meaning that if a trail was digitised in the wrong direction, the arrows will also point the wrong way, like this:</p>
<p><img src="/static/media/uploads/gnarly_girls_wrong.png" alt="" width="503" height="480" /></p>
<p>Oh dear, that's no good! To resolve this issue I was able to run a simple command to change the direction of the line in question. It looks like this:</p>
<pre class="p1">UPDATE &lt;&lt;table&gt;&gt; SET the_geom = ST_Reverse(the_geom) WHERE cartodb_id IN (2, 6, 23);</pre>
<p>Let me explain this. The &lt;&lt;table&gt;&gt; placeholder is the table of the data layer in question. We are updating the geometry column of the data layer (the_geom) to reorder the vertices of each line, using a function called ST_Reverse(). Finally, we provide a list of IDs for the lines that point the "wrong way".</p>
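<p>If you want to see what ST_Reverse will do before touching the database, the same vertex reversal can be sketched client-side on a GeoJSON LineString. A minimal sketch: <code>reverseLineString</code> is a hypothetical helper, not a CartoDB or PostGIS API.</p>

```javascript
// Client-side sketch of what ST_Reverse does in the database: return a copy
// of a GeoJSON LineString with its vertex order reversed.
// reverseLineString is a hypothetical helper name.
function reverseLineString(line) {
  return {
    type: "LineString",
    // slice() copies the array so the original geometry is left untouched
    coordinates: line.coordinates.slice().reverse()
  };
}

var trail = {
  type: "LineString",
  coordinates: [[-122.89, 53.95], [-122.88, 53.96], [-122.87, 53.97]]
};

var flipped = reverseLineString(trail);
console.log(flipped.coordinates[0]); // [-122.87, 53.97]
```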
<p>The end result is:</p>
<p><img src="/static/media/uploads/gnarly_girls_right.png" alt="" width="505" height="490" /></p>
<p>Yay! We did it! CartoDB can be used for pretty serious data manipulation as well as distribution. &nbsp;</p>A Multi Season Trail Map With Mapbox &amp; CartoDB 2015-06-04T05:19:37+00:00Will Cadell/blog/author/will/https://www.sparkgeo.com/blog/a-trail-map-for-summer-and-winter/<p>A trail map is seemingly so simple. But, to create a really 'good' trail map we need to consider a few key things:</p>
<p>1) Seasonality - your map needs to tell the story of the adventure at hand. For instance, a skiing or snowshoeing trail should feel cold and wintery. In this case we needed two seasons to be described: summer for the singletrack and winter for the nordic and snowshoeing trails. We wanted to keep these seasons feeling different, whilst not feeling like they are different places. Just like the real seasons. We describe a great deal more about this process <a href="/blog/a-base-map-for-all-seasons/" target="_blank">in a recent blog post about Mapbox base maps</a>.</p>
<p>2) Mobile - a trail map must be very mobile friendly; it will likely be used as a tool outside rather than simply as a strategic planning tool. Consider features like geolocation, and consider keeping the UI free of clutter.</p>
<p>3) Context - which features will really aid the user in their adventure? What can actually be seen given the season and the activity? What places and place names will support the adventurer in their navigation, and which are unnecessary?</p>
<p><iframe src="http://snippets.sparkgeo.com/otway" frameborder="none" width="100%" height="500"></iframe></p>
<p>We have tried hard to focus on these ideas in the above map. In building this map we used <a href="https://www.mapbox.com/">Mapbox</a> managed basemaps, <a href="https://cartodb.com/">CartoDB</a>&nbsp;to manage our trail data, and we bound these into a hand rolled <a href="http://leafletjs.com">LeafletJS</a> map. <a href="http://snippets.sparkgeo.com/otway" target="_blank">Check out the full version here</a>.</p>
<p>If you have some more thoughts on trail maps, don't hesitate to comment below!&nbsp;</p>Styling Lines With CartoCSS 2015-06-03T18:32:44+00:00Will Cadell/blog/author/will/https://www.sparkgeo.com/blog/styling-lines-with-cartocss/<p>A short example of CartoCSS class-based line rendering.</p>
<p><a href="http://wiki.openstreetmap.org/wiki/CartoCSS" target="_blank">CartoCSS</a> is everywhere, it seems. Well, it's in <a href="https://www.mapbox.com/mapbox-studio/" target="_blank">Mapbox Studio</a> and <a href="http://www.cartodb.com" target="_blank">CartoDB</a>,&nbsp;as well as whatever <a href="http://mapnik.org" target="_blank">hand rolled option you choose</a>. (However, it is not <a href="http://boundlessgeo.com/2012/11/geoserver-css-module-style-in-style/" target="_blank">GeoServer CSS</a>, though it looks somewhat similar.) I used CartoCSS to render the image below. It's a nordic ski map and has a number of line style features worth noting.</p>
<p>1) The trails are categorised green - blue - black to indicate difficulty.<br />2) The trails are labelled.<br />3) Some trails have an outline to indicate they are floodlit.</p>
<p><img src="/static/media/uploads/cartography_1.png" alt="" width="500" height="285" /></p>
<p>To achieve this effect we need to consider the line data. The data contains a "name", a "difficulty" classification, and a "floodlit" boolean. From this we can define the CartoCSS description below. Note that the floodlit ::case must come first, so it is drawn first; if we leave it until the end it will obscure, or decolorize, the thinner trail line by being rendered on top. Additionally, the floodlit style is almost transparent, to give the effect of lighting without changing the core <em>informational</em> content of the data layer.</p>
<pre>#otway_ski_trails {<br /> line-width: 2.5;<br /> line-opacity: 0.7;<br /> <br /> [floodlit=true]{<br /> ::case {<br /> line-width: 12;<br /> line-color:#FFE403;<br /> line-opacity: 0.2;<br /> }<br /> }<br /> [difficulty="black"] {<br /> line-color: #000000;<br /> }<br /> [difficulty="blue"] {<br /> line-color: #1F78B4;<br /> }<br /> [difficulty="double black"] {<br /> line-color: #000000;<br /> line-dasharray: 10, 4;<br /> }<br /> [difficulty="green"] {<br /> line-color: #33A02C;<br /> }<br /> <br /> ::labels {<br /> text-name: [name];<br /> text-face-name: 'Open Sans Bold';<br /> text-size: 10;<br /> text-label-position-tolerance: 10;<br /> text-fill: #575757;<br /> text-halo-fill: #FFF;<br /> text-halo-radius: 1;<br /> text-dy: 0;<br /> text-allow-overlap: true;<br /> text-placement: line;<br /> text-placement-type: simple;<br /> }<br />}</pre>
<p>The full effect of this style can be seen on our <a href="http://snippets.sparkgeo.com/otway" target="_blank">Otway Nordic Ski Center trail map</a>. This map was built using <a href="/blog/a-base-map-for-all-seasons/" target="_blank">Mapbox base maps</a>, <a href="http://www.cartodb.com">CartoDB</a> line work, all brought together on a&nbsp;<a href="http://leafletjs.com/" target="_blank">LeafletJS map</a>.</p>CartoDB Certified Partner 2015-06-03T17:47:51+00:00Will Cadell/blog/author/will/https://www.sparkgeo.com/blog/cartodb-certified-partner/<p>Today, <a href="/" target="_blank">Sparkgeo&nbsp;</a>became a <a href="http://www.cartodb.com">CartoDB</a> certified partner.</p>
<p><img title="CartoDB Logo" src="/static/media/uploads/cartodb.png" alt="" width="500" height="498" /></p>
<p>As a celebration we built an <a href="http://snippets.sparkgeo.com/otway" target="_blank">awesome CartoDB-powered trail map</a>! Yay!</p>A Base Map For All Seasons - CartoCSS &amp; Mapbox Studio 2015-06-03T04:05:55+00:00Will Cadell/blog/author/will/https://www.sparkgeo.com/blog/a-base-map-for-all-seasons/<p><em>A short post looking at tweaking base maps for seasonality using <a href="http://mapbox.com">Mapbox Studio</a>&nbsp;&amp; <a href="https://www.mapbox.com/tilemill/docs/manual/carto/">CartoCSS</a>.</em></p>
<p>We've been duped in our web mapping bubble. It's nice to think of our maps as simply a canvas for the data we want to visualise, but in reality the mapping platform, and indeed the base map itself, are absolutely critical in setting the tone of a map. The talented team at Mapbox have been <a href="https://www.mapbox.com/blog/alltrails-trail-design/">yelling</a> <a href="https://www.mapbox.com/blog/winter-wonderland/">this</a> <a href="https://www.mapbox.com/blog/ski-maps-update/">message</a> <a href="https://www.mapbox.com/blog/terrain-mapbox-studio/">from</a> <a href="https://www.mapbox.com/blog/eight-new-terrain-baselayer-maps-available-mapbox/">the</a> <a href="https://www.mapbox.com/blog/golf-course-maps/">rafters</a>. Over the last few days I have had the opportunity to drink some of their koolaid.</p>
<p>Folks, I'll be going back for seconds.&nbsp;</p>
<p>My project was simple: a trail map for our local nordic ski centre. In essence, a map by which we would be able to navigate the ever growing trail network. <a href="http://caledonianordic.com/">The Caledonia Nordic Ski Centre</a> provides excellent singletrack trails in the summer and world-class nordic ski trails in the winter. I was looking for a base map to satisfy both of these activities. I knew that Mapbox had <a href="https://www.mapbox.com/blog/mapbox-outdoors/">blogged recently about their outdoor tiles</a>, so I was curious.</p>
<h3>Mapbox Outdoors</h3>
<p><iframe src="https://a.tiles.mapbox.com/v4/willcadell.e9373139.html?access_token=pk.eyJ1Ijoid2lsbGNhZGVsbCIsImEiOiJKbjZwckU0In0.ET9f2IdpUPpsmZsOc_0T-w#12/53.966/-122.88" frameborder="0" width="100%" height="300px"></iframe></p>
<p>There is no doubt these are very pretty, and would be very suitable for the summer singletrack, but they really don't have the visual connection to the ski trails. A skier knows that the primary linkage with their environment is, well, snow. These outdoor tiles are altogether too "brown"; I needed something whiter. A brief look around brought me to Mapbox's Winter Wonderland tiles. These tiles are as wonderful as promised.&nbsp;</p>
<h3>Mapbox Winter Wonderland</h3>
<p><iframe src="https://a.tiles.mapbox.com/v4/willcadell.43a18a09.html?access_token=pk.eyJ1Ijoid2lsbGNhZGVsbCIsImEiOiJKbjZwckU0In0.ET9f2IdpUPpsmZsOc_0T-w#12/53.966/-122.88" frameborder="0" width="100%" height="300px"></iframe></p>
<p>Though both of these base maps are attractive separately, they don't flow so well with each other. In this case it's mainly about the trees. Around Otway's ski trails there are a great deal of trees. It's Canada, of course there are trees! The winter wonderland tiles are well endowed with trees, so our task was simply to copy the tree representation of the <a href="https://www.openstreetmap.org/">Open Street Map</a> landcover "wood" class from the winter wonderland tiles to the outdoors tiles.</p>
<p>In essence this means editing the outdoor tiles' CartoCSS in <a href="https://www.mapbox.com/mapbox-studio/" target="_blank">Mapbox Studio</a> (and adding the images, wood_12.png &amp; wood_18.png):</p>
<pre>#landcover {<br /> [class='wood'] {<br /> [zoom&gt;=4] { polygon-fill:lighten(@wood, 5); }<br /> [zoom&gt;=8] {<br /> polygon-fill: lighten(@wood, 5);<br /> polygon-pattern-file: url(img/wood_12.png);<br /> }<br /> [zoom&gt;=17] {<br /> polygon-fill: lighten(@wood, 5);<br /> polygon-pattern-file: url(img/wood_18.png);<br /> }<br /> }<br /> [zoom&gt;=4] {<br /> [class='scrub'] { polygon-fill: lighten(@scrub,26); }<br /> [class='grass'] { polygon-fill: @land; }<br /> [class='crop'] { polygon-fill: @land; }<br /> [class='snow'] { polygon-fill: @snow; }<br /> [class='wood'][zoom&gt;=14],<br /> [class='scrub'][zoom&gt;=15],<br /> [class='grass'][zoom&gt;=16] {<br /> [zoom&gt;=14] { polygon-opacity: 0.8; }<br /> [zoom&gt;=15] { polygon-opacity: 0.6; }<br /> [zoom&gt;=16] { polygon-opacity: 0.4; }<br /> [zoom&gt;=17] { polygon-opacity: 0.2; }<br /> }<br />}</pre>
<p>This was a surprisingly simple change, and though Mapbox Studio is a complex interface, making these minor changes was an easy introduction to rolling my own base maps.&nbsp;</p>
<h3>Amended Outdoor Tiles</h3>
<p><iframe src="https://a.tiles.mapbox.com/v4/willcadell.20c8a20a.html?access_token=pk.eyJ1Ijoid2lsbGNhZGVsbCIsImEiOiJKbjZwckU0In0.ET9f2IdpUPpsmZsOc_0T-w#12/53.966/-122.88" frameborder="0" width="100%" height="300px"></iframe></p>
<p>The key here is considering the most effective method of communication for your map. Your base map can be altered in large or small ways to better align that communication. This way your tone can be more consistent, and your maps can look great.</p>
<p><a href="http://snippets.sparkgeo.com/otway" target="_blank">Here are these basemaps in action &nbsp;</a></p>Growth From An Elastic Workforce2015-04-17T23:23:10+00:00Will Cadell/blog/author/will/https://www.sparkgeo.com/blog/growth-from-an-elastic-workforce/<p>Growth is hard.</p>
<p>The high growth technology companies are beset with environmental limitations which restrict growth. One of those key restrictions is access to talent, the right talent, and hardest&nbsp;of all: the right talent at the right time.&nbsp;</p>
<p>It's true, growing companies need different people at different times, meaning, for instance, that people brought on early must be enormously flexible. The Elastic Workforce (EW) has been touted as a solution to the problems of growth. I first came across this idea in&nbsp;Salim Ismail's excellent "Exponential Organizations" (<a href="http://www.exponentialorgs.com/" target="_blank">http://www.exponentialorgs.com/</a>).</p>
<p>But what is an elastic workforce, and how can it be used?</p>
<p>Consider first our approach to modern cloud services. We plug into a service and the service scales with our demand, without our really having to think about it. Assuming our business model scales appropriately, we never really feel an increase in cost, because cost is proportional to scale.</p>
<p>In essence the elastic workforce is a human cloud service. To leverage the cloud you can ask it to do a couple of things.&nbsp;</p>
<h3>1) Take away this pain.</h3>
<p>You have a particular problem to solve, which might be orthogonal to your business model, so you outsource it. Consider email: I have no interest in running my own email infrastructure. Email is a business expectation; running my own mail server simply adds no value to my business, yet it's absolutely necessary to my operations. Ok, I'll give that to Gmail. <a href="https://www.sparkgeo.com/" rel="nofollow" target="_blank">Sparkgeo</a>&nbsp;takes this approach to book-keeping and accounting. It's not geospatial or software, so we outsource it. This is the most common way of thinking about the elastic workforce - simple task-based contracting.</p>
<h3>2) Scale my operation.</h3>
<p>Another approach to the EW&nbsp;is designed specifically to support growth. Here you have a particular business&nbsp;dependency where you know you will need talent growth. This could be marketing, or software development, or, in&nbsp;<a href="https://www.sparkgeo.com/" rel="nofollow" target="_blank">our</a>&nbsp;purview, geospatial web development. Typically, this talent gap is closer to the expertise differentiator of your business. It is an area you know you want to grow, but you also know you want just the&nbsp;<em>right</em>&nbsp;people. This means that hiring will likely be slow and competitive. In this case it is possible to engage a third party to act as your&nbsp;"Elastic Load Balancer" of talent, to use Amazon Web Services' parlance. What this means is that you always have the capacity available to grow, because you have done the groundwork to develop a relationship with a third party. You have a negotiated contract for ad-hoc hours, likely with a minimum, but potentially with options for "a few months off" to support any seasonality, and you have gone through any development on-boarding / orientation. All this means you can now assign resources relatively instantly.</p>
<p>In considering this model, imagine you have a fairly consistent growth expectation:</p>
<p><img class="center" src="https://media.licdn.com/mpr/mpr/shrinknp_750_750/AAEAAQAAAAAAAAKMAAAAJDY1NTM5ZDUxLWY5MzAtNDA3ZC1iZTM2LTkzM2RlMThkYmIzMA.png" alt="" width="588" height="354" data-loading-tracked="true" /></p>
<p>By leveraging an elastic workforce, you can meet demand whilst ensuring you are not just bringing on bodies for the sake of it. It is easy to break a company culture by bringing on people who are not a good fit&nbsp;because of&nbsp;excessive demand. This way you can pay closer attention to hiring whilst knowing you are not sacrificing your immediate growth potential. Additionally, it is possible to drop resourcing without shedding any hard won full time employees. Layoffs are never fun, this way you can avoid that situation completely.</p>
<p>The Elastic Workforce is a useful&nbsp;tool in the business toolbox of any growing&nbsp;company. Good luck!</p>Startup IP2015-02-20T22:17:12+00:00Will Cadell/blog/author/will/https://www.sparkgeo.com/blog/startup-ip/<p>As we have been building our <a href="https://maptiks.com">Maptiks</a> product, our viewpoint on IP has changed significantly. This is a post about our IP journey to date.</p>
<p>Firstly, consider that it's not about you.</p>
<p>At <a href="https://www.sparkgeo.com" rel="nofollow" target="_blank">Sparkgeo</a>, in our journey with <a href="https://maptiks.com" rel="nofollow" target="_blank">Maptiks</a>, my opinion on Intellectual Property (IP) has changed significantly. My initial reaction was <em>"who cares?"</em> We didn't have nearly enough cash to defend a patent, and we didn't really have the cash available to spend on lawyers' fees to get the patent in the first place. Our plan was to stick to 'trade secrets', not waste our time on preparing a patent filing, and assume that the web moves so fast that a patent quickly becomes aged. Our IP strategy was one line:</p>
<blockquote>Move fast; don't let them catch up</blockquote>
<p>I don't believe we are alone in this opinion. Over the past few months, as we talked with various investors, IP became a more persistent theme. Now, we haven't taken any of the investment on offer (as of publishing). But we have filed for our patent, and here is why: we realized that...</p>
<blockquote>The patent is not for us</blockquote>
<p>The patent does 3 key things for a startup:</p>
<ol>
<li>It demonstrates that you have done a 'real thing'</li>
<li>It indicates some level of value</li>
<li>It sets the stage</li>
</ol>
<h3>Validation</h3>
<p>These items might seem somewhat intangible. In our times of the <a href="http://theleanstartup.com/" target="_blank"><em>lean startup</em></a> they might also feel utterly superfluous. However, it is important to remember that the community is looking to you to perform at your chosen venture. Being able to demonstrate that you have built something novel has enormous value in the eyes of both customers (who give you money for your thing) and investors (who give you money to find more customers for your thing). An external validation of your technology is useful here. No one likes the feeling of being sold snake oil, but the patent demonstrates that your technology has real weight.</p>
<h3>Valuation</h3>
<p>Valuation is a somewhat obvious argument, but it is certainly worth mentioning. On the assumption that your company is looking for investment, then getting a patent beefs up your valuation. This means typically you will get better value from your investors. In addition, your potential investors will feel more comfortable with the magic story you tell of your technology, because the patent office corroborates your story. In a pre-money calculation a patent can be worth $50 - $100k, depending on who you talk to, and indeed what country you are in. So in terms of investors the patent both validates your technology and gets you better value from the investment.</p>
<h3>The stage</h3>
<p>The stage is where you present your idea to your community and the (potentially larger players) start moving on it. Assuming you are doing something new and different and that you prove your thesis in terms of customer uptake, you will be in a position to be copied. Is your technology something that really can't be copied?</p>
<p>As a startup with limited revenue, you likely can't sue a large company for any kind of patent infringement (we talked about that above). That would be unwise and a quick route to insolvency. However there are other tools available, and being able to use your IP as a bargaining chip might provide a much quicker discussion on either acquisition or licensing.</p>
<p>The point here is that the patent opens doors. It is a route to opportunity in the face of what might seem like enormous and heavy-weight competition. In the event of discussions over technology and IP, your trade secret is worth nothing if any of your engineers are head hunted, but your patent remains hugely valuable.</p>
<p>So no, it's not necessarily about <em>you</em>, or what you can do with the patent. It's about how that patent can be leveraged by your ecology of partners, whether they be investors, licensees or your potential exits. If you are genuinely doing something new, then your patent is an enabler of growth for your company.</p>
<p>All that said, the patent is not without cost, for us it has been about a month of burn, so consider it carefully, as you would any investment in your company.</p>
<p>For these 3 reasons, we have decided to take the investors' advice and pursue a patent, even though, as yet, we haven't pursued their actual investment.</p>Technology is great, but invest in data.2015-02-16T17:00:58+00:00Will Cadell/blog/author/will/https://www.sparkgeo.com/blog/technology-is-great-but-invest-in-data/<p>I love&nbsp;<strong>technology</strong>. I get wowed by amazing new computing platforms, new technology experiences, new ideas about how to interact with machines and computers. I get thoroughly dazzled by how clever people are and how the right group of people can create the most amazing things.</p>
<p>But technology is like the weather in Scotland; wait a few minutes and it will change. The great technology companies really leverage this cycle of change, getting their lusting customer base to buy new products again and again (I count myself among this lustful demographic). But my point here is not that customers are easily coerced into buying the next shiny object, that is a post still to come. The point here is that technology is&nbsp;<strong>transient</strong>.</p>
<p>Web frameworks change, databases change, technology evolves, continuing to prove&nbsp;<a href="http://en.wikipedia.org/wiki/Moore%27s_law" rel="nofollow" target="_blank">Moore's Law</a>. This is all pretty awesome, yeah?</p>
<p>Yes, but... Every investment you make in technology is also immediately out of date. Think about that. As a decision maker you can't just wait for the next thing to arrive; you must eventually choose a platform, storage mechanism, or whatever. But you will be subject to changing trends and disruptive innovation.</p>
<p>However, the actual&nbsp;<strong>data</strong>&nbsp;you store is different. The data itself being the columns and rows, the documents, the images, the raw 1s and 0s. In this regard, your data is the complete opposite of technology. Every day your data exists it accrues more value. Every day you capture more data, your data's value also grows. The usefulness of the data is of course contextual and the actual dollar value difference between data sets will vary wildly. In general however, many data points are more useful than only a few. Additionally, a data set which has grown over the last ten years is more useful than one which is only a few hours old. Of course this is&nbsp;<a href="/blog/scale-in-a-time-of-web-maps/" rel="nofollow" target="_blank">subject to the complex idea of scale</a>&nbsp;both in spatial and temporal terms.</p>
<p>In terms of cost-benefit, we are left in an interesting quandary. Perhaps the value of our technology is greater if it allows a greater depth of data to be captured? I will let the economists figure that out. For now, be left with this thought: you will consistently update technology and call these updates 'investments'; we must do this to stay competitive, but in real terms these investments will decrease in value over time. The real investment you make, however, will be in your data, which will increase in value over time.</p>
<h3>Investing in technology is like buying a car; investing in data is like buying a house.</h3>Scale in a time of web maps2015-02-04T18:39:21+00:00Will Cadell/blog/author/will/https://www.sparkgeo.com/blog/scale-in-a-time-of-web-maps/<p><em>Scale</em> has taken on a completely new meaning for me. In my training and early career, scale referred to a conversion measurement indicating a comparison between a measurement on a paper map and a measurement in the real world. The big 'thing' about GIS was that it was scale-less; you could zoom in as much as you wanted and the map would change accordingly. Amazing!</p>
<p>The word <em>scale</em> for me is now a combination of a number of concepts, some old and some new. The idea of the conversion between a screen measurement and a real world measurement is still pertinent, though in web mapping parlance this term has somewhat devolved to "zoom" or "zoom-level", which on reflection is a horrible degradation, though usefully user-centric.</p>
<p>In general, the term <em>scale</em> for me is now more about data than it is about display. In web terms, when we talk about <em>scale</em> we refer to the size of data: the enormity of a repository, database or storage engine. If you 'get to <em>scale</em>' then you receive your badge of honour, and it's implied you have figured out how to manage ever larger amounts of data and can do something useful with it. Of course, 'doing something useful with it' means you typically have competence around display or management.</p>
<p>In web mapping terms <em>scale</em> here can be about how to draw gazillions of features on a map, however, not necessarily how to <em>usefully</em> draw gazillions of points on a map.</p>
<p class="right"><img class="center" src="https://media.licdn.com/mpr/mpr/p/7/005/0b5/07a/0b36341.jpg" alt="" width="588" height="521" data-loading-tracked="true" /></p>
<p class="right"><em>Oh, great lots of data, thanks...</em></p>
<p>At <a href="https://sparkgeo.com" rel="nofollow" target="_blank">Sparkgeo</a> we have worked with numerous companies who deal with <em>scale</em> regularly. What I have discovered is one of the great conceits of our modern web mapping life:</p>
<blockquote>Just because you can draw a gazillion points on a map, does not at all mean that you should.</blockquote>
<p>In fact, the decision to draw anything on a map needs to take into consideration the traditional understanding of <em>scale</em> (at what geographic density does it make sense to draw the features on the map?), but also balance that with the <em>scale</em> at which the data is relevant. This characterization of data <em>scale</em> having an effect on an analytical outcome has always been a central feature of traditional GIS analysis. In our modern life of geospatial applications, <em>it is very easy to forget we are still applying traditional GIS concepts</em> albeit within a different purview and with new technologies.</p>
<p>As such, it's easy to forget that data scale, and indeed data quality, have a direct impact on the algorithmic quality of whatever we are doing. It took until version 5.6 for MySQL to consider that geographic analysis should go beyond the Minimum Bounding Rectangle, and we're only on 5.7 now.</p>
<p>So wait, our technology does matter to <em>scale</em> then? Yes it does, especially when it constrains your data's ability to be functional at a certain <em>scale</em>, even if that constraint exists to meet the demands of <em>scale</em>. If your technology constrains your data's ability to perform, then your data is defined by your technology. So your <em>scale</em> is limited by your <em>scale</em>.</p>
<p class="left">Yup, it's getting pretty murky, I agree.</p>
<p class="left">But instead of clarifying, because frankly there is no clarity here, consider this: what scale is your crowdsourced geospatial data? This question is beautifully complex. Your scale will be determined by a mix of context, application, device, and storage technology variables. Most interestingly, this mix exists within a single data source. An example of this complexity is the ease with which one can programmatically switch from device GPS to GeoIP depending on GPS signal availability. This means that within a single table the geographic accuracy varies between 5 meters and city / regional level. Again, just because you can collect data in this manner does not mean you should; there is a significant risk of breaking algorithmic expectations. Your local search is useless if you are basing user location on GeoIP; it's not very local.</p>
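<p>As a minimal, hypothetical sketch of guarding against such a mixed-accuracy table (the record layout and the "accuracy_m" field are my invention, not any particular product's schema), a local search might first drop fixes that are too coarse:</p>

```python
# Hypothetical records mixing device-GPS fixes (~5 m) with GeoIP fallbacks (~50 km).
records = [
    {"user": "a", "lat": 53.91, "lon": -122.75, "accuracy_m": 5},      # GPS
    {"user": "b", "lat": 53.90, "lon": -122.70, "accuracy_m": 50000},  # GeoIP
    {"user": "c", "lat": 53.92, "lon": -122.76, "accuracy_m": 8},      # GPS
]

def usable_for_local_search(recs, max_accuracy_m=100):
    """Keep only fixes precise enough for a 'local' result."""
    return [r for r in recs if r["accuracy_m"] <= max_accuracy_m]

print(len(usable_for_local_search(records)))  # 2 of 3 fixes survive
```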
<p class="left">This variety of seemingly structured but hugely variable data is a new feature of our industry getting to <em>scale</em>. Tread carefully, though. We have an enormous opportunity to build geospatial applications which can change lives, but it is very easy to get tied up in the joy of solving our <em>scale</em> problem whilst forgetting that we are trying to capture, manipulate and display that data at entirely the wrong scale.</p>
<p class="left">Old problems are new again.</p>
<p class="left">&nbsp;</p>
<p class="left"><a href="https://www.linkedin.com/pulse/scale-time-web-maps-will-cadell">This post was originally published as a LinkedIn post</a></p>More to maps than markers2015-02-04T18:31:20+00:00Will Cadell/blog/author/will/https://www.sparkgeo.com/blog/more-to-maps-than-markers/<p><a href="https://developers.google.com/maps/" rel="nofollow" target="_blank">Google Maps</a> have been with us since February 2005, virtually a decade ago as I write this (expect a party on Feb 8th 2015). From their acquisitions of Where 2 Technologies and Keyhole, Google ignited a cartographic web revolution. Yes, of course there were maps on the internet pre-Google Maps, but Google brought the web map to the consumer, and has driven the expectation of location intelligence in web and mobile apps ever since.</p>
<p>Google Maps is undoubtedly a wildly successful phenomenon and will continue to change the face of search. The internet is better for it. This article, however, is a call to action, because <strong>web cartography is more than a solitary red marker on a map</strong>.</p>
<p class="center"><img class="center" src="https://media.licdn.com/mpr/mpr/p/7/005/0b3/084/1f72f9b.png" alt="" width="573" height="267" data-loading-tracked="true" /></p>
<p class="center"><em>Look at that lonely marker, poor thing.</em></p>
<p>Web technology is advancing. We can now do so much more than we could a decade, or even a year, ago. I see some amazing examples of our cartographic technology evolving, but I do not see its wider adoption. Geospatial technology, and indeed GIS, have typically been backroom, analytical activities that appeared to promise a bright future but never quite delivered.</p>
<p>Well, that future is here folks; it's delivered. Now it's up to us. Don't worry though, we have some amazing tools at our disposal. Below are some examples:</p>
<h3><a href="https://developers.google.com/maps/documentation/javascript/reference" rel="nofollow" target="_blank">Google Maps API</a></h3>
<p>Sure, Google started the red-marker-itis, but in reality it's more than possible to make your Google Maps very individual by styling your own cartography, adding unique interactions and creating your own vector iconography. There is a huge amount of detail in the <a href="https://developers.google.com/maps/" rel="nofollow" target="_blank">Google Maps API documentation</a>, but also look on <a href="https://www.github.com" rel="nofollow" target="_blank">github</a> and you'll find exciting tidbits like <a href="https://github.com/brendankenny/CanvasLayer" rel="nofollow" target="_blank">canvas layers</a> and the <a href="https://github.com/googlemaps" rel="nofollow" target="_blank">utility libraries</a>.</p>
<p class="center"><img class="center" src="https://media.licdn.com/mpr/mpr/p/6/005/0b3/327/2d028a4.png" alt="" width="503" height="362" data-loading-tracked="true" /></p>
<p class="center"><em>We can do much more with Google Maps than red markers</em></p>
<h3><em><a href="http://www.mapbox.com" rel="nofollow" target="_blank">MapBox</a></em></h3>
<p>MapBox has made a tremendous impact on the geospatial web space in recent years. From seemingly nowhere (but actually the international development community), they are now blazing the way for hosted and beautifully styled geospatial data. Mapbox is a real competitor to Google Maps for those who want absolute cartographic control over features and presentation, with a variety of beautifully rendered layers available as base maps. They do all this and are <a href="https://www.mapbox.com/about/open/" rel="nofollow" target="_blank">open source</a> too.</p>
<p class="center"><img class="center" src="https://media.licdn.com/mpr/mpr/p/5/005/0b3/329/34cdc1c.png" alt="" width="521" height="459" data-loading-tracked="true" /></p>
<p class="center"><em>An amazing example of web map (pop) art from MapBox</em></p>
<h3><a href="http://cartodb.com/" rel="nofollow" target="_blank">CartoDB</a></h3>
<p>Coming from a similar philosophical place as Mapbox, but with a little more analysis and a little less cartography, CartoDB is absolutely a modern web mapping platform to be considered. Reputed to have enormous data capacity and focused on development-oriented users, CartoDB is well able to fill the gap left by <a href="http://blog.cartodb.com/gme-to-cartodb/" rel="nofollow" target="_blank">Google Maps Engine's recent deprecation</a>.</p>
<p class="center"><img class="center" src="https://media.licdn.com/mpr/mpr/p/6/005/0b3/329/39ef13f.png" alt="" width="531" height="409" data-loading-tracked="true" /></p>
<p class="center"><em>Powerful spatial analysis in the browser with CartoDB</em></p>
<p>It should also be noted that both MapBox and CartoDB are enterprise ready. These are not small mapping companies with uncertain futures; they are both robust and have excellent technology. Additionally, it's worth noting there are other alternatives still, including <a href="http://www.giscloud.com/" rel="nofollow" target="_blank">GISCloud</a>.</p>
<h3><a href="http://www.osgeo.org/" rel="nofollow" target="_blank">Open Source</a></h3>
<p>If you have development capacity, you should certainly consider building a solution yourself from the plethora of open source geospatial technology available. Of particular note are:</p>
<ul>
<li><a href="http://postgis.net/" rel="nofollow" target="_blank">PostGIS</a>, the geospatial database of choice (open or closed source)</li>
<li><a href="http://mapnik.org/" rel="nofollow" target="_blank">Mapnik</a>, tile renderer to the gods</li>
<li><a href="http://leafletjs.com/" rel="nofollow" target="_blank">Leaflet</a>, lightweight mapping framework</li>
<li><a href="http://openlayers.org/" rel="nofollow" target="_blank">OpenLayers (3)</a>, heavier-weight mapping framework</li>
<li><a href="http://geoserver.org" rel="nofollow" target="_blank">Geoserver</a>, enterprise ready map server</li>
<li><a href="http://www.gdal.org/" rel="nofollow" target="_blank">GDAL</a>, geospatial translation library (the glue of open source web mapping)</li>
</ul>
<p>Another notable source of inspiration is <a href="http://maps.stamen.com" rel="nofollow" target="_blank">Stamen Design's map tiles</a> page. These tiles are usable in most modern mapping frameworks under a liberal (and free) license.</p>
<p class="center"><img class="center" src="https://media.licdn.com/mpr/mpr/p/6/005/0b3/327/34bb742.png" alt="" width="487" height="329" data-loading-tracked="true" /></p>
<p class="center"><em>The beautiful watercolour map tiles from Stamen</em></p>
<h3>Don't get freaked out!</h3>
<p>Because the future of web mapping is one in which we will have even more choice, in which our data sources will all be able to talk to each other, and in which we will all get to enjoy, and expect, more beautiful and more functional maps; and hopefully fewer solitary red markers.</p>
<p>&nbsp;</p>
<p>If you need some help, <a href="https://sparkgeo.com" rel="nofollow" target="_blank">just shout</a>.</p>
<p>&nbsp;</p>
<p><a href="https://www.linkedin.com/pulse/more-maps-than-markers-will-cadell">This post was originally published as a LinkedIn post</a></p>Beware the Echo Chamber2015-02-04T18:24:07+00:00Will Cadell/blog/author/will/https://www.sparkgeo.com/blog/beware-the-echo-chamber/<p>There is a concept of spatial auto-correlation in GIS (Geographic Information Systems). This is a measure of the degree to which a set of spatial features and their associated data values tend to be clustered together in space (<a href="http://support.esri.com/en/knowledgebase/GISDictionary/term/spatial%20autocorrelation">reference</a>). So in small words: similar things tend to hang out together.</p>
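<p>For the curious, the concept can be sketched numerically. This toy Moran's I, using simple inverse-distance weights chosen purely for illustration, comes out positive when similar values cluster together in space and negative when they repel:</p>

```python
import math

def morans_i(points, values):
    """Toy global Moran's I with inverse-distance weights (no row standardization)."""
    n = len(values)
    mean = sum(values) / n
    num = den = wsum = 0.0
    for i in range(n):
        den += (values[i] - mean) ** 2
        for j in range(n):
            if i == j:
                continue
            w = 1.0 / math.dist(points[i], points[j])  # closer pairs weigh more
            wsum += w
            num += w * (values[i] - mean) * (values[j] - mean)
    return (n / wsum) * (num / den)

# clustered: similar values are neighbours -> positive autocorrelation
print(morans_i([(0, 0), (0, 1), (10, 0), (10, 1)], [1, 1, 9, 9]))
```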
<p>This is often true of friends and professionals, and indeed friendly professionals. We typically gravitate towards people who have similar skills, similar world views and similar opinions. It's both validating and pleasant to talk with people who generally agree.</p>
<p>I am here today to tell you this is dangerous. As a business person this is dangerous, and as a citizen this is dangerous.</p>
<p>At <a href="http://sparkgeo.com">Sparkgeo</a> we have been launching a product called <a href="http://maptiks.com">Maptiks</a> which is an analytics layer for web maps. For the last few years we have been in our own geospatial echo chamber. We have been talking to geospatial developers; helping them build cutting edge mapping technology; stretching the boundaries of what has previously happened with maps on the web. We talk about web maps and breathe data. Thus, we built our own analytics platform to scratch our own itch: "how good are our own maps?" There is no doubt we have a strong and growing market, but there are a large number of people who simply do not care as much about maps as we do. Some people don't care at all!</p>
<p>You know? That's ok. Not everyone has to like your product. In fact not everyone has to even understand your product. That said, the people who do understand your product's value must love it!</p>
<p>Considering the echo chamber here is hugely valuable. It can define your market, it can define your "low hanging fruit", but it could easily be misleading you. As a business you must ensure that your echo chamber is big enough to support your needs.</p>
<p>Better, perhaps you should consider stepping outside the chamber and testing your hypothesis against less friendly opinion. Having a place to go to test your opinions against different world views is essential. Do you feel you have access to diverse thinking? I would argue that there is always one more new (perhaps crazy) opinion.</p>
<p>For us at Sparkgeo this has been a critical step forward in our thinking concerning Maptiks. We have realized that users must first feel the value of their web map before they want to optimize that value through instrumentation and analytics. Apparently many companies have been happy enough just to get a functional map on their website. So happy, in fact, that they stopped there. Our echo chamber had misled us into thinking that everyone hugely valued their web maps. For a great many organizations having a web map was cool, but it didn't necessarily "turn the dial" in itself. This leaves us now planning to take our future customers through an education process, even prior to describing Maptiks' value.</p>
<p>Now we are stepping out of our echo chamber into a much bigger world; under our arm, however, we now have a much bigger and better plan. <a href="https://www.linkedin.com/pulse/beware-echo-chamber-will-cadell">This post was originally published as a LinkedIn post</a></p>Making Maps at Web Scale (UNBC)2015-01-26T05:35:17+00:00Will Cadell/blog/author/will/https://www.sparkgeo.com/blog/making-maps-at-web-scale-unbc/<h2>&gt; Let's make some maps!</h2>
<h3>Google Maps!</h3>
<p><a href="http://jsfiddle.net/sparkgeo/eskyqq1s/3/" target="_blank">Google Maps</a></p>
<p>Try changing the zoom level and moving the starting location</p>
<p><a href="http://jsfiddle.net/sparkgeo/eskyqq1s/2/" target="_blank"> Google Maps with Marker </a></p>
<p>Try changing the location of the point</p>
<p><a href="http://jsfiddle.net/sparkgeo/L1dausmm/" target="_blank">Google Maps with Data </a></p>
<p>Try editing the data...!</p>
<hr />
<h3>Leaflet Maps!</h3>
<p><a href="http://jsfiddle.net/sparkgeo/zv1bta4v/2/" target="_blank">Leaflet Map</a></p>
<p>Try changing the location of this map! Then go to <a href="http://maps.stamen.com">Stamen</a> and change the background to watercolor!</p>
<p><a href="http://jsfiddle.net/sparkgeo/4116hcq0/2/" target="_blank">Leaflet with Data</a></p>
<p>Try changing the colors of the gradient... google color brewer!</p>
<p><a href="http://jsfiddle.net/sparkgeo/4do3mgru/1/" target="_blank">Leaflet with Data &amp; Interaction</a></p>
<p>Wow, this is turning into a real app!</p>Better web maps with map analytics 2015-01-20T00:38:52+00:00Will Cadell/blog/author/will/https://www.sparkgeo.com/blog/better-web-maps-with-map-analytics/<p><strong>Analytics</strong> are a great tool for monitoring and reporting on website traffic and user interactions. It is reasonably expected that, as a web developer, I will use analytics to better understand my users and deliver incrementally improving services and experiences.</p>
<p>At <a href="/">Sparkgeo</a>&nbsp;we build web maps. We have been doing this for half a decade and we have watched this space evolve over that time. The idea of interactive maps is not new by any means. However, the idea that we should instrument our maps in a manner that actually derives meaningful analytics from them has been entirely missed. Interestingly, it seems most of us have been happy to just get a functional map on our websites, let alone use it as a tool for user discovery.</p>
<p>So what are we missing? Starting at the beginning, here are some very simple examples:</p>
<ol>
<li>What is your map bounce rate? How many people visit your map, or the page with your map, and do absolutely nothing with the map?</li>
<li>Conversely, do you have an enormous number of activities on your map?</li>
<li>Do 99% of your users zoom in immediately?</li>
<li>Where on the map are users navigating to?</li>
</ol>
<p>With the above cases it is possible to use the captured data to drive real business benefits:</p>
<ol>
<li>Can you make do with a static map? This could result in lower licence fees and a lower client weight (less js)&nbsp;</li>
<li>Is the map much more valuable to users than the management thought? Now you understand the value, can you add conversion opportunities to info boxes?&nbsp;</li>
<li>By optimizing the landing zoom, can you eliminate user barriers to discovery and conversion?&nbsp;</li>
<li>By understanding where your users are looking is it possible to provide a more accurate and faster business product? This data can inform business analysts of market interest, web developers of caching strategies, and designers of UX problems and opportunities.&nbsp;</li>
</ol>
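<p>A sketch of what the first question looks like in practice (the event-log shape here is hypothetical, not any particular analytics product's format): count the visits that loaded the map but produced no interaction events.</p>

```python
# Hypothetical per-visit interaction logs for a page containing a map.
visits = [
    {"id": 1, "events": []},                      # loaded the map, did nothing
    {"id": 2, "events": ["zoom_in", "pan"]},
    {"id": 3, "events": ["zoom_in"]},
    {"id": 4, "events": []},
]

bounces = sum(1 for v in visits if not v["events"])
bounce_rate = bounces / len(visits)
# users whose very first act was to zoom in (question 3 above)
immediate_zoomers = sum(1 for v in visits if v["events"][:1] == ["zoom_in"])

print("map bounce rate: %.0f%%" % (bounce_rate * 100))
```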
<p>There are enormous benefits to measuring your web map. The biggest of all will be the knowledge you gain of your users, and the knowledge that your mapping technology is doing what you expected it to: demonstrating your map's ROI and justifying its presence on your website, rather than having the map just sitting there while you hope it is serving its purpose. Your map should be driving value for your business or service; is it?</p>
<p>With <a href="/projects/maptiks">Maptiks</a>, map analytics are getting much easier too.</p>Disparate data, technology fiefdoms (and 65 pictures of your cat) - Video2014-09-23T17:57:34+00:00Will Cadell/blog/author/will/https://www.sparkgeo.com/blog/disparate-data-technology-fiefdoms-and-65-pictures-of-your-cat/<p>Will Cadell (<a href="https://twitter.com/@geo_will">@geo_will</a>) presented at FOSS4G 2014. He talked about the importance of considering the <em>commonwealth of data</em> when publishing open data products. Check out his presentation here &gt;&gt;&gt;</p>
<p><iframe src="http://player.vimeo.com/video/106869354?title=0&amp;byline=0&amp;portrait=0&amp;color=ff9933" frameborder="0" width="667" height="375"></iframe></p>
<p class="first"><em>Presentation Description</em></p>
<p class="first">Up in the frozen wastes of the Northern British Columbia, we organized a hackathon. We based it on the ideas of open data and civic applications.</p>
<p>Our hardy hackathoners pulled together a number of excellent ideas but met with a constant and obtrusive barrier: open data may be open, but without some level of standardization it's not actually very useful.</p>
<p>Now, no one said that data had to be 'useful', and perhaps if we want the technology utopia of real open data interoperability we will need to "build it" ourselves, but it is worth noting that talking the same language as our neighbours is generally awesome. Indeed, perhaps rather than swearing fealty to our technology overlords and just pressing the "publish document to open data platform" button, we could think about the commonwealth of data. The value of any data increases wildly with density and open data should be more valuable!</p>
<p>The cats? Well, you'll have to tune in for that bit.</p>Build a better map with analytics -- video2014-09-17T21:46:45+00:00Will Cadell/blog/author/will/https://www.sparkgeo.com/blog/build-a-better-map-with-analytics-video/<p>Dustin Sampson (<a href="https://twitter.com/@gridcell">@gridcell</a>), one of our geospatial developers, presented the core ideas behind tracking slippy map analytics at FOSS4G in September (2014, Portland). Check out his presentation here &gt;&gt;&gt;</p>
<p><iframe src="http://player.vimeo.com/video/106234088?title=0&amp;byline=0&amp;portrait=0&amp;color=ff9933" frameborder="0" width="667" height="375"></iframe></p>
<p class="first"><em>Presentation Description</em></p>
<p class="first">Google Analytics is a great tool for monitoring and reporting on website traffic and user interactions, but what it doesn't tell you is that 75% of your users zoom in two levels every time they start to use your map, or that the external soils layer you added takes an average of three seconds to load. Client-side map monitoring adds the missing chapters needed to complete your geo-analytics storybook.</p>
<p class="first">We'll briefly walk-through how to setup your slippy map to start tracking analytics, what can be tracked, and what can be discovered.</p>Sliptics, spatial analytics to help you build a better map.2014-09-13T01:50:04+00:00sparkgeo/blog/author/sparkgeo/https://www.sparkgeo.com/blog/sliptics-spatial-analytics/<p><img src="/static/media/uploads/sliptics_logo_sm.png" alt="" width="602" height="200" /></p>
<p>&nbsp;</p>
<p>The modern web is in the throes of a very public love affair with data.</p>
<p>It is not really a surprise, as it is data, and its sharing, that spawned the internet. Data, in its most recent guise of <em>analytics</em>, has become the cornerstone of web development and administration. So, why don't we apply data to the web map experience?</p>
<p>Well, for a start it's not very easy to measure and understand web mapping analytics.&nbsp;Additionally, we have to this point been surprisingly satisfied with just having functional web&nbsp;mapping experiences. Now, with the advent of very robust tile serving environments and a real&nbsp;choice in 'good' mapping fabrics and cartographies we are in a place where it's quite useful to&nbsp;be able to measure and ultimately compare user engagement on web maps.</p>
<p>At Sparkgeo we only do geospatial web development; with that in mind, we have an interest in understanding which cartographies and frameworks make the most sense given a series of client-based criteria. We deeply understand the situation where the developer is faced with a startlingly wide variety of choices and is really only left to throw a dart at a spinning globe of web mapping options, hoping that things work out. Experience has taught us that little changes can make big differences to the user's experience. The biggest problem is measuring the user's experience so you can properly understand the change that's needed. That is why we built what has become sliptics, slippy map analytics.</p>
<p>This is the first in a series of posts where we will introduce you to sliptics, how and why we built&nbsp;it, and how you can make the most of it in your maps.</p>
<p>If you are interested in following our progress then don't hesitate to&nbsp;<a title="Sliptics Spatial Analytics" href="http://www.sliptics.com" target="_blank">sign up</a>&nbsp;now.</p>Mapping Exif Files2014-04-10T20:56:14+00:00Will Cadell/blog/author/will/https://www.sparkgeo.com/blog/mapping-exif-files/<p>This is a quick note on how to map images in a directory purely from their <a href="http://en.wikipedia.org/wiki/Exchangeable_image_file_format">exif</a> files. An exif file is one which contains the metadata of the image. If the image was taken by a phone or a GPS-enabled camera, then a location will also be contained within that file. To read that directory on the webserver we need to use a <a href="http://en.wikipedia.org/wiki/Common_Gateway_Interface">CGI script</a>. For our purposes we will use python.</p>
<p>This is really a very simple concept:</p>
<ol>
<li>have a folder space on a webserver</li>
<li>set that server up to understand .py files as CGI scripts</li>
<li>have a directory full of images</li>
<li>have a web page which pulls up a <a href="https://developers.google.com/maps/documentation/javascript/tutorial">Google Map</a></li>
<li>have that map request the geography of each image (via ajax and the CGI script)</li>
<li>display the locations of the images on the map</li>
</ol>
<p>This idea is simple, but requires some fiddling. Firstly, you need to make your Apache webserver execute python scripts. Thankfully there are <a href="https://docs.python.org/2/howto/webservers.html">numerous</a>&nbsp;<a href="http://stackoverflow.com/questions/15878010/run-python-script-as-cgi-apache-server">resources</a>&nbsp;<a href="http://webpython.codepoint.net/cgi_tutorial">to</a>&nbsp;<a href="http://www.w3resource.com/python/cgi-programming.php">help</a>&nbsp;with this. With that in mind, I will assume you now have a space on a webserver in which a python script can be executed. Next, make a directory called 'images' and put your images in it.</p>
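<p>For reference, enabling CGI execution of .py files in Apache might look something like the fragment below. This is only a sketch: the directory path is a placeholder, and the exact setup varies with your Apache version and distribution.</p>

```apache
# Allow Apache to run Python files in this directory as CGI scripts
<Directory "/var/www/html/exif-map">
    Options +ExecCGI
    AddHandler cgi-script .py
</Directory>
```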
<p>Next we will need a very simple web page. Ours will just have a single div called "map"</p>
<pre>&lt;html&gt;<br /> &lt;head&gt;<br /> &lt;script src="http://ajax.googleapis.com/ajax/libs/jquery/1.10.2/jquery.min.js"&gt;<br /> &lt;/script&gt;<br /> &lt;script src="https://maps.googleapis.com/maps/api/js?v=3.exp&amp;sensor=false"&gt;<br /> &lt;/script&gt;<br /> &lt;style&gt;<br /> html, body, #map {<br /> height: 100%;<br /> margin: 0px;<br /> padding: 0px;<br /> }<br /> &lt;/style&gt;<br /> &lt;/head&gt;<br /> &lt;body&gt;<br /> &lt;div id="map"&gt;&lt;/div&gt;<br /> &lt;/body&gt;<br />&lt;/html&gt;</pre>
<p>And, we'll need a smidge of JavaScript to display a Google Map, ask the python script for the coordinates from the images' exif files and then provide listeners for the info boxes.&nbsp;</p>
<pre>&lt;script&gt;<br />$.ajax({<br />  url: "get_images.py",<br />  type: "GET",<br />  dataType: 'json',<br />  success: function(data){<br />    initialize_map(data);<br />  }<br />});<br /><br />function initialize_map(data){<br />  var mapOptions = {<br />    zoom: 4,<br />    center: new google.maps.LatLng(-25.363882, 131.044922)<br />  };<br />  var bounds = new google.maps.LatLngBounds();<br />  var map = new google.maps.Map(document.getElementById('map'), mapOptions);<br /><br />  // open an info window showing the image when its marker is clicked<br />  function attachInfo(marker, image) {<br />    var infowindow = new google.maps.InfoWindow({<br />      content: '&lt;img src="images/' + image + '" width=200/&gt;'<br />    });<br />    google.maps.event.addListener(marker, 'click', function() {<br />      infowindow.open(map, marker);<br />    });<br />  }<br /><br />  for (var i = 0; i &lt; data.length; i++){<br />    var loc = data[i].loc.split(',');<br />    var pos = new google.maps.LatLng(parseFloat(loc[0]), parseFloat(loc[1]));<br />    var marker = new google.maps.Marker({<br />      position: pos,<br />      map: map<br />    });<br />    bounds.extend(pos);<br />    attachInfo(marker, data[i].image);<br />  }<br />  map.fitBounds(bounds);<br />}<br />&lt;/script&gt;</pre>
<p>This provides a simple ajax request, using <a href="https://api.jquery.com/jQuery.ajax/">jquery</a>, to the python script. It doesn't have to do anything clever; it just says "run your get_images.py script". In return it's expecting a json object with image names and locations. The image names returned help build the info windows (by pulling images into them) and the returned locations add the markers to the map.</p>
<p>The final piece of the puzzle is the python script.</p>
<pre>#!/usr/bin/env python<br /># -*- coding: UTF-8 -*-<br />import cgitb<br />import os<br />import mimetypes<br />import exifread<br />import json<br />from fractions import Fraction<br /><br />cgitb.enable()<br /><br />print "Content-Type: application/json"<br />print<br /><br />source = "/path/to/your/images"<br /><br />def _get_if_exist(data, key):<br />    if key in data:<br />        return data[key]<br />    return None<br /><br />def _convert_to_degress(value):<br />    """Convert the rational DMS values stored in the exif tag to decimal degrees"""<br />    d = float(Fraction(str(value.values[0])))<br />    m = float(Fraction(str(value.values[1])))<br />    s = float(Fraction(str(value.values[2])))<br />    return d + (m / 60.0) + (s / 3600.0)<br /><br />def get_lat_lon(tags):<br />    """Return "lat, lon", if available, from the provided exif tags"""<br />    lat = None<br />    lon = None<br />    gps_latitude = _get_if_exist(tags, "GPS GPSLatitude")<br />    gps_latitude_ref = _get_if_exist(tags, "GPS GPSLatitudeRef")<br />    gps_longitude = _get_if_exist(tags, "GPS GPSLongitude")<br />    gps_longitude_ref = _get_if_exist(tags, "GPS GPSLongitudeRef")<br />    if gps_latitude and gps_latitude_ref and gps_longitude and gps_longitude_ref:<br />        lat = _convert_to_degress(gps_latitude)<br />        if gps_latitude_ref.values != "N":<br />            lat = 0 - lat<br />        lon = _convert_to_degress(gps_longitude)<br />        if gps_longitude_ref.values != "E":<br />            lon = 0 - lon<br />    return "%s, %s" % (lat, lon)<br /><br />image_locate = {}<br />for file in os.listdir(source):<br />    mt = mimetypes.guess_type(file)[0]<br />    if mt and mt.startswith("image"):<br />        f = open(os.path.join(source, file), "rb")<br />        tags = exifread.process_file(f)<br />        image_locate[file] = get_lat_lon(tags)<br /><br />print json.dumps([{"image": k, "loc": v} for k, v in image_locate.items()], indent=4)</pre>
<p>This file depends on the <a href="https://pypi.python.org/pypi/ExifRead">exifread library</a>, which very conveniently does all the heavy lifting in the exif reading. With that in place, all we have to do is drop into our images folder, read through the files, and check their mimetypes; if they are images, then look for geography. The geography is then converted back from rational number representations of degrees, minutes, seconds to the more versatile decimal degrees and finally passed back, in a json object, to the javascript and the map.</p>
<p>There is little validation here and this is obviously just proof of concept code, but a useful example I think.</p>
<p>The code above assembles to look like this:</p>
<p><iframe src="http://helpwanted.sparkgeo.com/helpwanted.html" frameborder="0" align="right" width="640" height="481"></iframe></p>
<p>&nbsp;</p>Turning Down the Noise with Nearblack2014-03-13T12:10:21+00:00Kaela Perry/blog/author/kaela/https://www.sparkgeo.com/blog/turning-down-the-noise-with-nearblack/<table border="0">
<tbody>
<tr>
<td><img src="/static/media/uploads/Screen%20Shot%202014-03-11%20at%203.44.51%20PM.png" alt="" width="443" height="300" /></td>
<td><img src="/static/media/uploads/Screen%20Shot%202014-03-13%20at%2012.12.53%20PM.png" alt="" width="371" height="300" /></td>
</tr>
<tr>
<td>
<p style="text-align: center;">Image with nodata displayed over google basemap</p>
</td>
<td style="text-align: center;">
<p>Same image after making nodata transparent</p>
</td>
</tr>
</tbody>
</table>
<p style="text-align: left;">What&rsquo;s all the racket?</p>
<p style="text-align: left;">These little artifacts along the edge of the data look like they should be considered nodata, yet they show up despite the nodata being set to transparent. It turns out these cells are slightly off-white, and were likely introduced during JPEG compression. This can happen with both 0 (black) and 255 (white) nodata values.</p>
<p>Luckily GDAL has a handy tool to deal with this:&nbsp;<a href="http://www.gdal.org/nearblack.html" target="_blank">nearblack</a>. Nearblack takes all the cell values that are nearly black, or in this case white, and converts them to true black or white.</p>
<p>Here is the code I used in the command line on the image above:</p>
<pre>nearblack -white -o output.jpg input.jpg</pre>
<p>Note: the <em>-white</em> flag is required because the artifacts in this particular image are white.</p>
<table border="0">
<tbody>
<tr>
<td><img src="/static/media/uploads/Screen%20Shot%202014-03-11%20at%203.54.16%20PM.png" alt="" width="300" height="202" /></td>
</tr>
<tr>
<td style="text-align: center;">and look at it now. It&rsquo;s a real beaut</td>
</tr>
</tbody>
</table>
<p style="text-align: center;"><strong><br /></strong></p>
<p>&nbsp;</p>Data Driven Pages: Making Multi-paged Map Books since 10.0 2014-03-06T14:17:51+00:00Kaela Perry/blog/author/kaela/https://www.sparkgeo.com/blog/data-driven-pages-making-multi-paged-map-books/<p>If you haven't discovered Data Driven Pages yet, boy do we have a treat for you.</p>
<h2><strong>Why are Data Driven Pages so awesome?</strong>&nbsp;</h2>
<p>Data Driven Pages allow you to easily create consistent-looking, multi-paged map books from one map file. It will definitely make your life easier if you have to break up an area into multiple maps for better viewing of data. For instance, 1:20k maps along a 500km corridor feature, such as a river or pipeline, or 1:10k maps covering a whole municipality.</p>
<ul>
<li><strong>Consistent appearance:</strong> Create one layout then generate pages based on an indexing layer</li>
<li><strong>Easy setup:</strong> create an index layer, set up the pages, do a little adjusting and there you have it</li>
<li><strong>Three indexing options: </strong>a regular grid over the extent of your area of interest, a strip grid for linear features and an irregular grid based on features in a feature class</li>
<li><strong>Dynamic text:</strong> auto-generates map elements such as scale bar, north arrow, page number and map name</li>
<li><strong>Rotating data frame:</strong> Using the angle attribute is useful for strip indexes because it rotates the data for a better fit in the data frame.</li>
</ul>
<h2><strong>Some tips for using Data Driven Pages</strong></h2>
<p>Once you have added and symbolized all your data to the map, it is time to decide how you want to define your pages. Creating a <a href="http://resources.arcgis.com/en/help/main/10.2/index.html#//00700000000q000000" target="_blank">Grid index</a> using a feature class with just one polygon, such as the province of British Columbia, will create a regular grid over the extent of that polygon. If you choose a feature class with multiple polygons it will create an index for each feature, such as each province in Canada or each sample plot in a study area. With a linear feature, such as a river or highway, you can create a <a href="http://resources.arcgis.com/en/help/main/10.2/index.html#//00700000000r000000" target="_blank">strip index</a> to generate a grid along the linear feature. You can also edit the index layer you created if it needs some tweaking. For instance, if there is a map page that only contains a small portion of the area of interest it can be shifted to overlap more with another index polygon.</p>
<p>When setting up your data driven pages you can set the scale for each map page to remain the same, or you may want the scale to change depending on the size of the feature indexed. Some other great things to use when setting up your layout are dynamic map elements and a <a href="http://resources.arcgis.com/en/help/main/10.1/index.html#//00s900000028000000" target="_blank">locator map</a>. Elements such as the north arrow, page number, title, scale, and extent indicators will update depending on the map page. The locator map is great for viewing where the data on a particular map page is located in relation to the rest of the data.</p>
<p>For more details, check out Esri&rsquo;s resources on <a href="http://resources.arcgis.com/en/help/main/10.2/index.html#/What_are_Data_Driven_Pages/00s90000003m000000/" target="_blank">data driven pages</a>.</p>
<p><strong>Do you have some tips on using data driven pages? We would love to hear them. Please, leave a comment below.</strong></p>Canadian Open Data Summit 20142014-02-20T23:35:31+00:00Will Cadell/blog/author/will/https://www.sparkgeo.com/blog/canadian-open-data-summit-2014/<p>I'm heading to beautiful Vancouver this evening to attend the Canadian Open Data Summit tomorrow! I'm very excited to be able to give a lightning talk about open data and how we can make it even better!</p>
<blockquote>
<h3>Disparate Data, Technology Fiefdoms (and 65 pictures of your cat)</h3>
<p>Up in the frozen wastes of the North, well beyond the wall (certainly beyond hope) we organized a hackathon based on the ideas of open data and civic applications.</p>
<p>Our hardy hackathoners pulled together a number of excellent ideas but met with a constant and obtrusive barrier: that open data may be open, but without some level of standardization it's not actually very useful.</p>
<p>Now, no one said that data had to be 'useful', and perhaps if we want the technology utopia of real open data interoperability we will need to "build it" ourselves, but it is worth noting that talking the same language as our neighbors is generally awesome. Indeed, perhaps rather than swearing fealty to our technology overlords and just pressing the "publish document to open data platform" button, we could think about the commonwealth of data. The value of any data increases wildly with density and open data should be more valuable!</p>
<p>The cats? Well, you'll have to tune in for that bit.</p>
</blockquote>
<p>You can find out more here: <a href="http://opendatasummit.ca/index/topics#cadell">http://opendatasummit.ca</a>.&nbsp;If you're going to be there, track me down and say hello!</p>
Encoding GeoJSON Geometry2014-02-19T02:23:23+00:00Dustin Sampson/blog/author/dustin/https://www.sparkgeo.com/blog/encoding-geojson-geometry/<p dir="ltr"><a href="http://en.wikipedia.org/wiki/GeoJSON" target="_blank">GeoJSON</a><span> is a great format, easy to read, view, and use, but one thing that really stands out is the verbosity of its numbers and the effect on file size. &nbsp;Yeah, in rare cases that precision &ldquo;may&rdquo; be needed, but I&rsquo;m pretty sure a length that is precise to, well, less than a millimeter (example: stream length: 6849.41<span style="color: red;"><strong>9</strong></span>80435 meters) is never needed. &nbsp;And the other pink elephant in GeoJSON is the number of significant figures used for geometry!</span></p>
<pre dir="ltr">{
"type": "FeatureCollection",&nbsp;
"features": [{
"geometry": {
"type": "Polygon",
"coordinates": <span style="color: #ff69b4;">[[[-120.37273534694833, 50.67716016936108], [-120.37080246306458, 50.67707673658443], [-120.36562460360317, 50.675414824845106]...</span>
}</pre>
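<p dir="ltr">To put a rough number on that verbosity, here is a small self-contained sketch (my own illustration, not part of this post's tooling; the ring below is made up) that rounds coordinates to five decimal places, roughly a metre at the equator, and compares the serialized sizes:</p>

```python
import json

def round_coords(coords, ndigits=5):
    """Recursively round the nested coordinate arrays of a GeoJSON geometry."""
    if isinstance(coords, (int, float)):
        return round(coords, ndigits)
    return [round_coords(c, ndigits) for c in coords]

# A made-up ring with the full-precision coordinates a shapefile
# conversion typically emits.
ring = [
    [-120.37273534694833, 50.67716016936108],
    [-120.37080246306458, 50.67707673658443],
    [-120.36562460360317, 50.675414824845106],
    [-120.37273534694833, 50.67716016936108],
]

trimmed = round_coords(ring)
before = len(json.dumps(ring))
after = len(json.dumps(trimmed))
print(trimmed[0])  # -> [-120.37274, 50.67716]
```

Five decimal places is usually plenty for display maps, and the savings compound over every vertex in the file.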
<p dir="ltr">I&rsquo;m sure there&rsquo;s lots of techniques to decrease the size of a GeoJSON file for transport but one of the more popular is <a href="http://en.wikipedia.org/wiki/TopoJSON" target="_blank">TopoJSON</a>. &nbsp;I guess technically not a GeoJSON format but worth mentioning. &nbsp;It reduces file size by not replicating geometry(simplifying a lot here) similar to the old <a href="http://webhelp.esri.com/arcgisdesktop/9.3/index.cfm?TopicName=Coverage_topology" target="_blank">ESRI Coverage files</a>. &nbsp;But what I wanted to see is, what if we stuck with the GeoJSON format but ran the geometry through the <a href="https://developers.google.com/maps/documentation/utilities/polylinealgorithm" target="_blank">Encoded Polyline Algorithm</a> (EPA) and then throw the results up on a map to see how much faster it would be if any.</p>
<h2 dir="ltr">How To</h2>
<p dir="ltr">Firstly, I need to convert the shapefile to geojson and to do that I'll use the&nbsp;<a href="https://github.com/sparkgeo/shp2json" target="_blank">shp2json.py</a> script.</p>
<pre dir="ltr">pixel:data dustin$ shp2json.py -e Neighbourhoods.shp Neighbourhoods.egeojson&nbsp;</pre>
<p dir="ltr">The script does a couple of things: first, it transforms the coordinates to <a href="http://spatialreference.org/ref/epsg/4326/" target="_blank">WGS84</a>, and second, with the -e option, the EPA is applied to the geometry. &nbsp;The new geometry structure will look something like this...</p>
<pre dir="ltr">{
"type": "FeatureCollection",
"features": [{
"geometry": {
"type": "Polygon",
"coordinates": <span style="color: #41ad44;">["g{htHphu}UPaKjIk_...","}eitHrex}U@sGpDkDnGFfF...", ...] NOT SO PINK ANYMORE!!</span>
},
..
}</pre>
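<p dir="ltr">For reference, here is a minimal pure-Python sketch of the Encoded Polyline Algorithm that produces strings like the ones above. This is my own illustration, not the actual shp2json.py code, and it assumes the algorithm's default precision of five decimal places:</p>

```python
def _encode_value(value: int) -> str:
    """Encode one signed delta into the polyline character alphabet."""
    value = ~(value << 1) if value < 0 else value << 1  # sign into the low bit
    out = []
    while value >= 0x20:                  # emit 5-bit chunks, low bits first
        out.append(chr((0x20 | (value & 0x1F)) + 63))
        value >>= 5
    out.append(chr(value + 63))
    return "".join(out)

def encode_polyline(coords, precision=5):
    """Encode a sequence of (lat, lng) pairs as deltas, not absolutes."""
    factor = 10 ** precision
    prev_lat = prev_lng = 0
    chunks = []
    for lat, lng in coords:
        ilat, ilng = round(lat * factor), round(lng * factor)
        chunks.append(_encode_value(ilat - prev_lat))
        chunks.append(_encode_value(ilng - prev_lng))
        prev_lat, prev_lng = ilat, ilng
    return "".join(chunks)

def decode_polyline(encoded, precision=5):
    """Inverse of encode_polyline: recover the (lat, lng) pairs."""
    factor = 10 ** precision
    coords, index, lat, lng = [], 0, 0, 0
    while index < len(encoded):
        deltas = []
        for _ in range(2):                # one delta each for lat and lng
            result = shift = 0
            while True:
                b = ord(encoded[index]) - 63
                index += 1
                result |= (b & 0x1F) << shift
                shift += 5
                if b < 0x20:              # high bit clear: last chunk
                    break
            deltas.append(~(result >> 1) if result & 1 else result >> 1)
        lat += deltas[0]
        lng += deltas[1]
        coords.append((lat / factor, lng / factor))
    return coords

# The worked example from Google's algorithm documentation:
print(encode_polyline([(38.5, -120.2), (40.7, -120.95), (43.252, -126.453)]))
# -> _p~iF~ps|U_ulLnnqC_mqNvxq`@
```

Each coordinate is stored as a delta from the previous one, which is why the encoded form compresses so well: nearby vertices become tiny deltas that fit in one or two characters.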
<p dir="ltr">Well, now I have this GeoJSON file, let&rsquo;s see if it draws! &nbsp;We&rsquo;ll employ <a href="http://leafletjs.com/" target="_blank">leaflet</a> to do this since it&rsquo;s super easy to make a map and has a great plugin called <a href="https://github.com/jieter/Leaflet.encoded" target="_blank">Leaflet.encoded</a> that can read encoded geometry. &nbsp;I&rsquo;ve also added some extra code to read regular geojson geometry so I could generate some <a href="#metrics">simple metrics</a> to see how things perform. &nbsp;All the code is available on <a href="https://github.com/sparkgeo/encoding-geojson" target="_blank">github</a> if you want to take it for a spin!</p>
<p dir="ltr"><iframe src="http://snippets.sparkgeo.com/encoding-geojson" frameborder="0" scrolling="no" width="600" height="300"></iframe></p>
<h2 id="metrics" dir="ltr">Results</h2>
<p dir="ltr">Although the drawing times were slower in my simple tests, load times more than made up for the difference. &nbsp;Below are my test results using three different shapefiles. &nbsp;Would be interesting to see the comparison to TopoJSON ;).</p>
<div dir="ltr">
<table style="width: 600px;" border="1" cellspacing="2" cellpadding="2"><colgroup><col width="*" /><col width="*" /><col width="*" /><col width="*" /></colgroup>
<tbody>
<tr>
<td><strong>Name</strong></td>
<td>Neighbourhood.shp</td>
<td>Property.shp</td>
<td>Creek.shp</td>
</tr>
<tr>
<td><strong># of features</strong></td>
<td>26</td>
<td>31,147</td>
<td>143</td>
</tr>
<tr>
<td><strong>Shapefile (in kB) *</strong></td>
<td>806</td>
<td>63,409</td>
<td>595</td>
</tr>
<tr>
<td><strong><strong>GeoJSON GZipped (in kB)**</strong></strong></td>
<td>707.1</td>
<td>17,948</td>
<td>551</td>
</tr>
<tr style="background-color: #f5f5f5;">
<td><strong>GeoJSON (in kB)</strong></td>
<td>2,111</td>
<td>68,728</td>
<td>1,540</td>
</tr>
<tr style="background-color: #f5f5f5;">
<td><strong>Load (ms)</strong></td>
<td>3,467.5</td>
<td>16,116.1</td>
<td>1,690.8</td>
</tr>
<tr style="background-color: #f5f5f5;">
<td><strong>Draw (ms)</strong></td>
<td>187.2</td>
<td>10,367.7</td>
<td>189</td>
</tr>
<tr>
<td>
<p><strong>GeoJSON Enc. Geom.&nbsp;</strong><strong>GZipped (in kB)**</strong></p>
</td>
<td>56</td>
<td>3,221.3</td>
<td>70</td>
</tr>
<tr style="background-color: #f5f5f5;">
<td><strong>GeoJSON Enc. Geometry (in kB)</strong></td>
<td>110</td>
<td>23,750</td>
<td>124</td>
</tr>
<tr style="background-color: #f5f5f5;">
<td><strong>Load (ms)</strong></td>
<td>536.1</td>
<td>4,599</td>
<td>614.6</td>
</tr>
<tr style="background-color: #f5f5f5;">
<td><strong>Draw (ms)</strong></td>
<td>256.1</td>
<td>10,421</td>
<td>223.3</td>
</tr>
</tbody>
</table>
</div>
<p dir="ltr">* includes shp, dbf, shx, prj</p>
<p dir="ltr">** GZip command used: gzip -c Neighbourhood.geojson &gt; Neighbourhood.geojson.gz</p>
<h2 dir="ltr">Technologies Used&nbsp;</h2>
<p>Shp2json script uses:</p>
<ul>
<li><a href="https://github.com/Toblerity/Fiona" target="_blank">Fiona</a></li>
<li><a href="http://code.google.com/p/pyproj/" target="_blank">PyPROJ</a></li>
</ul>
<p>Map uses:</p>
<ul>
<li dir="ltr"><a href="http://leafletjs.com/" target="_blank">Leaflet</a></li>
<li dir="ltr"><a href="https://github.com/jieter/Leaflet.encoded" target="_blank">Leaflet.encoded</a></li>
<li dir="ltr"><a href="http://jquery.com/" target="_blank">JQuery</a></li>
</ul>Step Up Your Raster Handling Skills2014-02-07T00:04:32+00:00Kaela Perry/blog/author/kaela/https://www.sparkgeo.com/blog/step-up-your-raster-handling-skills-with-super-overlays/<p>It is not uncommon for raster data to be large and unruly. Creating tiles is one way to deal with these large datasets and speed up rendering. This post introduces a few handy tools for dealing with raster data while walking through the process of combining multiple raster files to create tiles for a super-overlay for a nice clean visual output that renders more efficiently (particularly in Google Earth). All from the command line.</p>
<p>For this example, I have a number of jpg images with a large amount of white area that was not recognized as nodata, which can be seen by adding the imagery over a satellite image. Furthermore, when creating a vrt, a black nodata bounding box is created around the extent of the combined images. You can see both in the image below. We want to hide both sets of nodata before we create the tiles for the super-overlay.</p>
<p><img alt="" height="290" src="/static/media/uploads/0255nodata.png" width="250"></p>
<p>What you’ll need:</p>
<ul>
<li><a class="external" href="http://www.gdal.org/" target="_blank">GDAL</a></li>
<li><a href="http://www.google.com/earth/" target="_blank">Google Earth</a></li>
<li>Your imagery</li>
</ul>
<h2><strong>Step 1: Create a vrt</strong></h2>
<pre>gdalbuildvrt -srcnodata 255 -vrtnodata 0 /AllImages.vrt /Images/*.jpg</pre>
<p>The vrt (virtual dataset) in our case is allowing us to reference a number of datasets to treat and view them as one. <em>-srcnodata 255</em> sets cells with the value of 255 (white) as the source’s nodata and <em>-vrtnodata 0</em> sets black, created by building the vrt, as nodata as well. <em>/AllImages.vrt</em> is your output. <em>/Images/*.jpg</em> is your input. The <em>*.jpg</em> makes sure all jpg files in the images folder are added to the vrt. Now all nodata values will be transparent. Hooray.</p>
<h2><strong>Step 2: Create those tiles!</strong></h2>
<pre>gdal2tiles.py -p geodetic -k Desktop/TestImages.vrt</pre>
<p>Tiling speeds up rendering by creating tiles at various resolutions and only displaying the tiles that are 'in view'. Lower resolution tiles are displayed when viewing data zoomed out and higher resolution tiles as you zoom in. <em>-p geodetic</em> sets the profile of the data, and <em>-k</em> forces the creation of the KML for the super-overlay. </p>
<h2><strong>Step 3: View your super-overlay in Google Earth</strong></h2>
<p>Open up the <em>doc.kml</em> file that was created and check out your sweet output in Google Earth. Google Earth drapes the imagery over the topography, making your raster images 3D. Pretty slick.</p>
<p><img alt="" height="198" src="/static/media/uploads/Screen%20Shot%202014-02-18%20at%208.51.27%20PM.png" width="500"></p>Just listen2013-10-04T05:20:58+00:00Will Cadell/blog/author/will/https://www.sparkgeo.com/blog/listening-is-more-important-than-talking/<center><img alt="" height="183" src="/static/media/uploads/download.jpeg" title="listening" width="275"></center>
<p> </p>
<p>In consulting you really have only one thing: the ability to <em><strong>identify</strong></em> problems. Sure, you may be a UI rockstar or have some mad statistics skills, and perhaps that's why you think your clients call you. Without being able to tease out the real problem, however, you will likely not satisfy your client's demands for answers to their technology, management or structural problem. Worse, you will not be able to further support their actual needs into the future. </p>
<p>Now, granted there are different kinds of consultants. There are those who have a packaged solution for your problem area. In this case the consultant is more of a salesperson and they will sculpt, shoe-horn or foist a prebuilt "solution" on their clients. In reality those people don't need to listen, except to echo the buzzwords they heard their prospective client say. On the other end of the spectrum, there are also those people who are only selling their ability to undertake a specific task, a bona fide hired gun. If you fall into this category then listening is what you do. That is because listening will help you understand the prospective client's problem, it will allow a useful dialog to start between the client's problem and your experience, that dialog will form the basis for a verbal solution and it is likely to help make your sale. As a consultant, listening defines your product, whatever it is. </p>
<p>Sparkgeo is a niche geospatial technology company. Before we talk about technology we have learned to listen to what our clients' problems really are. We don't build a sustainable relationship by quickly selling a bandaid product. However, we do build relationships if we provide real, considered advice based on actual observations discerned from a combination of empathic listening and our own real experiences. I know this works, because we have never not had repeat business from our clients. </p>
<pre>A wise old owl lived in an oak<br>The more he saw the less he spoke<br>The less he spoke the more he heard.<br>Why can't we all be like that wise old bird</pre>
<p> </p>
<p><em><strong>Empathy</strong></em> is an actual differentiator. Understanding the pain an analyst feels when confronted with 'bad' data, or that a manager feels when provided with a poorly specified (or just wrong) product provides a common plane of emotion. This common experience separates consulting sales people from those who genuinely want to help solve a problem. Empathy is the place where your expertise and your listening combine to become greater than the sum of their parts. </p>
<p>Empathy is not easily learned, but taking the time to actually listen to your client / boss / workmate / child is a huge step towards it.</p>
<p>Don't guess, or pre-empt, just listen.</p>
<p>It's that simple. </p>PGHACKS! Come build tech and save the world!2013-10-02T21:23:43+00:00Will Cadell/blog/author/will/https://www.sparkgeo.com/blog/pghacks-come-build-tech-and-save-the-world/<pre>&gt;_ wget www.startupprincegeorge.ca<br>&gt;_ cd /pghacks<br>&gt;_ ./configure<br>build a better PG, one object at a time<br>0 10 20 30 40 50 60 70 80 90 100 - done<br>&gt;_ make<br>&gt;_ sudo pghacks.civic_apps<br>RuntimeError:<br><strong>missing local variable: YOU!</strong> <br>PG Hackathon: 9am, 19th October @ the Regional District, 155 George St. <br>1 day, fuel provided.<br><strong>Subject: Civic Apps &amp; Open Data<br></strong>GOTO <a href="http://pghacks.eventbrite.com/">http://pghacks.eventbrite.com/<br></a>to sign up for this FREE tech event</pre>Win A Sparkgeo T-Shirt2013-08-02T05:03:19+00:00Will Cadell/blog/author/will/https://www.sparkgeo.com/blog/win-a-sparkgeo-t-shirt/<p>We are giving away a sweet Sparkgeo t-shirt!</p>
<p><img alt="sparkgeo picture" class="thumbnail" height="400" src="/static/media/uploads/awesome%20sparkgeo%20tshirt.JPG" width="234"></p>
<div>
<p>You can win it in one of four ways:</p>
<p>1) sign up for our newsletter in the footer of this page<br>- or -<br>2) like the <a href="https://www.facebook.com/pages/Sparkgeo/160163510695754" target="_blank" title="sparkgeo facebook page">sparkgeo facebook page</a> and share the <a href="https://www.facebook.com/permalink.php?story_fbid=612162608829173&amp;id=160163510695754" target="_blank" title="t-shirt competition post">t-shirt competition pos</a>t<br>- or -<br>3) follow the <a href="http://www.linkedin.com/company/sparkgeo" target="_blank" title="sparkgeo linkedin page">sparkgeo linkedin page</a> and share the most recent t-shirt update<br>- or -<br>4) tweet this blog post from the twitter link above (that's a pretty easy one...) </p>
<p>Do all 4 things and we'll give you 4 entries! We'll draw the winner by the end of August (2013)!</p>
<p>good luck guys!</p>
</div>Making the US Census Database Show Up On A Google Map 2013-07-10T00:50:13+00:00Will Cadell/blog/author/will/https://www.sparkgeo.com/blog/making-the-us-census-database-show-up-on-a-google-map/<p>I wrote up how to <a href="/blog/building-a-us-census-tracts-postgis-database/" title="building a us census database">build a US Census Tract database</a> earlier. However, we all know that data without a way of showing it on a map is a little sad. So I wrote up the process I used to make it all available on a Google Map. The process involves the use of PostGIS, Mapnik, Node.js and just a smidge of magic. It's a longer post, so I lumped it in with my tutorials. You can go check it out <a href="/labs/big/" title="putting a bunch of data on google maps">here</a>.</p>Dissecting a Web Mapping Application2013-06-07T18:34:23+00:00Will Cadell/blog/author/will/https://www.sparkgeo.com/blog/dissecting-a-web-mapping-application/<p>When talking with clients there is always a turning point in my mind where their request turns from being a "Google Map" to being a "web mapping application". It is usually where data is being filtered or updated, or when 'users' are mentioned. This is a pivotal point where the project gets more interesting, a lot more complex, and likely more expensive for the client. In this post I will take apart what a web mapping application actually looks like and outline some high level application considerations.</p>
<p>"I want a Google Map showing some field data points, that's easy right?" </p>
<p>"Sure"</p>
<p>"And I want my guys to be able to update them, you know like from a spreadsheet?"</p>
<p>"Oh, ok..."</p>
<p>That is the point where the "simple" Google Map turns into a mapping application. A web map is usually what you might call a "one shot deal": it contains a discrete dataset which is being displayed mostly all at once, with the exception of some client-side magic. An application starts to happen when the map is being asked to display more and disparate data to perhaps a qualified user base. Additionally there might be the opportunity to add data to the application, or edit it. Whilst the data is still one of the primary components of the application, in general terms there is "more" happening.</p>
<h2>Frameworks</h2>
<p>I use <a href="https://www.djangoproject.com/" target="_blank" title="django link">django</a> where possible, but I will not get into a religious war on the benefits of <a href="https://www.djangoproject.com/" target="_blank" title="django link">django</a> vs <a href="http://www.rubyonrails.org/‎" target="_blank" title="rails link">rails</a> vs <a href="http://ellislab.com/codeigniter" target="_blank" title="codeigniter">codeigniter</a> vs &lt;&lt;insert weird, esoteric framework of choice here&gt;&gt;. The point is, unless you want to write a really large amount of extra code to support users, authorization, forms, validation, administration, etc., using a framework makes good sense. Remember that your key value is likely that you can build a map, not that you can write an awesome server-side form validation technology. Using a robust, modern framework will ease your pain somewhat. If your app becomes terribly popular, it's likely that you might want to tinker with your framework to achieve a touch more performance, but frankly, that's not a problem until it's a problem. </p>
<p>I use django partly because <a href="http://www.geodjango.org" target="_blank" title="geodjango link">geoDjango</a> is baked in, but more so because python fits my rather simplistic brain better than rails. So, this is very much a personal choice. In your choice make sure that spatial objects are well looked after, ideally your framework will plug straight into PostGIS or another robust spatial data storage solution.</p>
<h2>The Story</h2>
<p>Whether this application is for field data collection of fire hydrants or for sharing the history of polar bear movement, there is a story. The map will be part of that story. Perhaps it is a major part, perhaps it's a minor character role; either way the map is a key piece of visual storytelling and needs to be intertwined into the rest of the site in a way that supports the site's motive. In terms of the design of the application, always consider the story and the motive of that story first. As a cartographer, web designer, web developer, or even backend devops engineer, your purpose is to sculpt a story for your users to understand.</p>
<h2>Map Choices </h2>
<p>To add data to your application, it is likely that a user will have to have an account. It is likely they will have to log in. This seemingly common act raises a problem. If only some people can log in (i.e. if it is an enterprise solution, or if a user pays for an account) and if that user needs to interact with a map whilst logged in, in a manner which is not available to a non-logged in user, then it is possible (likely) that you will contravene implicit licensing agreements. That means you as the developer need to pay for an enterprise license, deal with multiple mapping API/basemap solutions or use an open source mapping solution from the beginning. This is a tricky business because it's likely that your client asked for a "Google Map", not just a map, so they have a certain expectation around the cartography and the interactivity. A nice feature of the geoDjango admin is that it wires <a href="http://openlayers.org/" target="_blank" title="openlayers link">openlayers</a> and <a href="http://www.openstreetmap.org/" target="_blank" title="open street map link">openstreetmap</a> into the admin by default, thus alleviating much of this issue. </p>
<p>In terms of basemap cartography, we are now spoilt for choice. Though I would argue that Google Maps has the most robust and well documented set of functionality choices around, in terms of pure <em>cartography</em> the choices are very much up for discussion now.</p>
<p>Check out these examples from around the town of Mammoth Lakes in California: </p>
<p><a href="http://www.mapbox.com/blog/2012-08-28-mapbox-streets-landcover/" target="_blank" title="Mapbox terrain link">Mapbox Terrain</a></p>
<p><img alt="" height="471" src="/static/media/uploads/mapboxter.png" title="mapbox terrain" width="700"></p>
<p><a href="https://developers.google.com/maps/documentation/javascript/maptypes#BasicMapTypes" target="_blank" title="Google terrain maptypes link">Google Maps Terrain</a></p>
<p><img alt="" height="487" src="/static/media/uploads/googleterr.png" title="google terrain" width="700"></p>
<p><a href="http://maps.stamen.com/#terrain" target="_blank" title="staman terrain data">Stamen Terrain</a></p>
<p><img alt="" height="700" src="/static/media/uploads/stamenter.jpeg" title="stamen terrain" width="700"></p>
<p>There are some differences between each example, but it is obvious that the quality of online web mapping cartography has increased hugely, with the consumer being the biggest winner.</p>
<h2>Think mobile first</h2>
<p>The seemingly simple act of adding the concept of users to a web map can certainly blow out the complexity of any map. But the key idea to remember is that the value is being driven here by the map as a visualization platform. With that in mind, the user functionalities should be removed as far as possible from the UI. With the focus on the map, if we go back to the original example it's only a brief mental stretch to be thinking mobile. With all the benefits that geolocation and mobile data collection can provide, not suggesting to a client or employer that any web mapping application be built with location intrinsically considered in the workflow is verging on the irresponsible.</p>
<h2>Summary</h2>
<p>I hope this article has helped highlight some of my thoughts around web mapping application development as an extension to the idea of a simple web map.</p>
<p>The key points are:</p>
<ul>
<li>you should consider a web application framework (<a href="https://www.djangoproject.com/" target="_blank" title="django link">django</a>, <a href="http://www.rubyonrails.org/‎" target="_blank" title="rails link">rails</a> etc)</li>
<li>you should always remember your story throughout the entire UI </li>
<li>you have some decision making to do around mapping APIs (particularly in terms of protected parts of an enterprise site)</li>
<li>you should consider any application to be mobile ready </li>
</ul>
<p> </p>
<h3>At Sparkgeo, <a href="/services/geospatial" target="_blank" title="geospatial link">we do this stuff a lot</a>, maybe we can help you too?</h3>Building a US Census Tracts PostGIS Database2013-06-05T22:29:46+00:00Will Cadell/blog/author/will/https://www.sparkgeo.com/blog/building-a-us-census-tracts-postgis-database/<p><span>This article shows how you (yes, you!) can build your very own US Census Tracts PostGIS database. I had looked over the interwebs for some time under the false assumption that someone, somewhere would have already done this, but it seems not. So this article documents my process. I am running a mac, so some things could be done faster / smarter using other technology refinements, but I have tried to keep things pretty generic for 'nix users. Windows people: you will find this process is pretty simple, and I am sure you can replicate it using what you have available.</span></p>
<p>You will need:</p>
<ul>
<li>PostGIS running (locally in my case), that means you should look at the <a href="http://postgis.net/" target="_blank" title="postgis link">PostGIS requirements</a> for your machine.</li>
<li><a href="http://www.gdal.org/ogr2ogr.html" target="_blank" title="ogr2ogr link">ogr2ogr</a> available at the command line</li>
<li>familiarity with FTP</li>
</ul>
<p>And that is kinda it! This is actually quite simple, but there is a great deal of data involved, so consider that you are downloading a fair bit then processing a fair bit.</p>
<h2>Step 1 - get data</h2>
<p>The US census bureau has a great deal of data, so I will short cut your process a touch. Navigate to the directory of your choice in your terminal, then:</p>
<pre>ftp<br>&gt; open ftp.census.gov<br>user - anonymous (when asked)<br>pass - anonymous (when asked)<br>&gt; cd geo/tiger/TIGER2012/TRACT<br>&gt; ls (just to make sure)<br>&gt; mget *<br>mget tl_2012_01_tract.zip [anpqy?] a (you respond "a" here or you will have to answer this question 56 times)<br>&gt;quit</pre>
<p>That should get you the data you need, in the directory in which you reside. Now unzip it:</p>
<pre>unzip \*.zip</pre>
<h2>Step 2 - make data useful</h2>
<p>Once that is complete, you should have a directory full of shapefiles. I found I have 56 shapefiles; check yours too using:</p>
<pre>ls *.shp | wc -l</pre>
<p>but as the census bureau are the custodians I could not comment on whether you will get exactly the same number or not. Now, we want to load up a database. Go to postgres and set up a database. I called mine censustract2012; make sure it is built using your PostGIS template, perhaps like this:</p>
<pre>createdb -h localhost -T template_postgis censustract2012</pre>
<p>Then to start populating that database, you can try this:</p>
<pre>$ LIST="$(ls *.shp)"<br>$ for i in $LIST<br>&gt; do ogr2ogr -update -append -f PostgreSQL PG:"dbname=censustract2012 user=postgres password=postgres" $i -nlt MULTIPOLYGON25D -nln CENSUSTRACTS_EXAMPLE -progress<br>&gt; echo "done:" $i<br>&gt; done</pre>
<p>The first iteration will create the actual "CENSUSTRACTS_EXAMPLE" table for you. </p>
<p>That, ladies and gentlemen, should be just about it. You can get a visual by plugging in QGIS to the database, and it should look a little like this</p>
<p><img alt="US census tracts" height="312" src="/static/media/uploads/censustracts_pic.png" width="600"></p>
<p>Hooray, now you can analyze away! </p>
<p>This blog post is brought to you with my enormous appreciation of the <a href="http://www.census.gov" target="_blank" title="US Census Bureau">US Census Bureau</a>, the <a href="http://www.postgis.net" target="_blank" title="postgis link">PostGIS</a> wizards, and the <a href="http://www.gdal.org/ogr2ogr.html" target="_blank" title="ogr gdal link">OGR / GDAL</a> wizards</p>GIS - Human Translation Services2013-06-04T00:02:07+00:00Will Cadell/blog/author/will/https://www.sparkgeo.com/blog/gis-translation-services/<p>I remember in a previous job, I had to travel 2000kms on two flights to spend three nights in our Alberta head office to, in the end, act as a translator between a forester and a GIS resource. Although there was an ethnic difference, it was clear that the information gap was caused by the differences between those who could speak GIS and those who could not.</p>
<p>I was caused to remember this when developing Sparkgeo's new <a href="/services/gis" target="_blank">GIS Services</a> line. The real differentiator in any technology service offering is not in the service itself (really, who can't convert a shapefile?), but in how that service is delivered and understood.</p>
<p><em>It's not a technology gap, it's a <strong>communication</strong> gap.</em></p>
<p>Since 2010, I have been operating as a remote GIS / geoweb contractor, providing services from Northern British Columbia to US companies who need to better leverage location. I find myself using skype a great deal, confirming things by email and often being "conference called" into group meetings. None of these situations are "easy", but when used appropriately the communication and collaboration tools available now are very effective for project management and delivery. However, it's not just about the tools; there is a certain protocol to understand.</p>
<p>Here is a starter list of things to consider when running projects remotely:</p>
<p>1) Don't bug the client, unless it really is important.</p>
<p>2) If you are using Instant Messaging (IM), wait for a response before writing out a whole bunch of text on to their screen.</p>
<p>3) Confirm all actions suggested on voice/ video calls, by email; document everything.</p>
<p>4) Be available for follow-ups.</p>
<p>5) Ask a lot of questions; try asking the same question in different ways to confirm your understanding.</p>
<p>6) Be personable (this may seem obvious, but you would be surprised).</p>
<p>7) Don't assume a technical audience (and non-technical doesn't mean stupid; it means they are good at something else).</p>
<p>8) Be empathic, i.e. try to consider the client's situation.</p>
<p>9) Provide plenty of short email status reports, especially if communication is hard to schedule.</p>
<p>10) Don't be afraid to ask someone to repeat something.</p>
<p>There is nothing easy about being a remote resource, but there are enormous benefits. Being remote trains one to think in a more client focused manner. In fact, when it comes down to it, it is often easier to call up a GIS resource by Skype than it is to go and find them on your company campus.</p>
<p>Communication is absolutely key when it comes to GIS project management.</p>
<p>You must be fluent in both GIS and human!</p>
<p> </p>
<p>If you are interested in how Sparkgeo can provide a virtual GIS office for your organisation check out our <a href="/services/gis" target="_blank">GIS Services</a>.</p>Python Geospatial Development Published2013-06-03T23:26:30+00:00Will Cadell/blog/author/will/https://www.sparkgeo.com/blog/python-geospatial-development-published/<p>Will Cadell was recently a technical reviewer on the latest edition of Python Geospatial Development.</p>
<p>Go check it out on amazon <a href="http://www.amazon.com/dp/178216152X/ref=as_li_ss_til?tag=winwaed-20&amp;camp=213381&amp;creative=390973&amp;linkCode=as4&amp;creativeASIN=178216152X&amp;adid=0JXWHQ13WG42WRBEYBQ9&amp;&amp;ref-refURL=http://www.geowebguru.com/news/295-python-geospatial-development-second-edition-published" target="_blank">here</a></p>
<p><a href="http://www.amazon.com/dp/178216152X/ref=as_li_ss_til?tag=winwaed-20&amp;camp=213381&amp;creative=390973&amp;linkCode=as4&amp;creativeASIN=178216152X&amp;adid=0JXWHQ13WG42WRBEYBQ9&amp;&amp;ref-refURL=http://www.geowebguru.com/news/295-python-geospatial-development-second-edition-published" target="_blank"><img alt="" class="img_right" height="300" src="http://ecx.images-amazon.com/images/I/51nHVOHOvhL._BO2,204,203,200_PIsitb-sticker-arrow-click,TopRight,35,-76_AA300_SH20_OU01_.jpg" width="300"></a></p>
<p> </p>
<p>Here's the brief:</p>
<p> </p>
<p><em>Geospatial development links your data to places on the Earth’s surface. Writing geospatial programs involves tasks such as grouping data by location, storing and analyzing large amounts of spatial information, performing complex geospatial calculations, and drawing colorful interactive maps. In order to do this well, you’ll need appropriate tools and techniques, as well as a thorough understanding of geospatial concepts such as map projections, datums and coordinate systems.</em></p>
<p><em>Python Geospatial Development, Second Edition teaches you everything you need to know about writing geospatial applications using Python. No prior knowledge of geospatial concepts, tools or techniques is required. The book guides you through the process of installing and using various toolkits, obtaining geospatial data for use in your programs, and building complete and sophisticated geospatial applications in Python.</em></p>
<p><em>This book provides an overview of the major geospatial concepts, data sources and toolkits. It teaches you how to store and access spatial data using Python, how to perform a range of spatial calculations, and how to store spatial data in a database. Because maps are such an important aspect of geospatial programming, the book teaches you how to build your own "slippy map" interface within a web application, and finishes with the detailed construction of a geospatial data editor using Geodjango.</em></p>
<p><em>Whether you want to write quick utilities to solve spatial problems, or develop sophisticated web applications based around maps and geospatial data, this book includes everything you need to know.</em></p>
<p> </p>
<p>Awesome job by the author, Erik Westra!</p>ogr2ogr 3D Shapefile Import to PostGIS Tip 2013-06-03T22:24:05+00:00Will Cadell/blog/author/will/https://www.sparkgeo.com/blog/ogr2ogr-3d-shapefile-import-to-postgis-tip/<p>A very quick tip for those scratching their heads trying to import a 3D shapefile (one which doesn't actually have any 3D data, or at least none you care about) into a flat (2D) PostGIS data set, only to keep getting a geometry constraint error.</p>
<p>With a command like:</p>
<pre>ogr2ogr -update -append -f PostgreSQL PG:"dbname=postGISdatbase" my_shapefile.shp -nln testTable</pre>
<p>You might find you get an error like:</p>
<pre>Warning 1: Geometry to be inserted is of type 3D Polygon, whereas the layer geometry type is Multi Polygon. <br>
Insertion is likely to fail<br>
ERROR 1: INSERT command for new feature failed. ERROR: new row for relation "testtable" violates check constraint "enforce_geotype_wkb_geometry"<br>
Command: INSERT INTO "testTable" ("wkb_geometry" , "zip7", "lat", "long", "attribute") VALUES ('01030000A0AD100000010000002A000000CB5ADB86511155C0315A402C9BE34040000000000000000020095C8E511155 ... ...400000000000000000'::GEOMETRY, 3000210, 33.77697326990, -84.27247267360, 31) RETURNING "ogc_fid" <br>
ERROR 1: Terminating translation prematurely after failed translation of layer my_shapefile (use -skipfailures to skip errors)</pre>
<p>The important indicator in this error is the check constraint "enforce_geotype_wkb_geometry" line. This tells us that the PostGIS table constraint has been broken: there is a disconnect between the shapefile you want to import and the table you have, or want to have, in PostGIS. In the case above, it might seem like you could just add "MULTIPOLYGON" as the geometry type, but it's not that easy. Often a shapefile can end up with 3D geometry unintentionally. This can be frustrating if you really don't care that much about 3D features and the shapefile "just happened" to be created that way.</p>
<p>Luckily, this is easily fixed. If you define the target geometry type, like this:</p>
<pre>ogr2ogr -update -append -f PostgreSQL PG:"dbname=postGISdatbase" my_shapefile.shp -nln testTable -nlt MULTIPOLYGON25D</pre>
<p>and you might just be in luck :)</p>
<p>What's happening here? Well go check out <a href="http://www.gdal.org/ogr2ogr.html" target="_blank">the docs</a> and you can see that you can add "25D" to the name of a specific geometry to get a 2.5D version.</p>
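<p>If, on the other hand, you would rather flatten the data to true 2D than promote the table to 2.5D, newer GDAL builds also accept a <code>-dim 2</code> flag on ogr2ogr. Flattening just means dropping the Z value from every vertex, as in this pure-Python sketch (illustrative only; <code>drop_z</code> is a made-up helper, not part of GDAL):</p>

```python
# Illustrative only -- not ogr2ogr itself. This sketch shows what
# "flattening" a 3D ring to 2D means: the Z value is simply dropped.
def drop_z(ring):
    """Strip the Z value from a list of (x, y, z) vertices."""
    return [(x, y) for x, y, _ in ring]

ring_3d = [(0.0, 0.0, 5.0), (1.0, 0.0, 5.0), (1.0, 1.0, 5.0), (0.0, 0.0, 5.0)]
flat = drop_z(ring_3d)  # the same ring, minus its (unwanted) third dimension
```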
<p> </p>SF Geo Meetup Ignite Talk2013-05-22T15:54:24+00:00Will Cadell/blog/author/will/https://www.sparkgeo.com/blog/sf-geo-meetup-ignite-talk/<p>I recently presented at the San Francisco Geo Meetup. It was during the Google I/O conference and it was great to be able to hook up with a bunch of geo people to talk mapping and geospatial!</p>
<p>I presented on the idea of building a mapping application and how the map should be intertwined into the application's structure, rather than having a separate "launch map" link.</p>
<p><strong><em>"Launch Map" is my nemisis</em></strong></p>
<p>The slides are available here: </p>
<p><strong><a href="http://www.slideshare.net/willcadell9/ignite-21561593" target="_blank" title='Geomeetup ignite talk - Google I/O "Boxes are for Shoes"'>Geomeetup ignite talk - Google I/O "Boxes are for Shoes"</a> </strong> from <strong><a href="http://www.slideshare.net/willcadell9" target="_blank">Will Cadell</a></strong></p>
<p>Once again the Geomeetup was excellent, and I would highly recommend it to anyone in the area or travelling through. The meetup page is here: <a href="http://www.meetup.com/geomeetup/" target="_blank">http://www.meetup.com/geomeetup/</a></p>
<p>With Shapely it is possible to run PostGIS-like operations on non-PostGIS geometries. This can keep your python code looking smooth and consistent. In my experience it can also go rocket fast in certain applications. One of those applications is where you are using the same geometry over and over for an operation.</p>
<p>A simple use case here might be <strong>geofencing</strong>, where you want to use the same geofence (polygon) over and over for a point-in-polygon analysis. Yes, geofencing is a fancy term for seeing if a point resides within a polygon or not. </p>
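<p>Shapely hands the actual point-in-polygon test off to the GEOS library, but the underlying idea is easy to sketch in pure Python. This ray-casting version (an illustration of the concept, not Shapely's implementation) casts a ray to the right of the point and counts how many polygon edges it crosses; an odd count means the point is inside:</p>

```python
def point_in_polygon(x, y, polygon):
    """Ray-casting test: cast a ray to the right of (x, y) and count
    edge crossings -- an odd number of crossings means "inside"."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Does this edge straddle the ray's y value?
        if (y1 > y) != (y2 > y):
            # x coordinate where the edge crosses the ray's y value
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

fence = [(0.0, 0.0), (4.0, 0.0), (4.0, 4.0), (0.0, 4.0)]  # a square geofence
```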
<p>In this case you would want to change one geometry (the point) and see if it's inside or outside of a fixed geometry (the polygon). Shapely has an awesome way to speed up this process by using a <strong>prepared</strong> geometry.</p>
<p>Another use case is to see if one line is approximately following another, one vertex at a time. So, say you have a loop like this in Django:</p>
<pre>
# where c is a django line geometry
# and r is also a django line geometry
# buffer_distance is the buffer we will use to describe the polygon we will build
from django.contrib.gis.geos import Point

c_buffer = c.buffer(buffer_distance)
for coord in r:
    coord_point = Point(coord)
    if coord_point.within(c_buffer):
        print "sweet we are in the buffer"
    else:
        print "unlucky, outside the buffer"
</pre>
<p>The shapely version will look like this:</p>
<pre>
# import the shapely libraries (you will have had to pip install them already of course!)
from shapely.geometry import Point as ShapelyPoint
from shapely.geometry import LineString as ShapelyLinestring
from shapely.prepared import prep

# yup, we have to convert the linestring into a shapely linestring
c_linestring = ShapelyLinestring(c)
# now all we do is add in the "prep" operation
c_buffer = prep(c_linestring.buffer(buffer_distance))
for coord in r:
    # again, we must build a shapely geometry
    coord_point = ShapelyPoint(coord)
    # note: the predicate is called on the prepared geometry itself
    if c_buffer.contains(coord_point):
        print "sweet we are in the buffer"
    else:
        print "unlucky, outside the buffer"
    # the rest of the code is almost exactly the same!
</pre>
<p>As you can tell, these two code snippets are not vastly different. There is some overhead in using the Shapely library, but if you are going to be checking the contents of the same buffer 50 times or more, it really is worth consideration.</p>
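<p>Why does preparing help? <code>prep()</code> asks GEOS to build internal indexes on the geometry once, so every subsequent predicate call can skip work. A rough pure-Python analogy (only an analogy -- GEOS's prepared geometries are far more sophisticated than a bounding box, and <code>PreparedFence</code> is a made-up name) of paying a one-time cost to make repeated tests cheap:</p>

```python
class PreparedFence(object):
    """Analogy to a prepared geometry: compute the bounding box once,
    then use it as a cheap rejection test on every later query."""
    def __init__(self, polygon):
        xs = [x for x, _ in polygon]
        ys = [y for _, y in polygon]
        self.bounds = (min(xs), min(ys), max(xs), max(ys))

    def might_contain(self, x, y):
        """Cheap pre-filter: False means definitely outside;
        True means a full point-in-polygon test is still needed."""
        minx, miny, maxx, maxy = self.bounds
        return minx <= x <= maxx and miny <= y <= maxy

pf = PreparedFence([(0.0, 0.0), (4.0, 0.0), (4.0, 4.0), (0.0, 4.0)])
```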
<p>Yay for <a href="https://github.com/sgillies/shapely" target="_blank">Shapely</a>! </p>
<p>Find out more about <a href="http://toblerity.github.io/shapely/manual.html#prepared-geometry-operations">prepared geometries</a></p>Sparkgeo and Google Maps2013-03-18T23:15:35+00:00Will Cadell/blog/author/will/https://www.sparkgeo.com/blog/sparkgeo-and-google-maps/<p>At sparkgeo we build next generation web mapping applications. That means we need to consider the mapping application from conception to delivery, from database to user experience. Because we usually get to choose which mapping fabric we leverage, we will usually choose <a href="https://developers.google.com/maps/" target="_blank" title="Sparkgeo uses Google Maps">Google Maps</a>.</p>
<p>We use Google Maps for almost any public facing web mapping product. There are billions of reasons (users) to support this choice, and the central driver for us is <em>familiarity</em>. Our users get to use a product they already know how to use, which means that instead of having to figure out the newest set of web GIS tools or layer layout UI, it all just makes sense. They get to focus on what is being presented, rather than on what is presenting it.</p>
<p>At sparkgeo we went through the certified developer scheme (which is now deprecated) for both the Maps API and KML, and we are now listed on the Google Earth Outreach <a href="http://www.google.ca/intl/en/earth/outreach/resources/developers.html" target="_blank" title="Google Earth Outreach developer list">developers list</a>. We have used the maps API in a bunch of products, including:</p>
<p><a href="http://www.mammothtrails.org" target="_blank" title="Google Maps: mammothtrails.org">mammothtrails.org</a></p>
<p><a href="http://www.pgpothole.com" target="_blank" title="Google Map: PG Pothole">pgpothole.com</a></p>
<p><a href="http://www.placespeak.com" target="_blank" title="Google Map; Placespeak">placespeak.com</a></p>
<p>and of course <a href="/" target="_blank" title="Google Map: Sparkgeo">sparkgeo.com</a></p>
<p>It's important to remember that often we need to augment Google Maps with other geospatial or GIS technology. We might be using <a href="http://postgis.net/" target="_blank" title="Sparkgeo uses PostGIS">PostGIS</a> to manage data, a <a href="http://nodejs.org/" target="_blank" title="Sparkgeo uses Node.js">node.js</a> backend to build and serve map tiles, some extra <a href="http://dev.w3.org/geo/api/spec-source.html" target="_blank" title="Sparkgeo uses geolocation">HTML5 geolocation</a> support, or any number of other open, closed or mixed source technologies to build the web experience. However, when it comes to web cartography, Google has done an excellent job of describing the earth to the web.</p>
<p> <img alt="Google Maps Certified Developer" class="thumbnail" height="80" src="/static/media/uploads/cert_dev.gif" title="Google Maps Certified Developer" width="80"></p>GDAL &amp; OGR clipping (hack)2013-03-18T16:03:55+00:00Will Cadell/blog/author/will/https://www.sparkgeo.com/blog/gdal-clipping-hack/<p>Today I had to clip a raster image by 102,000 polygon features. With out doubt this task demanded automation. I decided to try my hand at python coding using the excellent <a href="http://www.gdal.org">GDAL</a> library. I had this great idea that i could do the whole thing nicely in GDAL, but i was foiled by time (or not enough of it) and I did half of the process in a pure GDAL script then I dropped to the cmd line to execute gdal_translate with a changing projwin per iteration. In case you are unaware with the gdal library comes a series of astonishingly useful utilities. I used to use the FWTools port but now I use the <a href="http://trac.osgeo.org/osgeo4w/">OSGeo4w</a> one instead. There are a number of raster processing wonder-tools in here, and gdal_translate is one of them! Back to my script, I realize now that in actual fact i am using almost exclusively OGR pieces of GDAL, confused yet?. Its simple: GDAL = rasters OGR = Vectors I start by importing:</p>
<pre>import osgeo.gdal as gdal
import osgeo.ogr as ogr
import os
</pre>
<p>Then I open up my database polygon layer and print off the extents and the number of features so I'm happy the layer is good:</p>
<pre>driver = ogr.GetDriverByName('MYSQL')
vec = "MYSQL:database,user=root,tables=my_tables"
vector_ds = driver.Open(vec, 0)
layer = vector_ds.GetLayer()
numFeatures = layer.GetFeatureCount()
print 'Feature count: ' + str(numFeatures)
extent = layer.GetExtent()
print 'Extent:', extent
print 'UL:', extent[0], extent[3]
print 'LR:', extent[1], extent[2]
</pre>
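<p>A note on ordering: OGR's <code>GetExtent()</code> returns <code>(minX, maxX, minY, maxY)</code>, which is why the upper-left corner is <code>extent[0], extent[3]</code> and the lower-right is <code>extent[1], extent[2]</code>. A tiny sketch (with a hypothetical helper name) that makes the mapping explicit:</p>

```python
def extent_to_corners(extent):
    """Map OGR's (minX, maxX, minY, maxY) extent tuple to the
    upper-left / lower-right corners expected by gdal_translate -projwin."""
    min_x, max_x, min_y, max_y = extent
    return (min_x, max_y), (max_x, min_y)

# e.g. a UTM-ish extent (hypothetical numbers)
ul, lr = extent_to_corners((500000.0, 510000.0, 5400000.0, 5410000.0))
```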
<p>Then I iterate through the features and do some nasty hackish string processing to get my coordinates working ok:</p>
<pre>feature = layer.GetNextFeature()
count = 0
while feature:
    feature_id = feature.GetFieldAsString('objectid')
    geometry = feature.GetGeometryRef()
    coords = str(geometry)
    coords1 = coords[10:-2].split(",")
    co_dic = {}
    count2 = 1
    ulx = 0
    uly = 0
    lrx = 0
    lry = 0
    for c in coords1:
        d = c.split(" ")
        co_dic = {count2: {'X': float(d[0]), "Y": float(d[1])}}
        #print co_dic
        if count2 == 1:
            ulx = co_dic[count2]["X"]
            uly = co_dic[count2]["Y"]
            lrx = co_dic[count2]["X"]
            lry = co_dic[count2]["Y"]
        if co_dic[count2]["X"] &lt; ulx:
            ulx = co_dic[count2]["X"]
        if co_dic[count2]["X"] &gt; lrx:
            lrx = co_dic[count2]["X"]
        if co_dic[count2]["Y"] &gt; uly:
            uly = co_dic[count2]["Y"]
        if co_dic[count2]["Y"] &lt; lry:
            lry = co_dic[count2]["Y"]
        count2 += 1
    print "*** Processing Parcel " + feature_id + " ***"
    print "*** ulx= " + str(ulx) + " :: uly= " + str(uly) + " :: lrx= " + str(lrx) + " :: lry= " + str(lry)
    cmd = "gdal_translate -projwin " + str(ulx) + " " + str(uly) + " " + str(lrx) + " " + str(lry) + " C:/image.tif " + feature_id + ".tif"
    os.popen(cmd)
    feature = layer.GetNextFeature()
    count = count + 1
</pre>
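<p>In hindsight there is a tidier route than string-parsing the WKT: OGR geometries have a <code>GetEnvelope()</code> method that returns <code>(minX, maxX, minY, maxY)</code> directly. And if you do have the vertices as plain tuples, the same bounding box falls out of a few lines of pure Python (a sketch; <code>bounding_box</code> is a made-up helper):</p>

```python
def bounding_box(vertices):
    """Return (ulx, uly, lrx, lry) for a list of (x, y) vertices --
    the same values the string-parsing loop above works out by hand."""
    xs = [x for x, _ in vertices]
    ys = [y for _, y in vertices]
    return min(xs), max(ys), max(xs), min(ys)

bbox = bounding_box([(1.0, 1.0), (3.0, 5.0), (2.0, 0.0)])
```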
<p>Did you catch that third last line... yeah, there's the hack, sorry folks. The lot of the consultant is: "you do your best when you have time, but when it runs out you just make it work". It's 3 hours later and I'm on feature 89561. I love Python!</p>PostGIS to PostGIS via ogr2ogr2013-03-16T01:52:56+00:00Will Cadell/blog/author/will/https://www.sparkgeo.com/blog/postgis-to-postgis-via-org2ogr/<p><a href="http://www.gdal.org/ogr2ogr.html">OGR2OGR</a> is a hugely useful GIS utility. Today I was using it to copy data between an Amazon Web Services (<a href="http://aws.amazon.com/">AWS</a>) based PostGIS database and a local PostGIS database. This is a slightly weird use case, but I wanted to pull some data down locally for editing via Quantum GIS (<a href="http://qgis.org/">QGIS</a>).</p>
<p>Here's the command I used:</p>
<pre>ogr2ogr -f PostgreSQL --config PG_USE_COPY YES PG:"dbname='localDb' host='localhost' port='5432' user='postgres' password='postgres'" PG:"dbname='remoteDb' host='ec2.xx.xxx.xxx.xxx.compute-1.amazonaws.com' port='5432' user='yourUser' password='yourPassword'" layer1 layer2</pre>
<p>This command will pull down 2 layers (layer1 and layer2) into your 'localDb' database. You should create the database first. Note also the " --config PG_USE_COPY YES " option, which in most cases makes things go WAY faster!</p>
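<p>If you find yourself scripting this pull regularly, building the invocation as an argument list (rather than one long shell string) sidesteps the quoting headaches the PG connection strings invite. A sketch, with hypothetical connection strings and helper name:</p>

```python
import subprocess  # stdlib; used when you actually run the command

def build_pull_command(local_dsn, remote_dsn, layers):
    """Assemble the ogr2ogr invocation as an argument list; run it
    with subprocess.check_call(cmd) when you are ready."""
    return (["ogr2ogr", "-f", "PostgreSQL",
             "--config", "PG_USE_COPY", "YES",
             "PG:" + local_dsn, "PG:" + remote_dsn] + list(layers))

cmd = build_pull_command("dbname='localDb' host='localhost'",
                         "dbname='remoteDb' host='ec2.example.amazonaws.com'",
                         ["layer1", "layer2"])
```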
<p> </p>Manage your mobile workforce - workshop2013-03-09T00:59:22+00:00Will Cadell/blog/author/will/https://www.sparkgeo.com/blog/manage-you-mobile-workforce-workshop/<p>Learn about how to manage your mobile workforce using Google Maps Coordinate.</p>
<p>Will from sparkgeo will be giving a workshop on this technology on March 19th.</p>
<div>
<p>Sign up <a href="http://tinyurl.com/ba97lol" target="_blank">here</a></p>
<p> </p>
<p>Here is some more info on the technology:</p>
<p><a href="http://www.youtube.com/watch?feature=player_embedded&amp;v=0fluKWYRC_w" target="_blank" title="Coordinate video"><img alt="coordinate video" height="300" src="/static/media/uploads/coordinate_vid.png" width="504"></a> </p>
</div>Its not about the bike2013-02-26T06:35:55+00:00Will Cadell/blog/author/will/https://www.sparkgeo.com/blog/its-not-about-the-bike/<p>This is a quick and hopefully, obvious post.</p>
<p>At Spark<strong>geo</strong>, we build apps that have a point. Before technology, before architecture, before design, there always needs to be a point.</p>
<p>In general, a GIS portal is an excuse, and a stand-alone web map is usually misinterpreted (except by GIS folks, who just want to criticize the spatial reference system, in lieu of their own favorite). A map, however, can be a very useful tool. So step 1 in any geoweb process is figuring out what the point of the exercise actually is.</p>
<p>&gt; What is the message?</p>
<p>&gt; Is that message bolstered by a map?</p>
<p>&gt; How is that map to be presented to best convey the message?</p>
<p>At no point there did we talk about mapping servers, or sources of technology, or databases, or mapping fabrics, or scale, or languages, or even formats. We just asked the "what" question.</p>
<p>Here's the first tip, and it's a goodie: <strong>"Figure out who the real client is, and ask them what they want"</strong> </p>
<p>Hint: It might not be the person who is paying you...</p>
<p>The first question of any mapping / web project must be asking what, not how. If, as a consultant, or employee, or any sort of resource, you are thinking "I could do this map because I have this technology", you have forgotten that you should be saying "I should do this map because it would be useful", or "I should do this map because it's the best way of representing this feature / event / story", or best of all, "I should do this map because it's important that this story is told".</p>
<p>Technology comes later; it's not about the bike.</p>
<p>(That said, if you want to stretch this cliche unreasonably, it could be said that there are special 'enhancements' we can apply to make your map go extra, extra fast)</p>
<p> </p>
<p> </p>Building the map of the future2013-02-26T06:16:01+00:00Will Cadell/blog/author/will/https://www.sparkgeo.com/blog/building-the-map-of-the-future/<p>Welcome to a series of posts on exactly how we at sparkgeo build maps. We will be discussing our process from start to finish: from our initial inspiration and design, to drawing out key processes, to what technology we use and how we use it. This will be interspersed with some observations, some caveats, likely some silly exceptions, and maybe a little local flavour.</p>
<p>Let's start here: <a href="http://www.sparkgeo.com/blog/its-not-about-the-bike/">it's not about the bike</a></p>Making Maps2012-11-12T16:31:14+00:00Will Cadell/blog/author/will/https://www.sparkgeo.com/blog/making-maps/<p>Maps are used every day. We use them for navigation; we use them for planning; we use them for visualizing and comparing places. We take them largely for granted. More so, we take their content for granted.</p>
<blockquote>
<p>"Well, of course it's right...!?"</p>
</blockquote>
<p>The thing is, the world is always changing, and often maps don't keep up. It takes work to keep a map fabric up to date. Take, for instance, the Google Maps (http://www.maps.google.com) product. This is arguably one of the most commonly experienced mapping formats. Google spend a great deal of time and money on their "Ground Truth" program, which actually sends people in vehicles out into the field to check that features really do exist; as well as the automated checks, there is a human checking too.</p>
<p>Another of Google's programs is <a href="http://www.google.com/mapmaker" target="_blank">Map Maker</a>, which allows community members to update the map fabric directly. This program has been around for a few years but is still not well known. To ensure quality, community members are asked to check each other's work, so 2 or 3 pairs of eyes see each change. A "MapUp" is an event where a number of mappers meet up, in a single place or virtually, to map as a group.</p>
<p>Prince George, to my knowledge, has not hosted one of these events before. <a href="/">Sparkgeo</a> is proposing to hold one at the Tourism Prince George office on December 6th. We are planning to focus this event on mapping some of the better known trails around the town, highlighting our recreational amenities: things like the greenway, the Otway ski centre, the LC Gunn trail, etc.</p>
<p>We would love <strong>YOU</strong> to <a href="http://pgmapup.eventbrite.com/" target="_blank" title="pgmapup sign up form">join us</a>!</p>StartUpDrinksPG2012-11-02T23:36:52+00:00Will Cadell/blog/author/will/https://www.sparkgeo.com/blog/startupdrinkspg/<p>I love #startupdrinkspg.</p>
<p><img alt="" class="thumbnail pull-right" height="200" src="/static/media/uploads/logo.png" width="239"><a href="http://about.me/shaunaharper">Shauna Harper</a> and I started startupdrinksPG, on a whim 6 months ago. Its a simple concept, a meetup concerned with technology and business with drinks. There is no sponsorship, there is no pretense and there is no adgenda. Noone is selling anything and its a great opportunity to throw out an idea and instead of a nay-sayers saying "Bah!" you are amongst people who might say "Neat!". Instead of people who look at the problem you find yourself amongst people who see the possibility.</p>
<p>Those are "entrepreneurs". Those dangerous and subversive characters. Agents of change. People willing to let an idea consume them. People willing to stake it all on a "maybe". That's a touch dramatic, but hey, you're allowed to be passionate about things when you have a long and exotic job title like "entrepreneur". Also you can come to startupdrinksPG. </p>
<p>So, come along. That idea you had, maybe, just maybe, you're on to something...</p>
<p><a href="http://startupdrinkspg.eventbrite.com/" target="_blank" title="startup drinks">sign up here </a></p>Change Makers Competition!2012-11-02T23:09:30+00:00Will Cadell/blog/author/will/https://www.sparkgeo.com/blog/change-makers-competition/<p>Landsongs, a sparkgeo product which helps communities capture and share their stories, has been shortlisted in in the <a href="http://www.bcideas.ca/" target="_blank" title="BCIdeas">www.BCIdeas.ca </a>Changemakers competition.</p>
<p> <img alt="" class="thumbnail pull-right" height="275" src="/static/media/uploads/landsongs.png" width="275"></p>
<p>Landsongs helps communities capture, store and share their stories. It does this by leveraging next generation web and mapping technology. Presently, Landsongs can be delivered on a community-by-community basis, but we want to build this out into a model that every interested community could easily access. </p>
<p>We have been chosen in the top 11 from 466 entries for the people's choice award!</p>
<p>We are honoured to be amongst some really excellent ideas! We hope you will take the time to vote for us and 2 other ideas as well.</p>
<p>You can vote at the <a href="http://www.bcideas.ca" target="_blank">BC Ideas site</a> or <a href="/" target="_blank" title="sparkgeo">here</a> at sparkgeo.</p>