Whilst working on an Azure Data Lake project, a requirement hit the backlog that could be easily solved with a Geographical Information System (GIS) or even SQL Server, which has supported spatial data since SQL Server 2008. However, Azure Data Lake Analytics (ADLA) does not natively support spatial data analytics, so we'll have to extract the data into another service, right? Wrong! :) Due to the extensibility of Azure Data Lake Analytics, we can enhance it to do practically anything. In fact, we can lean on existing components and enhance the service without having to develop the enhancement itself. This blog is a quick run through demonstrating how to enhance ADLA so that it supports spatial analytics and meets our project requirement.

Problem

For simplicity I've trivialised the problem. Here's the requirement: indicate which Bus Stops are within 1.5 km of Southwark Tube Station. To support this requirement, we have two datasets:

A list of all the Bus Stops in London, including their geo location (circa 20k records)
The geo location record of Southwark Tube Station (a single record!)

In fact, the location of the tube station is pretty accurate and is geo located to the entrance pavement outside the tube station:

This would be an easy problem for a GIS to solve. You would specify the central point, i.e. our Southwark Tube Station marker, draw a circle (or buffer) with a radius of 1.5 km around it, and select all bus stops that fall within or intersect that circle. This spatial analysis is easy for these systems as it's essentially what they are built to do. SQL Server 2008 introduced the spatial data type; this allowed spatial-style analysis to be performed on geo data using T-SQL in conjunction with the supplied Geometry and Geography data types. More info on those can be found here.

So, how can we solve our problem in ADLA, without a GIS and without having to export the data to SQL Server?
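Before reaching for any particular tool, it's worth noting that the core of this requirement is just a great-circle distance check. As a rough illustration, here is a minimal Python sketch (the coordinates and the 1.5 km threshold come from the requirement above; the haversine formula is a standard approximation that treats the Earth as a sphere, which is fine at this scale):

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in metres

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))

# Southwark Tube Station entrance (latitude, longitude)
SOUTHWARK = (51.503829, -0.104777)

def is_close(stop_lat, stop_lon, radius_m=1500):
    """True if a bus stop falls within radius_m of Southwark Tube."""
    return haversine_m(SOUTHWARK[0], SOUTHWARK[1], stop_lat, stop_lon) <= radius_m
```

A GIS buffer-and-intersect query over point geometries reduces to exactly this kind of distance test against the buffer radius.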
Solution

You can register existing assemblies with ADLA. It so happens that the SQL Server Data Types and Spatial assemblies are nicely packaged up and can be used directly within ADLA itself - think about that, it's pretty awesome! Caveat: at the time of writing we have no idea of the licence implications; it will be up to you to ensure you are not in breach :) Those assemblies can be downloaded from here. You only need to download and install the following file:

ENU\x64\SQLSysClrTypes.msi

This installs two key assemblies, which you'll need to grab and upload to your Data Lake Store:

C:\Program Files (x86)\Microsoft SQL Server\130\SDK\Assemblies\Microsoft.SqlServer.Types.dll
C:\Windows\System32\SqlServerSpatial130.dll

Once they have been uploaded to your Data Lake Store, you need to register those assemblies with ADLA.

DECLARE @ASSEMBLY_PATH string = "/5.UTILITY/USQL-Extend/SQL-Server/";
DECLARE @TYPES_ASM string = @ASSEMBLY_PATH+"Microsoft.SqlServer.Types.dll";
DECLARE @SPATIAL_ASM string = @ASSEMBLY_PATH+"SqlServerSpatial130.dll";
CREATE DATABASE IF NOT EXISTS SQLServerExtensions;
USE DATABASE SQLServerExtensions;
DROP ASSEMBLY IF EXISTS SqlSpatial;
CREATE ASSEMBLY SqlSpatial
FROM @TYPES_ASM
WITH ADDITIONAL_FILES =
(
@SPATIAL_ASM
);
Following registration of the assemblies, we can see the registration loaded in the ADLA Catalog database we created:
We are now ready to use this U-SQL enhancement in our U-SQL Query - let's go right ahead and solve our problem in one U-SQL Script.
// Reference the assemblies we require in our script.
// System.Xml is available as a system assembly, so it didn't need registering; our SQLServerExtensions.SqlSpatial assembly did.
REFERENCE SYSTEM ASSEMBLY [System.Xml];
REFERENCE ASSEMBLY SQLServerExtensions.SqlSpatial;
// Once the appropriate assemblies are registered, we can alias them using the USING keyword.
USING Geometry = Microsoft.SqlServer.Types.SqlGeometry;
USING Geography = Microsoft.SqlServer.Types.SqlGeography;
USING SqlChars = System.Data.SqlTypes.SqlChars;
// First create the centralised point.
// In this case it's the pavement outside the entrance of Southwark Tube Station, London.
// Format is Longitude, Latitude and then SRID.
// NB: It's Longitude then Latitude, the opposite way round to what you might expect.
DECLARE @southwarkTube Geography = Geography.Point(-0.104777,51.503829,4326);
// Next we extract our entire London bus stop data set from the file.
// There's about 20k of them.
@busStopInput =
EXTRACT
[StopCode] string,
[StopName] string,
[Latitude] double?,
[Longitude] double?
FROM @"/1.RAW/OpenData/Transport/bus-stops-narrow-full-london.csv"
USING Extractors.Csv(skipFirstNRows:1,silent:true);
// This is effectively the transform step and where the magic happens
// Very similar syntax to what you would do in T-SQL.
// We are returning all the bus stops that fall within 1500m of Southwark Tube
// Essentially we return all stops that intersect with a 1500m buffer around the central tube point
@closeBusStops=
SELECT
*
FROM
@busStopInput
WHERE
@southwarkTube.STBuffer(1500).STIntersects(Geography.Point((double)[Longitude], (double)[Latitude], 4326)).ToString() == "True";
// The results are written out to a csv file.
OUTPUT
@closeBusStops TO "/4.LABORATORY/Desks/Sach/spatial-closebusstops.csv"
USING Outputters.Csv(outputHeader: true);
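For readers without an ADLA account to hand, the whole pipeline above (extract the CSV, silently skip malformed rows, filter by distance) can be sketched in plain Python. This is purely illustrative; the column layout matches the U-SQL script, the haversine distance stands in for the Geography buffer/intersect test, and the sample data is hypothetical:

```python
import csv
import io
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6_371_000 * asin(sqrt(a))

def close_bus_stops(csv_text, centre_lat, centre_lon, radius_m=1500):
    """Yield (StopCode, StopName, Latitude, Longitude) rows within radius_m.

    Rows with missing or unparseable coordinates are skipped, mirroring
    the silent:true behaviour of the U-SQL CSV extractor."""
    reader = csv.reader(io.StringIO(csv_text))
    next(reader)  # skipFirstNRows:1 - drop the header row
    for row in reader:
        try:
            stop_code, stop_name = row[0], row[1]
            lat, lon = float(row[2]), float(row[3])
        except (ValueError, IndexError):
            continue  # silently drop malformed rows
        if haversine_m(centre_lat, centre_lon, lat, lon) <= radius_m:
            yield stop_code, stop_name, lat, lon
```

The generator plays the role of the @closeBusStops rowset; writing it back out with csv.writer would complete the OUTPUT step.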
The query outputs a list of bus stops that are within the specified Spatial distance from Southwark Tube Station. If we have a look at all the bus stops (in red) and overlay all the 'close' bus stops (in green), we can see the results:
Pretty neat.
Azure Data Lake Analytics does not natively support spatial data analytics but by simply utilising the assemblies that ship with SQL Server, we can extend the capability of U-SQL to provide that functionality or practically any functionality we desire.

In this blog I’m going to show one of the advantages of linking Data Lake Analytics with Machine Learning. We’ll upload a series of images to the Data Lake, then run a U-SQL script that detects the objects in the images and writes the corresponding tags to a text file.

First of all, you need an instance of Data Lake Store and one of Data Lake Analytics. Once these are up and running, we need to enable Python/R/Cognitive in your Data Lake Analytics instance (here is a blog to help you out on this).

First things first, we need to put an image in our Data Lake Store; following Azure Data Lake best practices, I put the images in my laboratory subfolder. Once our images are in place we need to create a script: in your Data Lake Analytics instance, click on New Job. This will open a new blade with an empty script; let’s give our new job the name “ImageTagging”. In order to use image tagging we need to reference the relevant assemblies:

REFERENCE ASSEMBLY ImageCommon;
REFERENCE ASSEMBLY ImageTagging;
Next we need to extract information (location, filename etc.) on the image file(s) we want to analyse; in this case we’ll process all images in the specified folder.

@images =
EXTRACT
FileName string,
ImgData byte[]
FROM @"/Laboratory/Desks/CSbrescia/ImageTagging/{FileName:*}.jpg"
USING new Cognition.Vision.ImageExtractor();
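Conceptually, the extraction step just walks the folder and pairs each file name with its raw bytes. A hedged Python sketch of that idea (the folder layout and two-column shape mirror the EXTRACT above; the function name is my own):

```python
import glob
import os

def load_images(folder):
    """Return (FileName, ImgData) pairs for every .jpg in folder,
    roughly mirroring the rowset the U-SQL image extractor produces:
    the file name without its extension, plus the raw image bytes."""
    rows = []
    for path in sorted(glob.glob(os.path.join(folder, "*.jpg"))):
        name = os.path.splitext(os.path.basename(path))[0]
        with open(path, "rb") as f:
            rows.append((name, f.read()))
    return rows
```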
The following step is where the magic happens: the script analyses all the images located in the folder indicated above, detects the objects present in each image and creates tags. Each row of the resulting rowset holds:

Image name
Number of tagged objects detected
A string with all the tags

@TaggedObjects =
PROCESS @images
PRODUCE FileName,
NumObjects int,
Tags string
READONLY FileName
USING new Cognition.Vision.ImageTagger();
Now we can write our variable with all the tags to an output file:

OUTPUT @TaggedObjects
TO "/Laboratory/Desks/CSbrescia/ImageTagging/ImageTags.tsv"
USING Outputters.Tsv();
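For reference, the TSV serialisation itself is straightforward. A small Python sketch of what Outputters.Tsv() produces for the (FileName, NumObjects, Tags) rows above (the sample rows in the test are invented):

```python
import csv
import io

def write_tags_tsv(tagged_objects):
    """Serialise (FileName, NumObjects, Tags) rows to tab-separated text,
    analogous to writing the rowset out with Outputters.Tsv()."""
    buf = io.StringIO()
    writer = csv.writer(buf, delimiter="\t", lineterminator="\n")
    for row in tagged_objects:
        writer.writerow(row)
    return buf.getvalue()
```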
Here are the images I used in this example
And here is the list of objects detected
In conclusion, we have created a pretty handy tool for automatic image tagging using Data Lake, with very little knowledge required of the background processes involved.
Note that there seems to be an image size limit; I had to resize all images to about 500 KB.
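Given that size limit, it can be handy to flag oversized images before uploading them. A small sketch, assuming the ~500 KB figure observed above (the exact limit is not documented, so treat the threshold as approximate):

```python
import os

SIZE_LIMIT_BYTES = 500 * 1024  # ~500 KB, the rough limit observed above

def oversized_images(folder):
    """Return the .jpg files in folder that exceed the observed size
    limit and would need resizing before being uploaded for tagging."""
    too_big = []
    for name in sorted(os.listdir(folder)):
        path = os.path.join(folder, name)
        if name.lower().endswith(".jpg") and os.path.getsize(path) > SIZE_LIMIT_BYTES:
            too_big.append(name)
    return too_big
```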