Business Intelligence Developer (MCSE)

If you’ve been following the recent developments for SQL Server, you’ve probably noticed Microsoft is not only bringing more and more features to the platform but also increasingly encouraging developers to extend its base functionality through integration with other programming languages, including those outside the core .Net languages of C# and VB.Net.

In SQL Server 2016 it was all R and data science; now with SQL Server 2017 it’s Python, something geared more towards general data operations. According to Microsoft, the plan is to bring more languages onto the platform, and at a faster rate, which I think is a great idea.

But even before all this, as early as SQL Server 2005, we had Common Language Runtime (CLR) Integration. This earlier approach is completely different: with R and Python, the scripts are wrapped inside TSQL and executed alongside it, much like dynamic SQL. With CLR, you actually create a DLL (essentially a .Net application, just without the .exe), import it (plus any dependent DLLs) into SQL Server and run it via TSQL. In short: write the C#/VB code, compile it, import the DLL into SQL Server and execute it via TSQL. What this gives you is the power of .Net in your TSQL code. TSQL works great with sets but is basically useless outside a database; C#/VB.Net is general-purpose, object oriented and designed to do much, much more.
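To make the idea concrete, here is a minimal sketch of what a CLR stored procedure looks like in C#. The class and method names are just illustrative; the key parts are the SqlProcedure attribute and SqlContext.Pipe, which sends output back to the client much like PRINT does in TSQL.

```csharp
using Microsoft.SqlServer.Server;

public class StoredProcedures
{
    // The smallest possible CLR stored procedure: once registered,
    // "EXEC spHello" in SSMS sends this message back to the client.
    [SqlProcedure]
    public static void spHello()
    {
        SqlContext.Pipe.Send("Hello from .Net inside SQL Server");
    }
}
```

Compile this into a DLL and it's ready to be imported into SQL Server, which is exactly what the three steps below walk through.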

To give you a quick taste of how things work, I’ll use a simple example to illustrate the implementation steps, followed by something a bit more advanced to show what’s possible. As this is intended as a taster, we’re not going to cover all the important issues you’ll need to consider in production; there are many, and further reading is well advised.

The objects you can create from .Net (I’ll use C#) include those listed below. Once created, they behave exactly like their TSQL equivalents as far as usage is concerned.

CLR Stored Procedures (Procs)

CLR Triggers

CLR Functions

CLR Aggregates

For the simple demo I’ll create a Stored Proc that zips some files; even something this simple (and handy) only became possible as recently as .Net Framework 4. I’ll then create a CLR Function for the more advanced example: it calls a Tableau REST API. I’m using this API simply because I had the code ready to reuse from another project; you could just as easily call any other API. I’ll use something different/more interesting next time 🙂

Many Tools, Many Approaches

Firstly, there are many ways to do this, with different software and templates. Microsoft’s documentation, for example, doesn’t even use Visual Studio, just a text editor to create the DLL. I prefer the SQL Server Database Project template in Visual Studio as it has more features, though it requires a deeper knowledge of SQL integration than the other templates/approaches. More importantly, you can debug your C# code this way. Irrespective of the approach and tools, it’s the same three steps.

Step 1: Create the DLL (Assembly)

I’m using C# in Visual Studio to create the DLL. The two methods below correspond to the two procs we’ll be creating (hard coded here for the demo; in real life you would obviously use parameters and a try/catch block). Once the code is ready, build the project and ensure the DLL is created in the bin folder. The first method/proc zips all files in the specified directory; the second unzips the same files. Note the System.IO.Compression reference: that’s not a library supported by SQL Server out of the box, so we’ll also need to import that DLL later along with the one we’re building here.
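A sketch of the two methods might look like the following. The directory paths, class name and proc names are placeholders for the demo; ZipFile lives in the System.IO.Compression.FileSystem assembly, which is why the extra DLLs have to be imported in Step 2.

```csharp
using System.IO.Compression;
using Microsoft.SqlServer.Server;

public class StoredProcedures
{
    // Hard coded for the demo; in real life these would be parameters,
    // and the calls would sit inside a try/catch block.
    const string SourceDir = @"\\fileserver\demo\files";
    const string ZipPath   = @"\\fileserver\demo\files.zip";

    // Zips every file in the source directory into one archive.
    // ZipFile requires .Net Framework 4.5+ (System.IO.Compression.FileSystem).
    [SqlProcedure]
    public static void spZip()
    {
        ZipFile.CreateFromDirectory(SourceDir, ZipPath);
    }

    // Extracts the same archive back out to a folder.
    [SqlProcedure]
    public static void spUnzip()
    {
        ZipFile.ExtractToDirectory(ZipPath, SourceDir + @"\unzipped");
    }
}
```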

Step 2: Import/Register the DLL(s) via SQL

You then import the DLLs into SQL Server via SQL, either in SSMS (SSDT) as shown below or in Visual Studio. Visual Studio is probably quicker, but it’s unlikely to be how you deploy to Production, hence I prefer SSMS/SSDT for this step.

Things to note, with reference to the line numbers in the script below:

2: CLR is disabled by default; run this line to enable it. It only needs to be run once per server instance.

5: Only needed because we’re allowing the database to run unsafe assemblies for this example. It’s not as bad as it sounds: ‘unsafe’ simply means unsupported within SQL Server, and here we’re also granting the DLL extra privileges (to access network folders for zipping).

9 & 15: For this zip example we reference System.IO.Compression, so these two DLLs also need to be imported. They are actually part of .Net and created by Microsoft (which puts the ‘unsafe’ label in perspective).

20: This imports the clr_test.dll we created in Step 1.
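Put together, the registration script looks roughly like this. The database name and file paths are placeholders; the .Net framework path in particular will vary by machine and framework version.

```sql
-- Enable CLR at the instance level (once per server)
EXEC sp_configure 'clr enabled', 1;
RECONFIGURE;

-- Allow this database to host UNSAFE assemblies for the demo
ALTER DATABASE CLRDemo SET TRUSTWORTHY ON;

-- Import the two Microsoft DLLs our code references
CREATE ASSEMBLY [System.IO.Compression]
FROM 'C:\Windows\Microsoft.NET\Framework64\v4.0.30319\System.IO.Compression.dll'
WITH PERMISSION_SET = UNSAFE;

CREATE ASSEMBLY [System.IO.Compression.FileSystem]
FROM 'C:\Windows\Microsoft.NET\Framework64\v4.0.30319\System.IO.Compression.FileSystem.dll'
WITH PERMISSION_SET = UNSAFE;

-- Finally, import the DLL we built in Step 1
CREATE ASSEMBLY clr_test
FROM 'C:\CLRDemo\bin\clr_test.dll'
WITH PERMISSION_SET = UNSAFE;
```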

Once you’ve run the Create Assembly script above, you’ll be able to see the DLLs under the Assemblies folder in SSMS/SSDT, as seen below.

Step 3: Create the SQL Objects

With the DLL created and imported, the final thing to do is create the SQL objects from it. Here, we’re creating two stored procedures based on the DLL we built, using T-SQL.
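Assuming the class and method names from the earlier sketch, the T-SQL is a one-liner per proc. The three-part EXTERNAL NAME is assembly.class.method.

```sql
CREATE PROCEDURE dbo.spZip
AS EXTERNAL NAME clr_test.StoredProcedures.spZip;
GO

CREATE PROCEDURE dbo.spUnzip
AS EXTERNAL NAME clr_test.StoredProcedures.spUnzip;
GO
```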

Once you’ve created the stored procs, you’ll see the CLR Procs like any other stored procs in the Stored Procedure folder of SSMS/SSDT.

Step 4: You’re done; have fun with your new objects in SQL Server

With the CLR procs created, they’re ready for use. Execution is exactly the same as for your normal TSQL procs. If you run “Exec spZip” in SSMS now, it will simply zip all the files in the directory we hard coded earlier in C#.

Using Table Valued Function with Tableau REST API (HTTP)

I’ve included this example as it’s slightly more advanced; I won’t go into detail as the steps are the same. There are no input parameters on the function as they’re not needed for the demo (again, you’d parameterise it in real life), but notice the output is in the form of a table. This will be displayed as a grid in SSMS when the function is called.

As before, create/update the DLL in C#.
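A sketch of a CLR table-valued function is below. All names, the Tableau URL and the auth token are placeholders: the shape that matters is the SqlFunction attribute with a TableDefinition and a FillRowMethodName, an IEnumerable that yields one object per row, and a FillRow method that splits each object into output columns. (Listing sites is just one example call; any Tableau REST endpoint, or any other API, follows the same pattern.)

```csharp
using System;
using System.Collections;
using System.Data.SqlTypes;
using System.Net;
using System.Xml;
using Microsoft.SqlServer.Server;

public class TableauFunctions
{
    // Streaming table-valued function: this method returns the rows,
    // FillRow below maps each one onto the output columns.
    [SqlFunction(
        FillRowMethodName = "FillRow",
        TableDefinition = "SiteName nvarchar(200), SiteId nvarchar(100)")]
    public static IEnumerable fnTableauSites()
    {
        // Hard coded for the demo; parameterise in real life.
        var request = (HttpWebRequest)WebRequest.Create(
            "https://tableau.example.com/api/2.7/sites");
        request.Headers.Add("X-Tableau-Auth", "token-from-sign-in");

        // Tableau's REST API returns XML by default.
        var doc = new XmlDocument();
        using (var response = request.GetResponse())
            doc.Load(response.GetResponseStream());

        foreach (XmlNode site in doc.GetElementsByTagName("site"))
            yield return new[]
            {
                site.Attributes["name"].Value,
                site.Attributes["id"].Value
            };
    }

    // Maps one yielded item to the table's columns.
    public static void FillRow(object row,
        out SqlString siteName, out SqlString siteId)
    {
        var cols = (string[])row;
        siteName = cols[0];
        siteId = cols[1];
    }
}
```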

Import the DLL and create the function. Note how, unlike the first example, we didn’t have to import any other DLLs, as we’re only referencing supported libraries here.
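Again assuming the names from the sketch above, registering the function and calling it looks like this; the table definition in the CREATE FUNCTION must match the TableDefinition declared on the C# side.

```sql
CREATE FUNCTION dbo.fnTableauSites()
RETURNS TABLE (SiteName nvarchar(200), SiteId nvarchar(100))
AS EXTERNAL NAME clr_test.TableauFunctions.fnTableauSites;
GO

-- Tableau metadata, straight from T-SQL
SELECT * FROM dbo.fnTableauSites();
```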

This is the request to Tableau Server (hard coded for the demo). If you want to see what else is available from Tableau, visit its Online Help.

This is the response from Tableau Server for the call above.

And finally the output: this is what you get when you call the function in SSMS or any client app. Nothing terribly exciting, but we did manage to link and integrate two different systems in just three simple steps. Tableau Server metadata is now available to anyone via a simple SQL function in SQL Server. Tableau Server may not be your thing, but the same approach can be used for any API.

Debugging

And of course, debugging. Things are so much easier nowadays; I’m using Visual Studio 2017. Just remember to run Visual Studio as admin. Debugging is not something that’s available with the current R/SQL or Python/SQL integration.

Right click the server and allow SQL/CLR Debugging

Right click Object (Function) and Debug

Step through your SQL/C# code, line by line.

There you have it.

CLR is not new, but what’s different now, and why it deserves a second look, are the advances and other innovations outside SQL Server. We’ve only seen two examples here, but even without much effort I can think of a dozen or two other common tasks that can be automated and standardised for your typical BI/Analytics team. CLR was, after all, designed to replace xp_cmdshell and, as many have since noted, is an excellent facilitator for the ever increasing number and types of APIs.
