Friday, 22 April 2011

This allows any C# project (WPF, Silverlight, ASP.NET, WinForms, Class Library, etc.) to make use of external functionality dynamically at runtime without the need for a hard-coded reference. This is made possible through the use of interfaces and MEF with its Import and Export attributes, and only takes a few lines of code.

MEF is part of the .NET Framework 4.0 and is used by Visual Studio to provide its plug-in model.

Architecture

In our demo the host application, in this instance a console app, will load two plug-ins that provide a CalculationService. Both will implement a Calculate method accepting two integers and returning an integer. The first CalculationService plug-in will add the two numbers together and the second will multiply them.

Coding the Demo

First create a console application called MefSimpleDemo, then add three class libraries to the solution:

SharedContracts (for our interfaces)

CalculationService1 (plug-in)

CalculationService2 (plug-in).

The Console Application and SharedContracts projects will need to reference MEF and our interfaces, so right-click and add the two references for each project.

MEF lives in the System.ComponentModel.Composition assembly (System.ComponentModel.Composition.dll).

In our example the plug-ins will be loaded from a known directory location. In production you would load this location from a config file, but for this demo we’ll hardcode the location (the @”../plugins” folder) in the host and build the output of the plug-ins to the same folder. To alter the output, right-click each plug-in project and select Properties, click the “Build” tab and update the “Output Path”:

SharedContracts.DLL Class Library

Remove the default Class1.cs file, add a file called ICalculationService.cs and paste in the following code:
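The original listing isn’t reproduced here, but based on the description a minimal sketch of the interface might look like this:

```csharp
using System.ComponentModel.Composition;

namespace SharedContracts
{
    // InheritedExport means any class implementing this interface is
    // automatically exported to MEF without needing its own [Export] attribute.
    [InheritedExport]
    public interface ICalculationService
    {
        int Calculate(int a, int b);
    }
}
```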

NOTE: Notice the (MEF) InheritedExport attribute on the interface; this allows MEF to discover the implementing classes within the plug-ins.

CalculationService1.DLL Class Library

Remove the default Class1.cs file, add a file called Addition.cs and paste in the following code:
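A minimal sketch of the first plug-in, which simply adds the two numbers:

```csharp
using SharedContracts;

namespace CalculationService1
{
    // No MEF attribute needed here: [InheritedExport] on the interface
    // makes this class discoverable automatically.
    public class Addition : ICalculationService
    {
        public int Calculate(int a, int b)
        {
            return a + b;
        }
    }
}
```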

CalculationService2.DLL Class Library

Remove the default Class1.cs file, add a file called Multiply.cs and paste in the following code:
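And a sketch of the second plug-in, which multiplies the two numbers:

```csharp
using SharedContracts;

namespace CalculationService2
{
    // Also discovered automatically via [InheritedExport] on the interface.
    public class Multiply : ICalculationService
    {
        public int Calculate(int a, int b)
        {
            return a * b;
        }
    }
}
```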

MefSimpleDemo.EXE Console Application

Add a new file called PluginRepository.cs and paste in the following code:
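A sketch of what the repository class might look like, using MEF’s DirectoryCatalog to scan the plug-ins folder and ImportMany to collect every discovered ICalculationService:

```csharp
using System.Collections.Generic;
using System.ComponentModel.Composition;
using System.ComponentModel.Composition.Hosting;
using SharedContracts;

namespace MefSimpleDemo
{
    public class PluginRepository
    {
        // MEF fills this collection with every ICalculationService it discovers.
        [ImportMany]
        public IEnumerable<ICalculationService> CalculationServices { get; set; }

        public void LoadPlugins(string path)
        {
            // Scan the plug-ins folder for assemblies containing exports
            // and compose them into this instance.
            var catalog = new DirectoryCatalog(path);
            var container = new CompositionContainer(catalog);
            container.ComposeParts(this);
        }
    }
}
```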

Finally replace the text in the Program.cs file with the following code:
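A sketch of the host’s entry point, assuming the @”../plugins” folder described earlier (the sample inputs are illustrative):

```csharp
using System;

namespace MefSimpleDemo
{
    class Program
    {
        static void Main(string[] args)
        {
            var repository = new PluginRepository();
            repository.LoadPlugins(@"../plugins");

            // Each discovered plug-in is called with the same two numbers.
            foreach (var service in repository.CalculationServices)
            {
                Console.WriteLine("{0}: Calculate(3, 4) = {1}",
                    service.GetType().Name, service.Calculate(3, 4));
            }

            Console.ReadLine();
        }
    }
}
```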

Your solution should now look like this, notice I’ve removed all redundant references from each of the projects:

You are now ready to run the application, press F5 and check out your results:

With Microsoft now firmly pushing Entity Framework (currently at version 4) as their preferred data access solution, it’s refreshing to see a couple of new players on the scene albeit perhaps not in direct competition.

Dapper (Dynamic Query Mapper) and PetaPoco are Micro ORMs. An ORM (Object Relational Mapper) sits between your business logic and database to abstract away the direct link between the two. Micro ORMs are a simpler and more efficient option than larger ORM frameworks such as EF4 and NHibernate.

In this demo (full source code downloadable from the links at the end of this article) I’ll be giving examples of how to use each of the frameworks against a SQL Server database for your CRUD (create/insert, read/select, update, delete) commands. I’m using the tempdb database so there is no need to run any database scripts.

It’s important to note that for performance you still get a marginal benefit from creating a hand-coded DAL with SqlConnection, SqlCommand and all those SqlParameter objects, like this:
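A typical hand-coded read, sketched here against an assumed Customers table (the table, column names and Customer POCO are illustrative, not from the original listing):

```csharp
using System.Data;
using System.Data.SqlClient;

public class Customer
{
    public int Id { get; set; }
    public string FirstName { get; set; }
    public string LastName { get; set; }
}

public class CustomerDal
{
    private readonly string connectionString;

    public CustomerDal(string connectionString)
    {
        this.connectionString = connectionString;
    }

    // Hand-coded select: connection, command, parameters and mapping
    // are all wired up manually.
    public Customer GetCustomer(int id)
    {
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(
            "SELECT Id, FirstName, LastName FROM Customers WHERE Id = @Id",
            connection))
        {
            command.Parameters.Add("@Id", SqlDbType.Int).Value = id;
            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                if (!reader.Read()) return null;
                return new Customer
                {
                    Id = reader.GetInt32(0),
                    FirstName = reader.GetString(1),
                    LastName = reader.GetString(2)
                };
            }
        }
    }
}
```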

But hand-coding your entire DAL is pretty time consuming and hard to maintain, particularly when you’ve got commands with plenty of parameters. You could use T4 to generate your DAL based on your database schema, and that’s probably still your best bet if performance is everything to your application. Or, for a minimal perf overhead (most LOB apps wouldn’t notice), you could use one of our new Micro ORMs!

To carry out the same task as the hand coded DAL above, Dapper requires only two lines of code:
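A sketch of the Dapper equivalent, assuming the same Customer POCO and table as the hand-coded example; the open and the Query call are the two lines doing the work:

```csharp
using System.Linq;
using System.Data.SqlClient;
using Dapper; // adds Query<T> as an extension method on IDbConnection

public class CustomerDapperDal
{
    private readonly string connectionString;

    public CustomerDapperDal(string connectionString)
    {
        this.connectionString = connectionString;
    }

    public Customer GetCustomer(int id)
    {
        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();
            // Dapper maps the result columns onto Customer properties by name.
            return connection.Query<Customer>(
                "SELECT Id, FirstName, LastName FROM Customers WHERE Id = @Id",
                new { Id = id }).FirstOrDefault();
        }
    }
}
```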

Note: This is using the typed Query&lt;Customer&gt;; you can also use a dynamic version that omits the generic parameter to return a dynamic object. Dynamic objects were introduced in .NET Framework 4.

Note: This example is using the default execution options, you can also tweak the options to boost performance shown later in this blog.

You can see that from a RAD point of view Micro ORMs really come into their own compared to hand-coding your DAL. Since the start of .NET, plenty of frameworks have been created to generate your SQL commands and parameters (based on your DAL method attributes or parameters), but these Micro ORM frameworks are special.

Firstly, other than one file for each framework, no other code is required to pollute your solution. No attributes clogging up your DAL, DTOs or business layers. But now for the really special bit: both frameworks actually generate code dynamically using DynamicMethod and ILGenerator, which is pretty cool and inspiring stuff. The source code for both frameworks is included in the source code download for this blog post and I encourage you to take a look; kudos to the guys who put these frameworks together!

In terms of performance, the full Dapper project (link also at the end of the post) includes a test project that runs a simple select statement 500 times and compares performance against various data access solutions, the results of which can be seen here:

Note: Dapper is named as Mapper Query when using the typed/generic read method and Dynamic Mapper Query when using the non generic method.

2nd Note: These figures were produced on a travel netbook (I’m writing this blog post while stuck in Delhi airport for the night, being eaten alive by mosquitoes :) ). I would expect the figures to be much faster on a newer PC or laptop.

Exporting Xml using TSQL

We’ll be using xp_cmdshell and BCP to export the data, so you’ll need to configure your SQL Server instance to allow the command to run. You can do that by running the following script:
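The standard way to enable xp_cmdshell is via sp_configure (this requires sysadmin rights and has security implications, so only enable it where appropriate):

```sql
-- Enable advanced options so xp_cmdshell becomes configurable
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;

-- Enable xp_cmdshell itself
EXEC sp_configure 'xp_cmdshell', 1;
RECONFIGURE;
```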

To start with, I’ll show the data we plan to export in a standard denormalised format from a SQL query:

Which returns us the data in the following format (for now I’ll only show the top of the results):

To return the same data as nicely formatted, nested XML, we can run the following T-SQL statement:
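As a sketch (the table and column names here are assumptions, since the original listing isn’t shown), nested XML can be produced with FOR XML PATH and a correlated subquery:

```sql
-- Parent rows become <Customer> elements with attributes;
-- the subquery nests each customer's <Order> elements inside it.
SELECT c.CustomerId   AS '@Id',
       c.CustomerName AS '@Name',
       (SELECT o.OrderId   AS '@Id',
               o.OrderDate AS '@Date'
        FROM   Orders AS o
        WHERE  o.CustomerId = c.CustomerId
        FOR XML PATH('Order'), TYPE)
FROM   Customers AS c
FOR XML PATH('Customer'), ROOT('Customers');
```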

Which gives us our Xml representation (again, I’ll only show the top of the results):

To export our data in Xml format we can run the following script:
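A sketch of the export script, assuming the query above and the C:\data.xml output path mentioned below (-w requests Unicode output, -T a trusted connection):

```sql
-- Build the BCP command line and shell out to run it
DECLARE @cmd varchar(4000);
SET @cmd = 'bcp "SELECT CustomerId AS ''@Id'', CustomerName AS ''@Name'' ' +
           'FROM tempdb..Customers FOR XML PATH(''Customer''), ROOT(''Customers'')" ' +
           'queryout "C:\data.xml" -w -T -S ' + @@SERVERNAME;
EXEC master..xp_cmdshell @cmd;
```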

Notice in the BCP statement we are using the -w parameter to indicate we want the output in Unicode. In SQL Server 2008 R2 we can change this to use the native SQL format to boost performance, but more on that later.

Note: You’ll need to ensure your SQL Server instance is configured correctly, with the appropriate permissions set to run the xp_cmdshell and BCP statements and access the filesystem.

You can open our new C:\data.xml file in Notepad or IE and see we have a nicely formatted XML file:

Importing Xml using TSQL

To import the data back into SQL you can use the following statement:
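One way to read the file back in is OPENROWSET with SINGLE_BLOB, casting the result to the xml type (the target table here is an assumption):

```sql
-- Read the whole file as a single blob and cast it to xml
INSERT INTO CustomersXml (XmlData)
SELECT CAST(BulkColumn AS xml)
FROM OPENROWSET(BULK 'C:\data.xml', SINGLE_BLOB) AS import;
```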

Which results in:

SQL 2008 R2 Native Format Support

For improved performance in SQL 2008 R2 you can alter the scripts above to utilise Native format support.

For export, if you change the BCP statement in our export script above to use -N instead of -w:

You won’t be able to view the data in a text editor, as it’s now in SQL Server’s native encoding, but you’ll get much improved performance.

For import you can change the script to reference the data file type of ‘widenative’:
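For example, with BULK INSERT (the target table is an assumption):

```sql
-- 'widenative' matches data exported with BCP's -N flag
BULK INSERT CustomersXml
FROM 'C:\data.xml'
WITH (DATAFILETYPE = 'widenative');
```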

More information can be found about Using Native Format for Import and Export here:

Sunday, 17 April 2011

Learn how the Knockout library builds on advanced jQuery and JavaScript techniques to make even the most complex data-filled HTML forms a breeze. We’ll see jQuery, jQuery templating, JSON and live data binding applied to the MVVM pattern with Knockout, combined with ASP.NET to produce results that need to be seen to be believed.

Highly recommended, within 5 minutes you’ll understand the power of Knockout.JS:

Gibraltar monitors .NET applications to record errors, events, trace statements and performance metrics, then lets you effortlessly analyse the results.

There are three parts to Gibraltar:

The Agent is a DLL (Gibraltar.Agent.dll) referenced by your .NET application that records the errors, events and metrics of your app. It’s really simple to add to your project and can be configured automatically via a wizard, manually by editing an XML file, or programmatically via the Gibraltar API.

The Hub serves as a delivery mechanism for transporting the results from each Agent to the Analyst.

The Analyst is a GUI application that lets developers view and analyse results through drilldowns, filtering and charts.

You can see here that the Analyst shows you the source code that has generated the message.

I highly recommend the following video from Gibraltar Software that gives an excellent overview of Gibraltar. The video also gives an example of teaming the software up with PostSharp, the popular AOP (Aspect Oriented Programming) framework:

Using PostSharp with Gibraltar

To download trial versions of Gibraltar and PostSharp please visit the official websites: