Tuesday, 6 December 2011

A S.O.L.I.D data access layer

A while back I had reason to look at one of the assemblies in my “toolbox” which wrapped the System.Data.SqlClient assembly and, as often happens when you look at old code, you cringe and wonder what you were thinking when you wrote it. So I decided to refactor the code using S.O.L.I.D. principles to make it easier to understand, test, maintain and extend, and so ADO.NET DAL was born.

For those of you who follow me on Twitter, you’ll know I spent a long time trying to ensure that the library had 100% code coverage. Unfortunately I never made it, as one method still eludes me, but 99% coverage is pretty good.

A data access layer? Really?

In this age of ORMs, micro-ORMs and Simple.Data, are people still using ADO.NET directly? Are people writing data access layers which don’t return a fully populated POCO? I can safely say yes to both of those questions, having seen code with numerous native ADO.NET calls and various blog posts and tweets from people who don’t like or want to use an ORM.

For some people ORMs are seen as an anti-pattern which actually hinders development as a project becomes more complex. Other people may be dealing with legacy software that uses ADO.NET extensively; this use of native ADO.NET makes the code hard to test, and often the amount of effort required to refactor it is so large that it is just left as it is.

Why should I use it?

If you are working on legacy software which has ADO.NET code scattered throughout the codebase, this library should make it easy for you to refactor that code without taking a lot of time, allowing you to write tests for it and making the code a lot cleaner in the process.

If you don’t want to use an ORM then this library wraps the ADO.NET functionality in a nice package that you can just use, meaning you don’t have to worry about connection handling, transactions or the creation of parameters.

What does it do for me?

As mentioned above, the library is designed to deal with connection handling; all you need to do is provide the SQL, or the name of the stored procedure, you want to execute to the appropriate method and the library creates the command object, configures it and executes it for you.

The intention behind the code is to simplify the use of ADO.NET by wrapping the boilerplate code and exposing just the core functionality through a facade, meaning you can focus on what you want your code to do rather than how.

If you have native ADO.NET in your classes you can simply replace blocks of code with calls to the appropriate methods; this means you need to make only minimal changes to the overall structure of your code without worrying about side effects.

The library is test-friendly, making it easy to mock/stub/fake the interaction between your code and the database, and because the code can be tested it is also easy to inject into any classes that may need it using your favourite Inversion of Control container.
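To illustrate the testability point, here is a minimal sketch of a consumer taking the IDataAccess interface via constructor injection; the CustomerService class, the ExecuteNonQuery method name and the ParameterFactory.Create signature are my own assumptions for illustration, not the library’s documented API:

```csharp
// Hypothetical consumer: depends only on the IDataAccess interface,
// so tests can hand it a mock/stub/fake instead of a real database.
public class CustomerService
{
    private readonly IDataAccess dataAccess;

    public CustomerService(IDataAccess dataAccess)
    {
        this.dataAccess = dataAccess;
    }

    public void Archive(int customerId)
    {
        // Method and parameter names here are assumed for the sketch.
        this.dataAccess.ExecuteNonQuery(
            "ArchiveCustomer",
            this.dataAccess.ParameterFactory.Create("@CustomerId", customerId));
    }
}
```

In a unit test you would register a fake IDataAccess with your IoC container (or pass one in directly) and assert on the interaction, without ever touching a database.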

To further help with testing, the public API uses the System.Data.Common namespace, which is designed to work with various data providers and whose types can again be mocked/stubbed/faked.

Show me the code….

Here is an example of some pretty standard ADO.NET code that you may find in many code bases:
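(The original snippet appears to be missing from this post; the block below is a representative reconstruction of the kind of hand-rolled ADO.NET being described, with made-up stored procedure and class names.)

```csharp
using System.Data;
using System.Data.SqlClient;

public class CustomerRepository
{
    private readonly string connectionString;

    public CustomerRepository(string connectionString)
    {
        this.connectionString = connectionString;
    }

    public void UpdateCustomerName(int customerId, string name)
    {
        // Open a connection, build and configure a command, create the
        // parameters by hand, execute, and dispose of everything afterwards.
        using (var connection = new SqlConnection(this.connectionString))
        using (var command = connection.CreateCommand())
        {
            command.CommandType = CommandType.StoredProcedure;
            command.CommandText = "UpdateCustomerName";

            var idParameter = command.CreateParameter();
            idParameter.ParameterName = "@CustomerId";
            idParameter.Value = customerId;
            command.Parameters.Add(idParameter);

            var nameParameter = command.CreateParameter();
            nameParameter.ParameterName = "@Name";
            nameParameter.Value = name;
            command.Parameters.Add(nameParameter);

            connection.Open();
            command.ExecuteNonQuery();
        }
    }
}
```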

You can accomplish the same in a single line. OK, so I cheated a little and missed out the creation of the dataAccess object itself, but that is just:

IDataAccess dataAccess = new DataAccess(connectionString);

and if you are using dependency injection you’d get your IoC container to do this for you (you are using IoC, aren’t you?), which means you won’t have it scattered throughout the code base.
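For completeness, the single-line call might look something like this; the ExecuteNonQuery method name is an assumption based on the description in this post rather than the library’s actual API:

```csharp
// One call replaces the connection/command/dispose boilerplate above;
// the library handles connection opening and cleanup internally.
IDataAccess dataAccess = new DataAccess(connectionString);
dataAccess.ExecuteNonQuery("DELETE FROM Customers WHERE IsArchived = 1");
```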

If you look at the code you’ll see that the main IDataAccess interface has a ParameterFactory object that you use to create any parameters you may need, and because the method calls all take a param array of DbParameters you can simply add more and more in the method call, like this:
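A sketch of that usage follows; the Create method name and signature on the ParameterFactory are assumptions for illustration, though the param-array shape of the call is as described above:

```csharp
// Each additional parameter is just another argument in the param array;
// no SqlParameter construction or Parameters.Add calls are needed.
dataAccess.ExecuteNonQuery(
    "UpdateCustomerName",
    dataAccess.ParameterFactory.Create("@CustomerId", 42),
    dataAccess.ParameterFactory.Create("@Name", "Jane Smith"));
```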

The other feature you may want to be aware of is that each method has an overload that gives you access to the actual DbCommand that was executed, so if you need to get to the return value, return parameters or anything else on the command you simply provide an out parameter to populate:
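Something along these lines, where the out-parameter shape is as described above but the method name is again assumed:

```csharp
// The overload hands back the DbCommand that was executed, so output
// parameters and the return value can be read off it afterwards.
DbCommand executedCommand;
dataAccess.ExecuteNonQuery(
    "UpdateCustomerName",
    out executedCommand,
    dataAccess.ParameterFactory.Create("@CustomerId", 42));

// Inspect executedCommand.Parameters here for any return or output values.
```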

Future plans

At the moment the library only supports SQL Server, as that was the focus of the original library that I refactored. However, I’m hoping to extend the range of databases the library supports going forward, with the first additional database being SQL Server Compact.

This seemingly simple change will offer plenty of challenges, as I would like to keep the public API the same, meaning I have to work out which type of database you want under the covers; quite how I’m going to do that I haven’t decided yet.

Feedback please

I’d love to hear your thoughts on this. Is it something you like? Is it something you think you could use? Would I be wasting my time extending this further, given the alternatives that are out there?

Please leave me a comment or contact me about it and feel free to fork the code and have a play.