Nov 23, 2014

For as long as I can remember since C# became my language of choice, I've been yearning for the cleanest and most efficient way of getting data from a database (XML, SQL Server, etc.) into my C# application, whether that application was ASP.NET,
WinForms or a Windows Service. For the most part I was content with Typed DataSets back in the .NET 2.0 days: creating an XML file by hand with all of the different properties, running the command-line xsd tool on the
XML file and having it generate a C# class I could use in my WinForms application. This ran into problems later on down the road when Portable Class Libraries (PCLs) became available. PCLs eliminated code duplication, but to this day they lack Typed
DataSet support, not to mention that client/web service interactions have changed greatly since then, switching to a mostly JSON-over-REST infrastructure.

Ideally I wish there were a clean way to define a class inside a Portable Class Library (making it available to Android, iOS, Windows Store, Windows Phone and .NET 4.5+) and have Entity Framework map to those entities. Does this exist today?
To the best of my research it does not, at least not without a lot of work upfront (more on this later). So where does this leave us today?

For most projects you probably see a simple Entity Framework model mapped to SQL tables, views and possibly stored procedures, living inside the ASP.NET WebForms or MVC solution. While this might have been acceptable 4 or 5 years ago,
in the multi-platform world we live in today you can't assume you'll only ever have a web client. As I've stated numerous times over the years, investing a little more time in the initial development of a project to plan ahead
for multiple platforms is key today. Some of you might be saying, "Well, I know it'll only be a web project; the client said they'd never want a native mobile app when we asked them." Now ask yourself how many times a client came back
and asked for exactly what they said they never wanted when you or the PM asked. Planning ahead not only saves time later, but delivers a better product for your client (internal or external), bringing a better value-add to your
service.

Going back to the problem at hand: an Entity Framework model highly coupled to the rest of the ASP.NET application. Below are some possible solutions (not all of them, but in my opinion the most common):

1. Easy way out

Some projects I have worked on had the Entity Framework model (and associated code) in its own library, with the ASP.NET (WebForms, MVC, WebAPI or WCF service) project simply referencing that library. This is better in that if
you migrate away from the existing project, or want a completely different project to reference the same model (a Windows Service, perhaps), you don't have to invest the time in moving all of the code and updating all of the namespaces
in both the new library and the project(s) referencing it. However, you still have tight coupling between your project and the Entity Framework model.

2. POCO via Generators

Another possible solution is to use the POCO (Plain Old CLR Object) approach with Entity Framework. There are a number of generators (the Entity Framework Power Tools or the EntityFramework Reverse POCO Generator, for example), but
both leave Entity Framework dependencies in the clients that reference the POCO classes, negating the idea that you'd have one set of classes shared by both the clients of your platform and Entity Framework.

3. POCO with Reflection

Yet another possible solution is to create a custom attribute and, via reflection, map a class defined in your PCL. This approach has the cleanness of a POCO class decorated with custom attributes like the following:
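A minimal sketch of the idea, assuming hypothetical SQLTable and SQLColumn attributes (the attribute names and the ReflectionMapper helper below are illustrative, not an existing API):

```csharp
using System;
using System.Data;
using System.Reflection;

// Sketch only: the SQLTable/SQLColumn attribute names and the mapper
// below are assumptions for illustration, not an existing library.
[AttributeUsage(AttributeTargets.Class)]
public class SQLTableAttribute : Attribute
{
    public string TableName { get; set; }
}

[AttributeUsage(AttributeTargets.Property)]
public class SQLColumnAttribute : Attribute
{
    public string ColumnName { get; set; }
}

// The PCL-friendly POCO simply decorates its members:
[SQLTable(TableName = "Users")]
public class User
{
    [SQLColumn(ColumnName = "ID")]
    public int ID { get; set; }

    [SQLColumn(ColumnName = "Username")]
    public string Username { get; set; }

    [SQLColumn(ColumnName = "DateCreated")]
    public DateTime DateCreated { get; set; }
}

// The backend pays the price: each row is mapped via reflection.
public static class ReflectionMapper
{
    public static T MapRow<T>(IDataRecord record) where T : new()
    {
        var result = new T();

        foreach (PropertyInfo property in typeof(T).GetProperties())
        {
            var column = (SQLColumnAttribute)Attribute.GetCustomAttribute(
                property, typeof(SQLColumnAttribute));

            if (column == null)
            {
                continue; // skip properties without a column mapping
            }

            object value = record[column.ColumnName];
            property.SetValue(result, value == DBNull.Value ? null : value, null);
        }

        return result;
    }
}
```

The POCO itself carries no Entity Framework dependency, so it can live in the PCL; only the mapper needs the full framework.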

As one can see, this is extremely clean on the implementation side, albeit a bit tedious on the backend side. Imagining a recent project at work with hundreds of tables, this could be extremely daunting to maintain, let alone implement
in a sizeable existing project.

Unsatisfied with these options, I was curious how a traditional approach would compare performance-wise to Option 3 above, given that it satisfies the requirement of a single class residing in a PCL. For comparison, assume the table is
defined as such:
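The exact schema isn't critical to the comparison; a simple table along these lines will do (the names are placeholders):

```sql
-- Placeholder schema for the benchmark discussion; the real table's
-- name and columns are assumptions.
CREATE TABLE dbo.Users
(
    ID INT IDENTITY(1,1) PRIMARY KEY,
    Username VARCHAR(50) NOT NULL,
    DateCreated DATETIME NOT NULL DEFAULT GETDATE()
);
```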

Benchmark Results

Below are the results of running the test 3 times for each size data set. For those that are interested, I ran the benchmark on my AMD FX-8350 (8 cores at 4 GHz), VS 2013 Update 4 and SQL Server 2014, with the database installed on a Samsung
840 Pro SSD.

The results weren't too surprising to me, as I figured the "traditional" approach would be a factor or so slower than the DbContext approach, but I didn't think about it from the standpoint of larger data sets being considerably slower.
Granted, we're talking fractions of a second, but multiply that by hundreds of thousands (or millions) of concurrent connections and it becomes considerable.
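For context, the two query styles being compared boil down to something like the following sketch (the Users table, User class, context name and connection string are placeholders, not the actual benchmark code):

```csharp
using System.Collections.Generic;
using System.Data.Entity;      // Entity Framework 6.x NuGet package
using System.Data.SqlClient;
using System.Linq;

// Placeholder entity; assumed to map to a dbo.Users table.
public class User
{
    public int ID { get; set; }
    public string Username { get; set; }
}

// DbContext approach: Entity Framework handles the mapping.
public class BenchmarkContext : DbContext
{
    public DbSet<User> Users { get; set; }
}

public static class QueryStyles
{
    public static List<User> GetUsersEf(BenchmarkContext context)
    {
        return context.Users.ToList();
    }

    // "Traditional" approach: raw ADO.NET with manual row mapping.
    public static List<User> GetUsersAdoNet(string connectionString)
    {
        var users = new List<User>();

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(
            "SELECT ID, Username FROM dbo.Users", connection))
        {
            connection.Open();

            using (SqlDataReader reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    users.Add(new User
                    {
                        ID = reader.GetInt32(0),
                        Username = reader.GetString(1)
                    });
                }
            }
        }

        return users;
    }
}
```

The ADO.NET version avoids Entity Framework's change tracking and materialization overhead at the cost of hand-written mapping per query.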

Closing thoughts

Having spent a couple of hours deep-diving into the newer features of Entity Framework 6.x, hoping that the golden solution would exist today, I'm having to go back to an idea I had several months ago, jcENTITYFRAMEWORK, in which at compile time the associations would be created mapping the existing classes to the equivalent tables, views and stored procedures, in addition to utilizing the lower-level ADO.NET calls
instead of simply making Entity Framework calls. Where I left it off, I was still seeing an ASP.NET performance hit on smaller data sets (though on larger data sets my implementation was several factors better). More to come on that
project in the coming weeks/months, as database I/O with C# is definitely not going away for anyone and there is clearly a problem with the possible solutions today. At the very least, coming up with a clean and portable way to map
existing POCOs to SQL tables and stored procedures is a new priority for me.

For those interested in the ASP.NET MVC and PCL code used in benchmarking the two approaches, you can download it here. It's not a definitive test, but it's real-world
enough. If for some reason I missed a possible approach, please comment below; I am very eager to see a solution.