We're in the process of converting an old ASP.NET WebForms application to MVC. For this particular project, we are accessing an old database, one that doesn't really play well with Entity Framework or other ORMs, so we're using ADO.NET statements to get the data back in the form of DataTables.

DataTables, however, are really difficult to work with in MVC, because:

They're not strongly typed; the type of any given object in a DataRow's ItemArray is object.

We can't do validation on individual data pieces because of the no-type issue.

The tables can contain any number of rows.

This bothered me for quite a while, as I really couldn't mesh the philosophies of MVC and DataTables together in any meaningful way. The lack of strong-typing was the most egregious issue; I was reasonably sure that the value of a given column in a given row was of a particular type, but I couldn't use the benefits of that type.

Taking some inspiration from AutoMapper, we decided to create our own mapping engine that would take those DataTables and map them into collections of strongly-typed objects. I've taken those ideas and distilled them down into a sample project. This post details how we ended up creating this mapper class, and how well it worked (spoiler alert: pretty damn well).

The Problem

We had old procedures in our database that we still had to use for this rewrite. Those procedures were written in our organization's cowboy coding days, and as such had no real consistency in their naming and structure. Plus, there were a lot of them (100+). So, if we were going to build this mapping engine, we needed a generic solution that we could use for a lot of different types and source data.

To further complicate matters, the procedures' structure and naming weren't predictable at all. For example, we had two different procedures, each of which returned data representing a collection of users for this application, but whose result columns were named entirely differently. So we had to account for the possibility that two columns in different procedure calls would have different names but represent the same data.

Finally, we didn't want to have to explicitly map each property to a column. Implicit in this requirement is the ability to determine the destination type and parse the source data accordingly.

In short, we needed a mapping engine that:

Was generic and extensible

Could handle multiple column names

Could discover the type of the destination property and parse the source data

SourceNamesAttribute

The first part of the problem we had to solve was this: How does the code determine which columns in a DataTable map to which properties? We decided that an attribute was the best solution for this, and created the SourceNamesAttribute:
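A minimal sketch of such an attribute, assuming a `params` constructor and a `Names` list (the exact member names here are illustrative, not the original implementation):

```csharp
using System;
using System.Collections.Generic;

// Sketch of SourceNamesAttribute; member names are illustrative.
[AttributeUsage(AttributeTargets.Property)]
public class SourceNamesAttribute : Attribute
{
    // All column names that may contain this property's value,
    // in the order they should be tried.
    public List<string> Names { get; private set; }

    public SourceNamesAttribute(params string[] names)
    {
        Names = new List<string>(names);
    }
}
```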

The AttributeUsage attribute allows us to restrict our SourceNamesAttribute to only be used on properties of a class, not classes themselves.

SourceNamesAttribute stores a list of column names for each property, so it supports our requirement that the mapping be tolerant of multiple column names. Now we can decorate the properties of User with this attribute:
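For instance, a `User` model might be decorated like this (the property and column names below are invented for illustration):

```csharp
using System;

// Assumes the SourceNamesAttribute discussed above; the column names
// listed here are made-up examples.
public class User
{
    [SourceNames("UserID", "User_ID", "ID")]
    public int Id { get; set; }

    [SourceNames("UserName", "User_Name", "LoginName")]
    public string UserName { get; set; }

    [SourceNames("DateOfBirth", "DOB")]
    public DateTime DateOfBirth { get; set; }
}
```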

The first parameter is the type of the destination model. We use propertyName to get the property of the object via Reflection, and then look for a SourceNamesAttribute on that property; if such an attribute exists, we return the source names.
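That lookup might be sketched as follows, with `GetSourceNames` as a hypothetical name for the method described above:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Reflection;

public static partial class Mapper
{
    // Hypothetical name for the lookup described above: given the
    // destination model's Type and a property name, return the column
    // names declared by SourceNamesAttribute on that property.
    public static List<string> GetSourceNames(Type type, string propertyName)
    {
        PropertyInfo property = type.GetProperty(propertyName);
        if (property == null)
            return new List<string>();

        var attribute = property
            .GetCustomAttributes(typeof(SourceNamesAttribute), false)
            .OfType<SourceNamesAttribute>()
            .FirstOrDefault();

        // No attribute means no mappable column names.
        return attribute != null ? attribute.Names : new List<string>();
    }
}
```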

Now that we've shown we can get the PropertyInfo for a given property, we can tackle the other end of this problem: how to determine the property's type and parse accordingly.

The Mapper.Map() function takes four inputs: the Type of the destination property, a PropertyInfo instance, the source DataRow, and the destination object ("entity"). It grabs the source column names and cycles through them, assuming that the first one it finds in the DataRow is the one we want, then takes that value out of the row and attempts to parse it using ParsePrimitive().
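A sketch of that Map() step follows; GetSourceNames and ParsePrimitive are the helpers discussed in this post, and their exact signatures here are assumptions:

```csharp
using System;
using System.Data;
using System.Reflection;

public static partial class Mapper
{
    // Sketch of Map(): find the first matching column in the row,
    // parse its value, and set it on the entity.
    public static void Map(Type propertyType, PropertyInfo property,
                           DataRow row, object entity)
    {
        foreach (string name in GetSourceNames(entity.GetType(), property.Name))
        {
            // Assume the first column name that exists in the row
            // is the one we want.
            if (!row.Table.Columns.Contains(name))
                continue;

            object value = ParsePrimitive(propertyType, row[name]);
            property.SetValue(entity, value, null);
            break;
        }
    }
}
```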

ParsePrimitive() is only set up at the moment to parse the types we need for this example, so feel free to add your own parsing scenarios.
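A sketch of such a ParsePrimitive(), handling a handful of common types (the exact set of types and the signature are assumptions):

```csharp
using System;

public static partial class Mapper
{
    // Sketch of ParsePrimitive(): converts a raw column value into the
    // destination property's type for a few common cases; extend with
    // your own parsing scenarios as needed.
    public static object ParsePrimitive(Type type, object value)
    {
        // Treat database nulls as nulls.
        if (value == null || value == DBNull.Value)
            return null;

        if (type == typeof(int))      return Convert.ToInt32(value);
        if (type == typeof(decimal))  return Convert.ToDecimal(value);
        if (type == typeof(DateTime)) return Convert.ToDateTime(value);
        if (type == typeof(bool))     return Convert.ToBoolean(value);
        if (type == typeof(string))   return value.ToString();

        // Fall through: hand back the raw value unchanged.
        return value;
    }
}
```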