We're currently facing a choice between using an out-of-the-box object-relational mapper and rolling our own.

We have a legacy application (ASP.NET + SQL Server) where the data layer and business layer are unfortunately mashed together. The system isn't particularly complicated in terms of its data access: it reads data from a large group (35-40) of inter-related tables, manipulates it in memory, and saves it back to some other tables in a summary format. We now have the opportunity for some refactoring and are looking at candidate technologies to separate and properly structure our data access.

Whatever technology we decide on we would like to:

have POCO objects in our Domain Model which are Persistence Ignorant

have an abstraction layer that lets us unit test our domain model objects against a mocked-up underlying data source

There's obviously lots of stuff out there on this already in terms of Patterns & Frameworks etc.

Personally I'm pushing for using EF in conjunction with the ADO.NET Unit Testable Repository Generator / POCO Entity Generator. It satisfies all of our requirements, can be easily bundled inside a Repo/UnitOfWork Pattern and our DB Structure is reasonably mature (having already undergone a refactor) such that we won't be making daily changes to the model.
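To make the shape of what I'm proposing concrete, here's a minimal sketch (all names here are hypothetical, not from our actual codebase): a persistence-ignorant POCO plus a repository/unit-of-work abstraction that the domain layer depends on, so unit tests can substitute an in-memory fake for the EF-backed implementation.

```csharp
using System;
using System.Collections.Generic;
using System.Linq.Expressions;

// Plain POCO -- no EF base class, no persistence attributes.
public class OrderSummary
{
    public int Id { get; set; }
    public decimal Total { get; set; }
}

// The domain layer only ever sees these interfaces.
public interface IRepository<T> where T : class
{
    T Find(int id);
    void Add(T entity);
    IEnumerable<T> Where(Expression<Func<T, bool>> predicate);
}

public interface IUnitOfWork : IDisposable
{
    IRepository<OrderSummary> OrderSummaries { get; }
    void Commit();   // maps to SaveChanges() in the EF-backed implementation
}

// The EF-backed implementation wraps the generated context; a unit test
// instead hands the domain a fake IUnitOfWork backed by List<T>.
```

This is exactly the layering the Unit Testable Repository Generator scaffolds for you, which is part of why I don't see what a hand-rolled DAL would buy us.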

Some of the arguments being thrown about are "Well, at least we'll be in control of our own code" or "Oh, I've used L2S/EF in a previous project and it was nothing but headaches". (Although I've used both in production before and found any issues to be few and far between, and very manageable.)

So, do any of you uber-experienced devs/architects out there have any words of wisdom that might help me steer this project away from what looks to me like a complete disaster in the making? I can't help but think that any benefit gained by dodging EF issues will be lost just as quickly by attempting to re-invent the wheel.

There are some good (better, if you ask me, but I won't start that holy war... oh wait) alternatives to EF, where you would be able to "control the code", such as NHibernate.
– Brook, Jul 27 '11 at 15:48

@Brook How does that address the question of "Is writing your own Data Access / Data Mapping Layer a 'good' idea?"
– Micky Duncan, Apr 28 '14 at 7:16

8 Answers

Why not write your own language? Your own framework? Your own operating system? And to be sure you're in control of everything, it's also a good idea to create your own hardware.

"Oh I've used L2S/EF in a previous project and it was nothing but headaches"

EF is mature enough and was used successfully by plenty of people. It means that a person who claims that EF is nothing but a headache should probably start learning how to use EF properly.

So, do you have to write your own ORM?

I wouldn't suggest that. There are already professional-grade ORMs out there, which means you should write your own only if:

You have a precise context in which none of the existing ORMs fits for some reason, and

You are sure that the cost of creating and using your own ORM instead of learning an existing one is much lower. This includes the cost of future support, including by another team of developers who would have to read hundreds of pages of your technical documentation in order to understand how to work with your ORM.

Of course, nothing forbids you from writing your own ORM just out of curiosity, to learn things. But you shouldn't do it on a commercial project.

See also points 2 to 3 of my answer to the question Reinventing the wheel and NOT regretting it. You can see that the different reasons for reinventing the wheel don't apply here.

1. Being in control of business-critical parts of the system is often a good idea. Sometimes it may involve writing your own framework instead of using an established and popular library. 2. Sometimes popular tools are oversold or overhyped.
– quant_dev, Jul 27 '11 at 15:23

@quant_dev: if you're writing software that will control a nuclear plant or a spacecraft, you're right. But I have some doubts that that's the case here. BTW, I'm not sure we can call EF an "established and popular library".
– MainMa, Jul 27 '11 at 15:26

There is a well-known Open Source library for derivatives pricing called Quantlib. But every bank I know of writes their own pricing libraries, because this is core to their business. Doesn't have to be a spacecraft...
– quant_dev, Jul 27 '11 at 15:28

It's also easier to hire new employees who have experience with the standard framework you're using than to have to train every single person you hire on how to use a homegrown one.
– HLGEM, Jul 27 '11 at 17:10

"Why not write your own language? Your own framework? Your own operating system? And to be sure you're in control of everything, it's also a good idea to create your own hardware." That's what Apple said :-)
– Konamiman, Apr 29 '13 at 7:05

Use Entity Framework for all of the ordinary CRUD stuff (80 to 95 percent of the code), and use custom data access code for any remaining code that needs to be optimized. This should satisfy the concerns of those who are arguing that you don't have enough control.
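To illustrate the split (with made-up names -- the context, table, and query below are assumptions, not anyone's actual schema): ordinary CRUD goes through EF/LINQ, while the handful of measured hot paths drop down to hand-tuned ADO.NET.

```csharp
using System.Data.SqlClient;

// The 80-95%: ordinary CRUD through EF (WarehouseContext is hypothetical).
using (var db = new WarehouseContext())
{
    var order = db.Orders.Find(orderId);  // tracked entity, simple update
    order.Status = OrderStatus.Shipped;
    db.SaveChanges();
}

// The remaining few percent: a performance-critical query in raw ADO.NET,
// where you control the exact SQL that hits the server.
using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand(
    "SELECT Id, Total FROM dbo.OrderSummaries WHERE SummaryYear = @y", conn))
{
    cmd.Parameters.AddWithValue("@y", year);
    conn.Open();
    using (var reader = cmd.ExecuteReader())
    {
        // map rows by hand into POCOs
    }
}
```

The point is that "control" isn't all-or-nothing: you only pay the hand-rolling cost where profiling shows it actually buys you something.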

Especially since EF is the technology M$ is moving to, and LINQ makes dev life so much easier.
– Chad, Jul 27 '11 at 19:55

That's pretty much the road Stack Overflow went down, successfully, isn't it? LINQ to SQL for all the ordinary stuff, and custom code (which turned into the Dapper library) for the bits that needed it.
– Carson63000, Jul 28 '11 at 0:14

It's a good idea if and only if you can be (at least reasonably) certain that you'll save at least as much time/effort by writing your own code as it takes to generate that code in the first place. Depending on the situation, doing so may also impact your schedule, and you'll have to factor that in as well. You also have to factor long-term maintenance into the equation. If you roll your own ORM, you'll need to maintain it forever (or at least as long as you use it).

The bottom line is that it can make sense, under just the right conditions. These conditions generally include:

The project you're working on is quite large, so the cost is amortized over a large amount of use.

Your needs are quite unusual, so you gain a lot at each individual point of use.

At the risk of stating the obvious, these two are sort of inversely related -- if you're working on a truly gargantuan project, then even a small savings at each individual use can add up to enough to justify the development. Conversely, if your needs are really unusual, so you gain a lot at every use, the work may be justified on even a fairly small project.

At least based on your description, however, neither really applies in your case. Perhaps one (or even both) applied to your coworker's previous use of EF, or perhaps he's just prejudiced -- I don't think you've given us enough information to even guess which (though in fairness, I should add that I'd guess that's because he hasn't told you nearly enough to guess).

If I were in charge of this situation, I think I'd have you and your coworkers each write a kind of position paper -- a list of strengths and weaknesses you see with each approach, along with real evidence to support each assertion/claim/whatever. The entire team would then get to (i.e., would be required to) critique each paper. The primary point in their critique would be to rate each point on two scales:

the degree to which the point seems accurate, well supported by facts, etc., and

the degree to which the point appears to apply to the project at hand.

Then I'd evaluate the papers and the critiques to come up with a decision (though being entirely honest, it might be difficult to say how much I was using them to make the decision, and how much I was using them to justify a decision I'd already made -- I know I probably shouldn't admit that, but the reality is that I'm human too...)

Edit:

If I wanted to be particularly nasty (or "educational" -- hard to tell the difference in this case) I'd have each of you attempt to develop an estimate of the overall development effort both using an existing ORM/DAL, and developing your own. Though it's only a guess, my immediate guess is that most of the people with grand plans for rolling their own wouldn't bother to turn in their estimates -- by the time they came up with an estimate of overall effort that was even remotely finished (and realistic), they'd realize that there was no way they could even come close to competing with using existing code. At the same time, it's always possible somebody would come up with something sufficiently reasonable that it would be worth considering and possibly even using -- especially if (for example) they worked at some sort of enhancement/massaging of an existing tool/framework instead of rolling their own from the ground up.

Edit 2 (sort of at @CdMnky's suggestion): Although not explicitly mentioned, I assume an estimate will implicitly include an assessment of risk. In particular, an estimate of time should normally include PERT-style numbers:

The best-case estimate of the fastest it could be done if everything went right.

The "normal" estimate -- how long you expect it to take.

The worst case estimate of the longest it could take if everything went wrong.

Of course, the best- and worst-case estimates shouldn't normally include things like direct divine intervention, but should include just about anything short of that.
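For what it's worth, the classic PERT calculation combines those three numbers into a single weighted expected value (this is the standard textbook formula, not something specific to this project):

```csharp
// Standard PERT weighted estimate: E = (best + 4 * normal + worst) / 6
static double PertEstimate(double best, double normal, double worst)
    => (best + 4 * normal + worst) / 6;

// e.g. PertEstimate(2, 4, 12) gives 5.0 weeks: the pessimistic tail
// pulls the expected effort above the "normal" guess of 4.
```

Asking the roll-your-own camp to put a worst-case number next to their best-case one is usually where the estimate stops looking attractive.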

@Jerry Excellent point about the cost; in the end it's what management cares about. However, you need to include a measure of risk in your "educational" request. A failure to appreciate the risks involved in a project is normally what makes lowest-bid projects more expensive than other options.
– CdMnky, Jul 28 '11 at 11:23

It's usually not a good idea to reinvent the wheel, and doing so is usually an indication of technical ignorance ("I don't know anything else, and don't want to learn") or arrogance ("X is stupid, I can write my own tool in two weeks!"). There are mature tools that can do things a lot better than what some average developer can hack together in a few weeks.

That said however, there are times where you can't reliably fit a legacy application around an existing solution (no matter how flexible it is) without a ton of work; in a case like this it's not a "good" idea, but a better idea than the alternatives (i.e. leaving it as it is, or rewriting half of it to be able to use the third party layer) so you have little choice.

I was on a project that rejected the existing Java ORM tools to write their own, due to some "critical" feature that was not present in Hibernate. This was a multi-million dollar mistake, and the cost is increasing daily. They still do not have full support for object relationships. So in addition to all the time spent creating and maintaining the framework, they are spending hours writing code that could be written in minutes using Hibernate, or in many cases would not need to be written at all.

The existing frameworks are the product of tens of man-years of effort. It is absurdly naive to imagine that a single team could come anywhere close in a few man-weeks.

The only reason people are trying to 'reinvent the wheel' is because they are not familiar enough with the available wheels... which are often mature and well documented (ugh, I guess this is where the analogy breaks).

I'd say rolling your own ORM is a BAD idea - unless you have a very, very good reason to do so (BTW, the ones you cited aren't good enough :)

I once made the mistake of letting others convince me not to use an ORM, and I can tell you that writing the whole DAL ourselves really slowed us down: instead of implementing features that mattered to the business, we were spending time on the DAL (it's a lot more work than it looks).

Code you have to write is code you have to maintain. Time spent maintaining the ORM code is time spent not maintaining the app. Unless the ORM gives you some kind of strategic advantage, then it's not something that's going to generate money for the business, so it's pure overhead. Would your company rather spend money on overhead or enhancing a money-making product? If this is for an internal app then this may be a non-issue, but you're still increasing the amount of code that's going to need to be written, tested and documented.