A good developer I work with recently told me about some difficulty he had implementing a feature in code we had inherited; he said the problem was that the code was hard to follow. That prompted me to look deeper into the product, where I realised how difficult it was to trace the code path.

It used so many interfaces and abstraction layers that trying to understand where things began and ended was quite difficult. It got me thinking about past projects I had looked at (before I was so aware of clean-code principles), where I found it extremely difficult to get around, mainly because my code-navigation tools would always land me at an interface. It took a lot of extra effort to find the concrete implementation, or where something was wired up in some plugin-type architecture.

I know some developers flatly reject dependency injection containers for this very reason: they obscure the path through the software so much that code navigation becomes dramatically harder.

My question is: when a framework or pattern introduces this much indirection, is it worth it? Or is it a symptom of a poorly implemented pattern?

I guess a developer should look to the bigger picture of what that abstraction brings to the project to help them get through the frustration. Usually, though, it's difficult to make them see that big picture. I know I've failed to sell the need for IoC and DI with TDD; for those developers, those tools simply cramp code readability far too much.

6 Answers

Even though languages themselves don't necessarily cause or prevent this, I think there's something to the notion that it's related to languages (or at least language communities) to some degree. In particular, even though you can run into much the same problem in different languages, it often takes rather different forms in each.

For example, when you run into this in C++, chances are it's less a result of too much abstraction and more a result of too much cleverness. Perhaps the programmer has hidden the crucial transformation (the one you can't find) inside a special iterator, so what looks like simply copying data from one place to another really has a number of side effects that have nothing to do with the copying. Just to keep things interesting, this is interleaved with output created as a side effect of constructing a temporary object while casting one type to another.

By contrast, when you run into it in Java, you're much more likely to be seeing some variant of the well-known "enterprise hello world", where instead of a single trivial class that does something simple, you get an abstract base class and a concrete derived class that implements interface X and is created by a factory class in a DI framework, and so on. The 10 lines of code that do the real work are buried under 5000 lines of infrastructure.
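To make that concrete, here's a minimal sketch of the "enterprise hello world" shape described above. All class names are illustrative, not from any real framework; imagine the factory lookup being wired up in a DI container's XML or annotations instead of being one readable line:

```java
// "Enterprise hello world": layers of indirection around one line of real work.
// Every name here is invented for illustration.

interface Greeter {
    String greet(String name);
}

// An abstract base that exists only to defer one string to a subclass.
abstract class AbstractGreeter implements Greeter {
    protected abstract String greeting();

    @Override
    public String greet(String name) {
        return greeting() + ", " + name + "!";
    }
}

// The sole concrete implementation.
class HelloGreeter extends AbstractGreeter {
    @Override
    protected String greeting() {
        return "Hello";
    }
}

// In a DI framework this wiring lives in configuration; callers only
// ever see the Greeter interface, which is where "Go to definition" lands.
class GreeterFactory {
    static Greeter createGreeter() {
        return new HelloGreeter();
    }
}

public class EnterpriseHello {
    public static void main(String[] args) {
        Greeter g = GreeterFactory.createGreeter();
        System.out.println(g.greet("world")); // the one line of real work
    }
}
```

Navigating from `main` to the string `"Hello"` already takes three hops through the interface, the factory, and the abstract base; scale the same shape up and you get the yo-yoing the question describes.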

Some of it depends on the environment at least as much as the language -- working directly with windowing environments like X11 and MS Windows is notorious for turning a trivial "hello world" program into 300+ lines of nearly indecipherable garbage. Over time, we've developed various toolkits to insulate us from that as well -- but 1) those toolkits are quite non-trivial themselves, and 2) the end result is still not only bigger and more complex, but also usually less flexible than a text-mode equivalent (e.g., even though it's just printing out some text, redirecting it to a file is rarely possible/supported).

To answer (at least part of) the original question: at least when I've seen it, it was less a matter of a poor implementation of a pattern than of applying a pattern that was inappropriate to the task at hand. Most often, someone takes a pattern that might well be useful in a program that's unavoidably huge and complex and applies it to a smaller problem, which ends up making that problem huge and complex as well -- even though in this case the size and complexity really were avoidable.

I find that this is often caused by not taking a YAGNI approach. Routing everything through interfaces, even though there is only one concrete implementation and no current plans to introduce others, is a prime example of adding complexity You Ain't Gonna Need. It's probably heresy, but I feel the same way about a lot of uses of dependency injection.
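A small sketch of the contrast, with invented names: the speculative version routes a single implementation through an interface, so every "Go to definition" lands on the interface instead of the code; the YAGNI version stays concrete until a second implementation actually exists (extracting an interface later is a mechanical refactor):

```java
// Speculative version: an interface with exactly one implementation.
// Navigation tools land here, one hop away from the actual code.
interface OrderRepository {
    String save(String orderId);
}

class SqlOrderRepository implements OrderRepository {
    @Override
    public String save(String orderId) {
        return "saved " + orderId; // stand-in for real persistence
    }
}

// YAGNI version: one concrete class, zero extra hops.
class OrderStore {
    String save(String orderId) {
        return "saved " + orderId;
    }
}

public class YagniDemo {
    public static void main(String[] args) {
        OrderRepository repo = new SqlOrderRepository(); // via the interface
        OrderStore store = new OrderStore();             // direct
        System.out.println(repo.save("42"));
        System.out.println(store.save("42"));
    }
}
```

Both do the same work; the difference is purely how many indirections a reader must traverse to find it.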

+1 for mentioning YAGNI and abstractions with single reference points. The primary role of an abstraction is to factor out what is common to multiple things. If an abstraction is referenced from only one point, nothing common is being factored out; an abstraction like that just contributes to the yo-yo problem. I would extend this, because it is true for all kinds of abstractions: functions, generics, macros, whatever...
– Calmarius Apr 1 '14 at 9:26

I think that avoiding deep hierarchies, and good naming, are the most important points for the case you describe. If the abstractions were well named, you wouldn't have to go too deep, only down to the abstraction level where you need to understand what happens. Naming lets you identify where that level of abstraction is.

The problem arises in low-level code, when you really need to understand the whole process. There, encapsulation via clearly isolated modules is the only help.

Well, with not enough abstraction your code is hard to understand because you can't isolate which part does what. But that's encapsulation, not abstraction. You can isolate parts in concrete classes without much abstraction.
– Statement Mar 22 '11 at 23:17

Classes are not the only abstractions we use: functions, modules/libraries, services, etc. In your classes you usually abstract each piece of functionality behind a function/method, which can call other methods that each abstract another piece of functionality.
– Klaim Mar 22 '11 at 23:19

I don't see how this has anything to do with the language of choice. Abstractions are a high-level, language-independent concept.
– Ed S. Mar 22 '11 at 23:30

@Ed: Some abstractions are more simply realizable in some languages than in others.
– kevin cline Mar 23 '11 at 4:30

Yes, but that doesn't mean you can't write a perfectly maintainable and easily understood abstraction in those languages. My point was that your answer doesn't answer the question or help the OP in any way.
– Ed S. Mar 23 '11 at 17:25

A poor understanding of design patterns tends to be a major cause of this problem. One of the worst examples I've seen of this yo-yoing, bouncing from interface to interface without much concrete data in between, was an extension for Oracle's Grid Control.
It honestly looked like someone had had an abstract factory method and decorator pattern orgasm all over my Java code. And it left me feeling just as hollow and alone.