What are the worst anti-patterns you have come across in your career as a programmer?

I'm mostly involved in Java, although the question is probably language-independent.

I think the worst is what I call the main anti-pattern: a program consisting of a single, extremely big class (sometimes accompanied by a pair of little classes) that contains all the logic. Typically there is one big loop containing all the business logic, sometimes tens of thousands of lines long.



Well at least you can safely delete it. Unlike someone I've worked with that would frequently rename procedures and leave the dead code accessible. Blech.
– Cyrena, Feb 4 '11 at 18:25

This is another case where using version control pays off. You can delete code without having to ask whether or not you may need it in the future because it can be retrieved again with a few keystrokes.
– Jason, Feb 4 '11 at 20:28

I think commented-out code has a place, if you prefix it with a comment that explains why it's there. Sometimes I will leave a commented block that represents a potential road I was thinking about taking, along with a note about why I didn't take it. @Jason, I definitely hear you about this being the role of version control, but deleted code in version control is not very discoverable.
– nlawalker, Feb 4 '11 at 20:42

I'll hit another obvious one with "copy-pasta": copying code that is nearly identical to what you want, then changing a few things instead of extracting it into a method.

This is especially prevalent in some functional and API test code from the late '90s: literally hundreds (or even thousands) of nearly identical test cases that could have been broken out into a few functions taking 3 or 4 parameters, or, even better, something data-driven. My first job out of college was literally 8 months of rewriting and refactoring thousands of lines of copy-pasta that used deprecated methods. By the time I was done, the test files were less than a tenth of their original size and far more maintainable (and readable!).
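The before/after can be sketched in Java (all names are hypothetical, and the "endpoint check" is reduced to a status comparison purely for illustration): hundreds of copy-pasted test cases collapse into one helper plus a data table.

```java
public class ApiTest {
    // Before (hypothetical): checkUserEndpoint(), checkOrderEndpoint(), ...
    // each ~50 copy-pasted lines differing only in the path and expected status.

    // After: one parameterized helper that every case funnels through.
    static boolean checkEndpoint(String path, int expectedStatus, int actualStatus) {
        return actualStatus == expectedStatus;
    }

    public static void main(String[] args) {
        // Data-driven: each row replaces an entire copy-pasted test case.
        Object[][] cases = {
            {"/users", 200, 200},
            {"/orders", 200, 200},
            {"/items/unknown", 404, 404},
        };
        for (Object[] c : cases) {
            assert checkEndpoint((String) c[0], (int) c[1], (int) c[2]);
        }
    }
}
```

Adding a new case becomes a one-line data change rather than another pasted block.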

I think that I can write a lot about Pattern Mania and solutions that could be solved with a lot less effort, but I would rather point to a great article that I've read recently with a fantastic example of how a simple solution can be overcomplicated.

I call it Pattern Fever... everyone gets it at one point in their career. The better among us grow beyond it. I had an interview where the interviewer asked me when I should start thinking about patterns. I told him that I let them evolve in the system. He said, "No, you should use patterns from the start."
– Mike Brown, Feb 4 '11 at 18:30

In C# you can define a region of code that can be collapsed in the IDE, thus hiding it unless you want to deal with that code. I've been on (am currently on) a project where the regions spanned hundreds of lines (wish I was exaggerating) and there were multiple regions in a thousand line function (again I wish I was kidding).

On the upside, the developer who made the region did a very good job at identifying specific functions within a region. So much so that I was able to do an extract method on the region and move on.

+1 for "Regions encourage developers to 'hide' their crap in plain sight." However, when used properly, I find that regions help to group code logically and find it easily later.
– Darren Young, Feb 4 '11 at 19:17

I have projects that work with a huge number of fields in SQL, and the code that updates/creates them runs to many lines. A #region SQL Update block makes it collapsible, so less scrolling is required.
– JYelton, Feb 5 '11 at 0:13

I hate seeing that a developer hasn't done a check-in in over a week. It means he's either stuck and hasn't looked for help, or he's clumping a bunch of features into one big check-in. (I left out the worst-case scenario: he just isn't doing anything. That's easy to resolve... two words, sounds like "you're hired.")

If you're making a big check-in, you lose many of the benefits of SCM, such as being able to link a changeset to a specific feature. You're also likely to have a lot of merging to do, and that's not necessarily easy to get right.

One of the worst behavioral anti patterns I have seen was a shop that only allowed code to be checked in to version control after it was in production. Coupled with exclusive checkouts in VSS, this led to a convoluted process of undoing checkouts if a production defect needed to be corrected before the next release, let alone more than one person needing to change a given file for the upcoming release.

The fact that this was a departmental policy made it even worse than the case of a single developer behaving this way.

@Peter: Re "Calls to System.gc() to free some memory": I have seen one occasion where .NET would blow up with an OutOfMemoryException unless the GC was called explicitly. Of course, that was the exception rather than the rule.
– Coder, Aug 25 '11 at 21:49

The singleton is a factory. There is nothing wrong with that code. The only thing that could be wrong is outside code making the assumption that every call to getInstance returns the same instance. Such an assumption would break encapsulation and is in fact one of the most common anti-patterns.
– back2dos, Aug 25 '11 at 14:59
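back2dos's point can be sketched like this (hypothetical class): getInstance is just a factory method, and nothing in its signature promises a single shared instance.

```java
public class ConnectionFactory {
    private static ConnectionFactory cached;

    private ConnectionFactory() { }

    // A factory method. Today it happens to cache one instance; tomorrow it
    // could legitimately pool, or create a fresh instance per call. Callers
    // who compare instances with == are relying on an implementation detail,
    // not on anything the interface guarantees.
    public static ConnectionFactory getInstance() {
        if (cached == null) {
            cached = new ConnectionFactory();
        }
        return cached;
    }
}
```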

It's not so much a coding pattern as a behavioural pattern, but it's quite bad: modifying something in the code (say, because the requirements changed), then tweaking all the unit tests until they pass; or tweaking or simply removing all the test code from a test method while leaving the method there.
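A minimal sketch of the "gutted test" half of this, with hypothetical names and plain asserts standing in for a test framework:

```java
public class InvoiceTest {
    // Production code under test.
    static int total(int[] lineAmounts) {
        int sum = 0;
        for (int amount : lineAmounts) {
            sum += amount;
        }
        return sum;
    }

    // Before: the test asserted a real expectation.
    static void testTotal() {
        assert total(new int[]{10, 20, 12}) == 42;
    }

    // After "tweaking": the body was emptied so the suite stays green,
    // but the method's name still claims coverage that no longer exists.
    static void testTotalAfterRequirementsChange() {
        // assert total(newLineAmounts) == newExpectedTotal;  // "fix later"
    }

    public static void main(String[] args) {
        testTotal();
        testTotalAfterRequirementsChange(); // passes by doing nothing
    }
}
```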

This is linked to a more generic one, though: the That'll Do pattern. Here's a representative line of code for it:
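The original code sample appears to be missing here; a hypothetical fragment in the That'll-Do spirit (all names invented) might look like:

```java
public class ConfigLoader {
    static String readSetting(String path) {
        try {
            // Real parsing would go here; simulate a failure for illustration.
            throw new java.io.IOException("cannot read " + path);
        } catch (Exception e) {
            // That'll do: swallow everything and hope the default is fine.
            return "default";
        }
    }

    public static void main(String[] args) {
        assert readSetting("app.cfg").equals("default");
    }
}
```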

I absolutely despise abstraction inversion, or the reinventing of low-level primitives on top of high-level primitives. Sometimes, though, this is caused by bad language designers, not bad programmers. Examples:

Using a single-method class with no member variables, plus a corresponding interface (implemented in terms of tables of function pointers), instead of a plain function pointer. Note that in languages like Java, you may have no choice.

In MATLAB and R, the insistence that everything is a vector/matrix rather than a primitive.

While we're bashing MATLAB, how about the fact that it has no integers, so when you need an integer you have to use a double.

Purely functional languages where you have to use function calls to write a loop.
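The single-method-class point above can be illustrated in pre-lambda Java, where the absence of function types forces exactly this inversion (names hypothetical):

```java
import java.util.Arrays;

public class AbstractionInversion {
    // The "function pointer" must be spelled as a one-method interface.
    interface IntTransform {
        int apply(int x);
    }

    static int[] map(int[] xs, IntTransform f) {
        int[] out = new int[xs.length];
        for (int i = 0; i < xs.length; i++) {
            out[i] = f.apply(xs[i]);
        }
        return out;
    }

    public static void main(String[] args) {
        // Pre-Java-8, every call site paid the ceremony of an anonymous class
        // just to pass what is conceptually a plain function pointer.
        int[] doubled = map(new int[]{1, 2, 3}, new IntTransform() {
            public int apply(int x) {
                return x * 2;
            }
        });
        assert Arrays.equals(doubled, new int[]{2, 4, 6});
    }
}
```

Java 8 lambdas later collapsed that anonymous class back down to `x -> x * 2`, which is roughly the recovery of the primitive the bullet is complaining about.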

It would definitely be copy/paste; I've seen a lot of bad code come about because of it. That, and cowboy coding and everything that derives from it (God classes, extra-large methods, badly thought-out algorithms, etc.).

And if design patterns are allowed, then I would say: doing a project as a set of actions from a plan, without doing any design or domain analysis.

A Java bean with lots of variables, used in a few different kinds of operations. Each operation uses some arbitrary subset of the bean's variables and ignores the others: a few variables for GUI state, a few thrown in for passing data between components, a few that probably aren't even used anymore. The best examples have no documentation, which would only hinder appreciation of the pattern.

A former co-worker had the habit of re-using objects by overwriting their properties instead of simply creating new ones. I could never have imagined the number of problems this caused when implementing new features.

Something that has caused me lots of grief is the "Big map in the sky" pattern.
Throwing around a map instead of using proper objects.
You have no idea of what "variables" it contains without debugging, and you don't know what it might contain without tracing the code backwards.
It typically maps Strings to Objects, or Strings to Strings that you are then supposed to parse into primitives.
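A small contrast, with invented names, between the map-in-the-sky and a proper object:

```java
import java.util.HashMap;
import java.util.Map;

public class BigMapInTheSky {
    // The anti-pattern: a bag of Strings and Objects whose contents you can
    // only discover in a debugger, with values you must cast and parse.
    static Map<String, Object> loadAsMap() {
        Map<String, Object> m = new HashMap<>();
        m.put("id", "42"); // a String the caller is supposed to parse
        m.put("name", "widget");
        return m;
    }

    // The alternative: a plain object with visible, typed fields.
    static class Widget {
        final int id;
        final String name;

        Widget(int id, String name) {
            this.id = id;
            this.name = name;
        }
    }

    public static void main(String[] args) {
        Map<String, Object> m = loadAsMap();
        // Every read is a cast plus a parse, with no compile-time safety.
        Widget w = new Widget(Integer.parseInt((String) m.get("id")),
                              (String) m.get("name"));
        assert w.id == 42 && w.name.equals("widget");
    }
}
```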

One of my favorite development anti patterns is the use of a database "design" that requires continually adding additional columns to one or more tables programmatically. This is a cousin of the "design" that creates a new table for each instance of an entity. Both inevitably hit limitations of the database server, but usually not until the system has been in production for quite some time.

I think one of the worst anti-patterns I have seen involves using a Database table as temporary storage instead of using computer memory.

The problem domain is proprietary, which prevents me from explaining it, but that isn't necessary to understand the basic problem. This was a GUI application written in Java with a backend database. It took certain input data, manipulated it, and then committed the processed data to the database.

Our project has a fairly complicated algorithm that saves intermediate values for processing later on. Instead of encapsulating the temporary values in objects, a database table was created, named "t_object". Every time a value was calculated, it was added to this table. After the algorithm finished its work, it would select all of the intermediate values back out and process them in one large Map object. After all the processing had finished, the values marked to be saved were added to the real database schema, and the temporary entries in the "t_object" table were thrown away.

The table was also used like a unique list; a value could exist only once. This might have been a decent part of the design had we implemented constraints on the table, but instead we iterated through the whole table to see whether the data already existed. (No, we didn't even use queries with WHERE clauses using CONTAINS.)

Some of the problems this design caused showed up specifically in debugging. The application was built to pipeline data, so multiple GUIs would pre-process the data before it arrived at this algorithm. Our debugging process was to run a test case, pause right after the section above finished, and then query the database to see what data the table contained.

Another problem we discovered was that data wasn't being deleted properly from this temporary table, which would interfere with future runs. We found this was due to exceptions not being handled properly, so the application did not exit cleanly and did not delete the data in the table it controlled.

If we had used basic object-oriented design and kept everything in memory, these problems would never have occurred. First, debugging would have been simple: we could easily set breakpoints in the application and inspect the stack and heap. Second, on an abnormal exit, the Java memory would have been cleaned up naturally, with no need to worry about deleting rows from the database.
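A sketch of the in-memory replacement (hypothetical class): a Set provides the uniqueness the t_object table was emulating with full-table scans, and it vanishes automatically if the run aborts.

```java
import java.util.LinkedHashSet;
import java.util.Set;

public class IntermediateValues {
    private final Set<Double> values = new LinkedHashSet<>();

    // Set.add already answers "did this value exist?", replacing the
    // iterate-the-whole-table existence check from the original design.
    boolean record(double value) {
        return values.add(value);
    }

    int size() {
        return values.size();
    }

    public static void main(String[] args) {
        IntermediateValues iv = new IntermediateValues();
        assert iv.record(3.14);   // first insertion succeeds
        assert !iv.record(3.14);  // duplicate rejected, like the unique list
        assert iv.size() == 1;
    }
}
```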

NOTE: I'm not saying this pattern is inherently bad, but in this example I found it to be unnecessary when basic OO principles would have sufficed.

I'm not sure of a name for this anti-pattern as this is the first time I have seen anything like this done. Any good names you guys can think of for this pattern?

You don't mention your problem domain, but this isn't always a bad thing: keeping state externally like this can allow processes to scale, be introspected during long runs, and be easier to debug. Was the DB access causing performance issues or concerns?
– Xepoch, Feb 4 '11 at 20:43

IMO the worst anti-pattern I've seen is the "We don't need any stinkin' patterns" Anti-Pattern: The idea that design patterns are wastes of time and you can write code faster by just slopping it together and copy/pasting as needed.

Honorable mention goes to having code that loads an object from the database using the old VB6 style of:
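The original sample appears to be missing; a hypothetical Java rendering of that load-by-ordinal style (all names invented) might be:

```java
public class OldStyleLoad {
    static class Customer {
        String name;
        String city;
        int age;
    }

    // Fields are filled by ordinal position straight from a row, so any
    // schema change silently shifts every index; there is no mapping layer
    // and no separation between data access and the domain object.
    static Customer load(Object[] row) {
        Customer c = new Customer();
        c.name = (String) row[0];
        c.city = (String) row[1];
        c.age = (Integer) row[2];
        return c;
    }

    public static void main(String[] args) {
        Customer c = load(new Object[]{"Ada", "London", 36});
        assert c.name.equals("Ada") && c.age == 36;
    }
}
```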

Not really an anti-pattern per se, but it shows a failure to take advantage of proper architecture and separation of concerns.

Also, things like this:

// this name is misleading, we may not always want to stand in fire,
// we may want to stand in slime or voidzones or ice patches...
public Foobar StandInFire() { }
// why is this here???
public string BeatWithNerfBat(string whom) { }
// ????
public int GivePony(string to) { }