
@Kramii, did you recognize, when you wrote the poorly written code, that it was poorly written, and if so, why did you write it that way? If not, then what has happened since, such that you can recognize it now when you couldn't earlier?
– user1249, Apr 30 '11 at 22:56


@Kramii, no "best practice" can ever be a substitute for rational, critical thought. All the "best practices" are nothing but tools, and using them blindly is simply harmful. It is stupid to follow something just because it is considered a "best practice", without understanding the rationale behind it. But with such an understanding, you won't need a list of "best practices" to follow. Therefore, the very notion of a "best practice" is deeply flawed and inherently harmful.
– SK-logic, Jun 25 '12 at 10:45

15 Answers

Resistance to change.

Chapter 30 of Peopleware 2nd Edition is devoted to this topic. And it quotes from a book by another fairly well known consultant, written a bit earlier though:

And it should be considered that nothing is more difficult to handle, more doubtful of success, nor more dangerous to manage, than to put oneself at the head of introducing new orders. For the introducer has all those who benefit from the old orders as enemies, and he has lukewarm defenders in all those who might benefit from the new orders.

Niccolo Machiavelli: The Prince (1513)

DeMarco and Lister go on to state the mantra to keep in mind before asking people to change:

The fundamental response to change is not logical, but emotional.

The process of change is rarely a straight and smooth drive from the current suboptimal conditions to the new, improved world. For any nontrivial change, there is always a period of confusion and chaos before arriving at the new status quo. Learning new tools, processes and ways of thinking is hard, and takes time. During this transition productivity drops, morale suffers, and people may complain and wish they could return to the good old ways of doing things. Very often they do return, problems and all, because they feel that the good ol' known problems are better than the new, unknown, frustrating and embarrassing ones. This is the resistance which must be tactfully and gently, but decisively, overcome in order to succeed.

With patience and perseverance, the team eventually moves from Chaos to the next stage, Practice and Integration. People, although not completely comfortable with the new tools and processes, start to get the hang of them. There may be positive "Aha!" experiences. And gradually, the team achieves a new status quo.

It is really important to realize that chaos is an integral, unavoidable part of the process of change. Without this knowledge, and preparation for it, one may panic upon hitting the Chaos phase and mistake it for the new status quo. The change process is then abandoned, and the team returns to its earlier miserable state, but with even less hope of ever improving anything...

I think this is true for more established programmers, but they make up only a very small percentage of those who don't code to best practice. All the code I've seen that doesn't follow best practice stems from the causes described in two other answers here: lack of time and naivety.
– AndrewKS, Apr 26 '11 at 16:26


@AndrewKS, this concerns not only developers but managers and clients too. In a good development team and process, deadlines are realistic and developers aren't assigned tasks above their current capabilities - or at least not without proper mentoring and verification. Failure in these is almost always a sign that management resists adopting best practices.
– Péter Török, Apr 27 '11 at 7:20

Really good point - I wasn't thinking about non-programmers in this situation until now.
– AndrewKS, Apr 27 '11 at 15:37

Time is huge here too. My boss actually told us in a staff meeting, "The Business doesn't pay for great work".
– Joshua Smith, Apr 26 '11 at 15:18

@Joshua Smith: wtf!? Seriously? I wouldn't be surprised if they get exactly what they do pay for.
– Steve Evers, Apr 26 '11 at 17:55


I have often seen the attitude that "we can't afford to do it right". But we can afford to spend endless time fixing the mess. For consulting companies that bill by the hour, which approach is best?
– BillThor, Apr 26 '11 at 18:25


@jwenting: To put my earlier comment in context, I was advocating unit tests in a staff meeting. Currently, only two of us are writing unit tests (and we have to do that on the sly). Personally I don't consider unit tests to be gold trim and diamond decorations.
– Joshua Smith, May 11 '11 at 15:42


@shapr: That is a f*cking terrifying thing to hear from a company that builds ROCKETS AND MISSILES. >:P
– ajax81, Jun 26 '12 at 0:57

Despite the massive evidence to the contrary, people are usually very rational creatures. The reason people don't adopt best practices, like TDD for example, is that they don't think it will be worth it. Either they think the practice is not, in fact, best, and that it will actually cost them more than it will save them; or they think that the benefit is so marginal that the cost of change will outweigh it. The bottom line is that the problem of best practices is a problem of the bottom line.

If you want to be a change agent, your task is to demonstrate that their perception of the bottom line is flawed. You need to show that the best practice really is best: that the benefits are immediate and significant, that the learning curve is small enough to endure, and that at least some of the benefits begin right away.

How do you show this? By adopting the best practice yourself! Nothing convinces others better than your own behavior. If you follow the best practice, and are vocal about it, others will see that you did the economic analysis and reached the opposite decision. That will make them reconsider theirs. They will do this privately, and never admit it at first; but everyone who sees you using that best practice will re-examine their position.

That's the best you can hope for. If the best practice is a best practice, then the rest will follow automatically. Oh it won't be fast, and there will be many hold-outs. The transition will be slow and spotty; and you'll experience many disappointments. But the transition will also be inexorable and inevitable. It might take a generation, but the best practice will win.

It is the result of not knowing the ideal method, or of thinking that one knows it. People aren't choosing to write code poorly; it's more a case of not knowing better. "Naivety", as opposed to "ignorance", would be the better word.

The simplest way to move toward good practice is to acknowledge that there is (most likely) a better way to write the code you've just written, and then to aspire to learn what that better way is.

+1, and not all developers are given, or take, enough time to learn or explore better ways. For many (management and developers alike), it's deferred until it has become an obstacle which cannot be ignored. Meanwhile, solutions are not often enough worked out from first principles; it's common for people to accept an existing solution or recommendation instead. That practice involves chance and approximation, and bypasses much of the learning process, which is vital to understanding. In turn, not understanding adversely affects one's ability to make the better choices.
– justin, Apr 28 '11 at 5:57

Until managers (i.e., the folks buying the developers' skills) require best practices as part of the delivery of software, nothing can possibly change. Many organizations are clearly schizophrenic: they educate developers in various technologies and then demand untested software or an untestable design. Or worse.

"Best practice", for me, is a piece of software that a team of 8 people wrote. We had no written requirements, no code reviews, no unit tests, no formal release documents, no end-user testing - none of what all the books say we should have done. The code was rushed, buggy and plain impossible to maintain; it was so bad that it was discarded 3 years after release.
So what was so great about it? Two years after the first release, the company owner (who had funded the development with a mortgage on his own home) walked away with $30-$40 million in his back pocket.

We often lose sight of the fact that we are (more often than not) here to make software that makes money for the business. Businesses do not exist to provide us with an opportunity to develop software using "best practice"; they exist to make money.

Most "best practice" practices do not improve profits. Those that do are, and should be, widely adopted. That's why "best practice" is not practiced.

Haven't you ever taken a risk and gambled on something? There's a time crunch, so you get it to work with the intention of refactoring it later (as opposed to refactoring it sooner). That's actually considered a good practice by some people.

Eventually, you get burned enough times and you change your ways. Think of all of the 'best practices' that are out there. Can you follow all of them all of the time? Do any contradict one another? You don't want to waste time handling every outlier.

If I write bad code that works and no one ever looks at it again, is it still considered bad?
(Please, let's not ruin the philosophical debate by arguing over what 'bad code' is.)

+1 for the notion that we are paid to produce code first. We have the added responsibility of (subjectively) making it maintainable, second. After all, I'm not paying the gardener extra to maintain his lawnmower; that's up to him to manage. If he does a good job and his gear holds up, I'll invite him back and give him my business again.
– ajax81, Jun 26 '12 at 1:00

IME it's a combination of what others have said. Ignorance ("I've only ever used DataSets, why bother with this LINQ stuff?"), lack of time ("The boss wants it done as soon as possible, we can't spend a lot of time on it") and lack of understanding ("What's wrong with writing all my code in the code-behind?") all contribute.

The irony that I see is that once you start down that path, you just mire yourself deeper. If you cut corners now and toss all the code for an app into the ASPX files, you're never going to be able to fix it later: new things will keep being thrown at you, which means you have to do them quickly, which means you toss the code into the ASPX files again (swearing you'll fix it some day), and so on. It's a never-ending spiral, because you can no longer stop development and say things need to be fixed first.

Often developers have simply not been shown the better practice or, more importantly, the benefits of using it (I'm starting to stop using the term 'best practices', for various reasons).

There's a theory that developers are 'deliberately lazy'. In other words, they will tend to choose the path of least resistance.

Once you quickly show them the benefits of a better practice (such as TDD, or say, following the SOLID principles) and that it actually allows them to be a better (but still 'lazy') developer then you tend to find that resistance melts away.
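As a minimal illustration of that "lazy developer" payoff (a hypothetical sketch in Python, not any particular team's workflow): in TDD you write the failing assertion first, then only as much code as it takes to make it pass.

```python
# A minimal TDD sketch (hypothetical example): the assertion below was
# written first, and the function is the least code that satisfies it.

def apply_discount(price, percent):
    """Return price reduced by the given percentage, rounded to cents."""
    return round(price * (1 - percent / 100.0), 2)

# Red: written before apply_discount existed, so the first run failed.
# Green: the implementation above is just enough to make it pass.
assert apply_discount(100.0, 20) == 80.0
assert apply_discount(19.99, 0) == 19.99
```

The "lazy" benefit is that the test now guards the behavior for free on every future change, instead of requiring a manual re-check.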

I learned programming in short sessions (2 to 3 hours each; actually a few sessions, with different languages). During the sessions very good code was shown, and the reasons for writing the code that way were explained. None of my "official" language courses ever came close to introducing good code.
– BillThor, Apr 26 '11 at 18:20

"least resistance" is quite descriptive. Experienced programmers just have a better idea of what "least resistance" over the whole lifetime of the application means.
– user1249, Apr 27 '11 at 6:40

I'm surprised no one else has put this one out, but maybe it's obvious to me because so much of my work has been as a singleton programmer, with no one (other than Stack) to turn to when I struggle with something. I know 'of' better techniques - such as TDD - but all too often I don't understand them well enough to use them, and I have a hard time finding good information to help me get started. Again, using TDD as an example: I understand the basic idea - write tests that assert your code produces a specific result - but actually implementing it?

Other than the fact that Xcode has built-in unit testing these days, I don't have a clue where to start. Do I assert that the view has X buttons on it after I run a routine to make them? How about asserting that my UIViews have been rearranged appropriately after I trigger a rotation?

Heck, how do I even write a unit test in Xcode? That's something else I need to spend time learning.
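One common way past this hurdle, sketched here in Python's unittest for brevity (the function names and the "up to 3 buttons" rule are made up for illustration), is to pull the decision logic out of the view so the test never has to touch UI objects at all; the same pattern carries over to an XCTestCase with XCTAssertEqual in Xcode.

```python
import unittest

# Hypothetical example: the rule "show one button per item, capped at 3"
# lives in a plain function, separate from any view code.
def buttons_for(items, limit=3):
    """Return how many buttons the view should display for these items."""
    return min(len(items), limit)

class ButtonCountTest(unittest.TestCase):
    def test_few_items_get_one_button_each(self):
        self.assertEqual(buttons_for(["a", "b"]), 2)

    def test_count_is_capped_at_limit(self):
        self.assertEqual(buttons_for(list("abcdef")), 3)
```

Run with `python -m unittest`. The view then just asks `buttons_for(...)` and renders that many buttons; the part worth asserting is the rule, not the pixels.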

Yes, I agree: time and budget are constraints which sometimes don't let you put every "better programming" principle you know into a program.

You know that the current task could be done in a much more robust way, if only you had some more time. But very few clients (especially if they are getting their first iPhone app or their first custom-made business intelligence software built) understand when you say that you are re-implementing something already done because you have found a better way of doing it.

The same goes for unit tests. Unfortunately, I have yet to meet a client who is ready to allocate budget for those. The typical reply is something like: "Automated regression testing, OK, I understand - but unit tests? We don't have time for that! We need to get to market fast!"

I never ask clients to allocate time for unit testing; I consider it part of the job. Getting clients to sign off on unit tests just encourages them to micromanage your work. When you get your car repaired, the mechanic does not ask you which tools he should use to get the job done! The same goes for unit tests: YOU have to judge the proper balance of tests that you feel is necessary to get the job done properly.
– Newtopian, Apr 26 '11 at 17:58

Unfortunately, the better programming techniques may actually be faster than the ones you supposedly can't afford the time to improve on.
– BillThor, Apr 26 '11 at 18:22

Getting enough people to agree on what practices are best for the current situation.

"Best practices" are VERY subjective in many real-world situations. If one half of the team is trying to do SOLID and TDD while the other half is working 60-hour weeks to shave seconds off run durations by any means necessary, because it's too late to redesign the part that isn't performing before your next release, you aren't going to get very far.

Even if you aren't experiencing much disagreement across the team, it takes a lot of effort to get formal agreement, documentation and training for whatever policy you decide to follow. From the business's point of view, that is a large block of non-productive time which you are going to have to justify carefully.

Sometimes, you'll find that habit is also a major contributor to writing awful code.

You might be a very experienced programmer who knows all about present industry best practices; hell, you could even be the expert on such things. Experience tells me that such people don't often write awful code, and yet it can be easy to fall back on old habits and do something that you know breaks the rules, simply because it feels safe and convenient. In such cases, ignorance, arrogance and all the other finger-pointing adjectives you might think up don't really apply. It's not necessarily that such programmers are bad at what they do (although with bad code in general, that is more often the case), or even that they want to be creating an awful mess.

It's unfortunate that I have personally witnessed this happening more times than I would like to count: truly talented people churning out sheer rubbish because their fingers and brains had been operating on autopilot for a couple of months. Most often this is where you see evidence of burn-out, grief and stress piling up. Sometimes it really is just blind habit carrying them through the daily motions. It's something I've learned to keep a careful eye out for, so as to avoid labeling vulnerable people unnecessarily.

Just some food for thought for those of us who find it easier to simply leap to the negative conclusion... even though sadly you'll be right more often than not.

No one shows interest in paying for a project titled anything along the lines of 'refactoring'. It has to have words that appeal to 'business' interests. Words such as revamping, reinvigorating, total remake, life-cycle upgrade, etc. work at my workplace. Almost all of those projects have refactoring as a main component.

Sadly, thanks to the economic hitman, work happens only when there is pay, even if the work would only save the business's fortunes in the long run.