Answer: Be An Agent of Change (3 Votes)

Despite massive evidence to the contrary, people are usually very rational creatures. The reason people don't adopt best practices, TDD for example, is that they don't think it will be worth it. Either they think the practices are not, in fact, best, and that adopting them will cost more than it saves; or they think the benefit is so marginal that the cost of change will outweigh it. The bottom line is that the problem of best practices is a problem of the bottom line.

If you want to be a change agent, your task is to demonstrate that someone's perception of the bottom line is flawed. You need to show that the best practice really is best. That the benefits are immediate and significant. You need to show that the learning curve is small enough to endure and that at least some of the benefits begin immediately.

How do you show this? By adopting the best practice yourself! Nothing serves to convince others better than your own behavior. If you follow the best practice and are vocal about it, others will see that you did the economic analysis and made the opposite decision. That will make them reconsider their decision. They will do this privately and never admit it at first. But everyone who sees you using that best practice will re-examine their position.

That's the best you can hope for. If the best practice is a best practice, then the rest will follow automatically. Oh, it won't be fast, and there will be many hold-outs. The transition will be slow and spotty, and you'll experience many disappointments. But the transition will also be inexorable and inevitable. It might take a generation, but the best practice will win.

"And it should be considered that nothing is more difficult to handle, more doubtful of success, nor more dangerous to manage, then to put oneself at the head of introducing new orders. For the introducer has all those who benefit from the old orders as enemies, and he has lukewarm defenders in all those who might benefit from the new orders."

That warning comes from Machiavelli; DeMarco and Lister go on to state the mantra to keep in mind before asking people to change:

"The fundamental response to change is not logical, but emotional."

The process of change is rarely a straight, smooth drive from the current sub-optimal conditions to the new, improved world. For any nontrivial change, there is always a period of confusion and chaos before the new status quo is reached. Learning new tools, processes, and ways of thinking is hard, and it takes time. During this transition, productivity drops, morale suffers, and people complain and wish they could return to the good old ways of doing things. Very often they do, problems and all, because the good ol' known problems feel safer than the new, unknown, frustrating, and embarrassing ones. This is the resistance that must be overcome, tactfully and gently but decisively, in order to succeed.

With patience and perseverance, the team eventually moves from Chaos to the next stage: Practice and Integration. People, although not yet completely comfortable with the new tools and processes, start to get the hang of them. There may be positive "Aha" experiences. And gradually, the team reaches a new status quo.

It is really important to realize that chaos is an integral, unavoidable part of the process of change. Without this knowledge, and without preparing for it, one may panic upon hitting the Chaos phase and mistake it for the new status quo. The change process is then abandoned, and the team returns to its earlier miserable state, but with even less hope of ever improving anything.

For reference, the phases described above were originally defined in the Satir Change Model (named after Virginia Satir).

Answer: Accept Naivety, Aspire to Improve (6 Votes)

It is the result of not knowing the ideal method, or of mistakenly thinking that one already knows it. People aren't choosing to write code poorly; it's more a case of not knowing any better. Naivety might be a better word for this than ignorance.

The simplest way to move toward good practice is to acknowledge that there is (most likely) a better way to write the code you've just written, and then to aspire to learn what that 'better way' might be.

Answer: Ignorance, Arrogance & Incentives (5 Votes)

Until managers (i.e., the folks buying the developer's skill) require best practices as part of the delivery of software, nothing can possibly change. Many organizations are schizophrenic: they educate developers in various technologies and then demand untested software or an untestable design. Or worse.

Are you familiar with the barriers to adopting best practice and do you know how to overcome them? Disagree with the opinions expressed above? Bring your wisdom to the original post at Stack Exchange, a network of 80+ sites where you can freely trade expert knowledge on topics like web apps, cycling, scientific skepticism, and (almost) everything in between.

26 Reader Comments

From the developer's point of view, one factor in not adopting best practices is that the program design is focused on solving the logical problem in the most straightforward, simplest, best-performing way. That design mindset does not account for traceability, reusability, or maintainability. This is why it helps developers gain perspective on these best practices if they are rotated through different software roles that include not only development but also support and maintenance.

Naive, naive, naive. In my experience almost everyone involved in the actual implementation wants to do things the right way, and even the team manager wants to do things the right way (though there have been spectacular holdouts in both categories). We're all engineers.

But there's just never enough time budgeted, so you kick things out as fast and as cleanly as you can given the time constraints. You express your concerns about the process, your manager expresses his concerns about the process, and the answer from above and from the customer is 'Yes, we understand; now can we have it by Monday?'

Our current major customer prefers getting unfinished stuff ASAP and then taking months to refactor it in increments, in the end taking twice as long. And their people /know/ it will take twice as long this way. But they need rapid deliverables they can show to their suits to demonstrate all the progress they are making.

Sure, you've got lazy incompetents and cowboys, but the major reason 'best practices' aren't followed is time pressure from above.

Wow, I had no idea Uncle Bob Martin is on Stack Exchange! Very few of the programming gurus I admire are on there, though no doubt because they're too busy with their writing/conference/speaking schedules.

The simplest answer comes from looking at what gets rewarded (above-average raises, climbing the greasy pole, being put on desirable projects, being listened to, etc.) and what doesn't (below-average raises, being passed over, being ignored, etc.).

Using the LinkedIn unsalted passwords as an example: which companies would have rewarded someone who said this was something that had to be fixed at the expense of other user- and revenue-visible issues?
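For anyone who missed that story: "unsalted" means each password was hashed directly, so two users with the same password produced the same stored hash, and precomputed tables could crack accounts in bulk. Below is a minimal sketch of the salted alternative, written in Java; the iteration count and key length are illustrative values, not a recommendation.

```java
import java.security.SecureRandom;
import java.util.Base64;
import javax.crypto.SecretKeyFactory;
import javax.crypto.spec.PBEKeySpec;

public class SaltedHash {
    // Derive a salted hash: the random salt guarantees that two users with the
    // same password end up with different stored values, defeating lookup tables.
    public static String hash(char[] password) throws Exception {
        byte[] salt = new byte[16];
        new SecureRandom().nextBytes(salt);
        // 100,000 iterations and a 256-bit key are illustrative, not prescriptive.
        PBEKeySpec spec = new PBEKeySpec(password, salt, 100_000, 256);
        byte[] derived = SecretKeyFactory.getInstance("PBKDF2WithHmacSHA256")
                                         .generateSecret(spec)
                                         .getEncoded();
        // Store both pieces; the salt is not secret, it only has to be unique.
        return Base64.getEncoder().encodeToString(salt) + ":"
             + Base64.getEncoder().encodeToString(derived);
    }

    public static void main(String[] args) throws Exception {
        // Same password twice, different output each time because of the salt.
        System.out.println(hash("hunter2".toCharArray()));
        System.out.println(hash("hunter2".toCharArray()));
    }
}
```

The salt isn't a secret; its only job is to make every stored hash unique, so an attacker has to grind through each account individually instead of cracking them all at once.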

I also like to keep pointing out that half of developers are below average(*). So the question should really be about the developers in the middle (most of them) and why they don't adopt best practises.

We just don't have enough resources to keep the company afloat and completely adopt state of the art best practice. We have been able to move to a scrum-like development process, and we have on the order of a thousand functional tests, but currently I don't see how we can move to unit tests. It takes an hour or so to build our apps from scratch and 4 plus hours to run all the functional tests. We revived an old product out of retirement (started in mid-80s), and we have 4 programmer types and maybe 13 employees total in the company, and there used to be on the order of at least 10 times that many. Our customers won't allow us to stand still and change over to a modern approach and still generate revenue.

I think part of the problem is that people each have their own opinion of what "best practice" means, and either there's not enough actual evidence to support one conclusion or another, or it's just too subjective a topic.

Best practice is such a large issue, and it's not just ignorance, laziness, poor teaching or time management.

You could probably start an awesome flame war by writing down what you meant by best practice when you asked this question. I can probably come up with a dozen rules for most languages, many of which would be mutually exclusive, and wouldn't necessarily apply to other programming environments.

In every language I use I see myself making trade-offs between simple and correct, and in every language the approach is different.

Often the most legible code simply ignores all the errors, on the theory that failure is either fatal (and crashing inevitable) or can just be ignored. In other words, almost a complete refutation of best practice.

I see best practice as theory, and theory and practice are equal in theory, but not in practice.

"Best practice" is meaningless without a context. It's a matter of being pragmatic - writing an in-house time-logging script doesn't carry anything like the risk of writing embedded code for an airliner, and the costs are proportionally lower too. Calling for uniformly high standards would be akin to imposing a 1m/h speed limit on all roads - there would be no accidents, but the whole transport system would be undermined.

The biggest problem for us is the simple issue of "best practice" being a moving target.

Any change in how we do things requires discussion and experimentation among a small group, then training among the whole company, followed by an often lengthy period of mistakes and unexpected issues.

It's expensive, and after going through all that pain to finally produce a good end result... some new "best practice" has appeared.

Quite simply, a small business like ours (10 to 20 employees) cannot afford to be implementing new processes all the time. It needs to be an occasional thing, so we can focus on actually getting work done.

Some of the resistance to adopting best coding practices may be similar to the resistance to adopting best diet and exercise practices. Both have significant start-up costs (change) and continuing costs (the extra time and effort to maintain the better practices), and the benefits can be difficult to measure even if a journal is kept and analyzed. In coding, for example, some of the effects don't show up in programmer productivity at all but instead reduce support overhead or improve user satisfaction (e.g., waiting a known but longer time for a working product can be less frustrating than getting a probably-sort-of-working product earlier).

In addition, in both cases external cultural and environmental factors can increase actual and perceived costs. If best practices were nearly universal, the external support infrastructure would be more available and cheaper and would seem more "natural" (a good habit can be not only easier to maintain but easier to adopt if the external culture "supports" the habit).

Just two cents from a non-developer who has poor eating and exercise behavior.

The biggest problem for us is the simple issue of "best practice" being a moving target.

Any change in how we do things requires discussion and experimentation among a small group, then training among the whole company, followed by an often lengthy period of mistakes and unexpected issues.

It's expensive, and after going through all that pain to finally produce a good end result... some new "best practice" has appeared.

Quite simply, a small business like ours (10 to 20 employees) cannot afford to be implementing new processes all the time. It needs to be an occasional thing, so we can focus on actually getting work done.

The end result is we do adopt best practices, but not all of them.

That's an excellent point, which can be worked around with a practice review once every, say, two or three years. Best practices change frequently, but not so frequently that you need to review your practices all the time.

In fact, every place I've worked at has sadly been roughly a decade behind the times when it comes to coding standards, unit testing, etc. I'm trying my hardest to impose viable change with my current team, though it is really tough, as many of my teammates don't immediately see the benefit to some of these concepts. I've been teaching myself better practices to keep up with the industry, but when you don't have many years of experience in some of these concepts, it's very difficult to properly introduce them to others.

Just two cents from a non-developer who has poor eating and exercise behavior.

This (and the stuff above it). Best practices are not free, and they don't come easy. It takes time to build proper habits, to find what works (i.e. metrics), to get buy-in, etc. And when you've figured that out, it's hard to change. This applies to coding, eating, exercise, and any other discipline that one deems worthwhile. Unfortunately, in the commercial coding environment, the bottom line often dictates that it is not.

On the bright side, there are many best practices out there that smart people have figured out already, which gives you a head start in trying to implement them. And I certainly agree that it starts with oneself.

Bad roll-out/implementation. In the past decade I've seen new management come in, try to bring their best practice philosophy-du-jour into play and fail miserably, only to see the next regime come in with their flavor and fail as well.

For example, I really like the ideas (and tools) of agile development (it seems kind of natural to me) and many of the associated procedural elements... but when it was pushed out to the general developer public it was done without training or solid backing, and scrums became nightmares (as one example). I found that the concepts were fine, but the implementation was so lacking that (illogically) I really hate even hearing or typing the word "Agile" now.

Each developer is constantly expanding their knowledge. "Look at this beautiful code I just wrote; too bad that stuff from two weeks ago is CRAP!" Sometimes nobody on the team knows any better. This is especially true for developers with less than five years' experience. We don't know what we don't know.

Performance reasons - I've had to write completely unmaintainable code in order to get performance.

Language limitations - You can't always use the most elegant solution in every language on a project. Mixing languages should be avoided unless absolutely necessary, but "absolutely necessary" is a thick, blurry line.

Prior best practices don't always agree with current best practices, and competing best practices are another issue. Which is more important: performance, maintainability, elegance, or being bug-free?

Risk / Reward - I've worked at a place where we didn't touch any lines of code unless absolutely necessary - not even to clean up indentation. The compiler always kicked out a new properly indented version of the code in the first pass, so we usually didn't bother looking directly at the source code at all, just the compiled output. Further, every line was marked with the specific change management authorization that allowed the last change. That could be easily traced back to an individual.

Another issue I encounter is program managers who are experts in program-management methodologies but not in the subject matter. They have no background from which to interrogate the SMEs and gauge the actual quality of their responses. As the old idiom says, 'the devil is in the details'.

The key is to accept that "change" is the fundamental issue in developing software and more importantly, get your business stakeholders to accept it. Practices are merely there as a way to manage change. As languages, tools, desktops and programming techniques improve, change happens. Practices merely respond to this environment change to give people a framework to handle the incoming "change". Getting your team/company to embrace change and get comfortable with some level of chaos that results from change is critical to the success of managing change.

Things that work against you:

- Complacency: When employees stop learning, change becomes inordinately harder to manage, and complacency creates a negative feedback loop with respect to the "change is good" idea.

- Culture of blame: People work to two motivators, discipline and accolades. A culture that always looks for someone to blame breeds politics and backstabbing. If change requires learning, and change brings chaos in which getting it right the first time might not be possible, then blame cultures will actively work against the learning that comes out of that chaos. Blame cultures enforce the status quo. Blame cultures sweep problems under the rug, creating technical debt, because admitting a failure only invites more blame the next time something goes wrong.

- Technical debt: If you develop software and don't know what technical debt is, go read about it. It's an eye-opener, and you'll begin to see where your current team creates it. Technical debt can take down a team. Enough of it can kill a company. It can paralyze your ability to manage change.

From a management point of view, you have to make sure that decision makers know they are going to pay now or pay later when it comes to technical debt. It's up to you, the change agent, to change the culture of blame, which is probably the hardest thing for an individual to do. Unless it comes from the CEO/CFO/CIO down, it might be impossible without becoming a sacrificial lamb in the process. Tread those waters carefully, but if your methods show success, the culture will change. It's also up to management to put employees out of their comfort zone from time to time. Complacency is a leadership issue for the most part; if it is an employee issue, that employee probably isn't a fit for the team anyway.

"Best practices" is a nebulous idea, like browser standards compliance. What's critical is that you come up with a process to manage change in software development. Note that I said software development, not software requirements. Most of us have methods for handling change in requirements, but we neglect improving the actual process of development, which means any best practice we use today will end up becoming a non-best practice at some point in the future unless we manage that change.

Oh, one last comment. Change agents must own the change they are striving to enact. Own it, embrace it, never apologize for it, and accept the outcome, good or bad. The only successful change agents I've ever seen were the people willing to lose their jobs for it. Change has only two outcomes: complete success or job-costing failure. There is no testing the waters for the change agent, because that level of commitment to the change is where your leadership comes from. Don't be afraid to go all in.

Why spend time coding something whiz-bang with best practices when you have to re-code it over and over again as the end goal keeps changing, or you have to scrap everything you did b/c of it, or you find out that that super-cool code you wrote which was simple and elegant is suddenly useless b/c it now has to do 15 more things that weren't clearly explained in the design spec.

I've found, having worked for a company that loves to reinvent wheels and code its own software solutions instead of buying off-the-shelf solutions (which would get the job done better), that most of the inefficiencies are not programmer-based. They are due to requests from people who don't understand what they really want and/or try to force it to be implemented in a fashion that makes sense to them.

EG: a manager is used to having their team manually enter things. So, they ask that a program get built that will consolidate a bunch of junk, then require a person to stare at each item and manually approve each one ... EVEN THOUGH there is enough business logic that the programmers could just code the stupid thing to auto-approve things.

The manager's thought process is such...

1) Computers and software are unreliable; my comp has BSOD'ed in the past, some software around here is buggy and crashes, so we can't rely on it 100%

2) So, we need to have some human intervention ... we need a human to spot-check everything...to catch what the computer misses

3) Thus, we need the program to do some work, but then have someone full-time employed to dig the last few feet of every ditch, b/c we made the program require human intervention

The programmers' thought process is...

1) The manager has to talk to the business analyst, so we're never really able to sit down and talk directly with the manager to figure out what the hell s/he wants; instead it's all filtered through a mouthpiece that sometimes does a crap job of translating what's needed into system specs, sometimes interjects their own ideas into the process to make things longer/harder than they should be, or agrees to sky's-the-limit BS that they don't realize will take forever to implement.

2) We're too cheap to hire usability specialists or QA folks, so the programmers are left to design UIs and QA their own work... for stuff they're NEVER going to use themselves. The end result is that users will complain about how the UI sucks, but management will have their end solution by then, and since it "works good enough" they won't invest more time/money into it to really make it better for the end-users.

3) The manager thinks all technology is out to get them, and keeps asking for powered bicycles that still require pedaling to charge the batteries instead of automated, self-driving cars that would let the end-users focus on getting work done and making real headway on their tasks. No, they want to waste employee time manually sifting through crap that the computer is much better at sifting through flawlessly, b/c we can't break the manager of their technophobe paranoia.

4) The manager wants an excuse to hire more fucking head-count anyways, which makes them feel more important. Thus, even though we could code something that would be so efficient and awesome that it would put 1/2 their dept out of a job, they give us specs that require 10 more employees to get hired into their dept in order to baby-sit the POS they had us build.

We'd like to use best practices, but it's hard to shoe-horn best practices into one dept when the rest of the company isn't following best practices in its own jobs (e.g., billing, customer service, etc.).
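To make the auto-approval point above concrete, here is a hypothetical sketch; the invoice fields, rule, and threshold are all invented for illustration. The idea is simply to encode the checks the reviewer already applies and queue only the genuinely ambiguous items for a human.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical invoice record: just enough fields to express an approval rule.
class Invoice {
    final String vendor;
    final double amount;
    final boolean vendorOnApprovedList;

    Invoice(String vendor, double amount, boolean vendorOnApprovedList) {
        this.vendor = vendor;
        this.amount = amount;
        this.vendorOnApprovedList = vendorOnApprovedList;
    }
}

class AutoApprover {
    // Illustrative cutoff: anything at or above this goes to a human.
    private static final double REVIEW_THRESHOLD = 10_000.00;

    boolean canAutoApprove(Invoice inv) {
        return inv.vendorOnApprovedList && inv.amount < REVIEW_THRESHOLD;
    }

    // Returns only the invoices that actually need human eyes.
    List<Invoice> needsManualReview(List<Invoice> all) {
        List<Invoice> manual = new ArrayList<>();
        for (Invoice inv : all) {
            if (!canAutoApprove(inv)) {
                manual.add(inv);
            }
        }
        return manual;
    }
}
```

The reviewer then sees only what the rules can't decide, instead of rubber-stamping every item in the queue.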

What is the relative life span of code compared with the life span of best practice? I have 30 year old code that I still use and reuse with minor modifications. (Ten or fifteen years ago I made some changes to satisfy new compiler and language specifications. C and Fortran may change, but mathematical algorithms don't.) If I am writing code today using best practice, how long will it stay best practice? For a lot of us, this is a serious problem.

(Cynically, one could argue that best practice is driven by the software guru marketplace, and if best practice stabilizes, a lot of guys will lose their lucrative lecture, book and consulting gigs. This isn't the whole truth, but it suggests that a programming group should adopt the best practice that makes the most sense in their context and for their code base rather than tracking the latest trends.)

Couple of notes: I don't want to waste time on "best practices" (overhead like building baroque test cases) unless I know a project is going somewhere. I want to invest the minimal time and effort to get a mockup sort of working until the project has some momentum; I'll do more if the project has a future. The other thing is that test-driven development doesn't work well with layer after layer of abstraction. How do you write small test cases for J2EE web apps or Android apps? Not easily. You have to bring in the whole framework or platform before a program can run. Otherwise, I think experienced developers generally do use the best practices they can, or that the situation warrants.
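One partial answer to the "how do you write small test cases" question, offered as a sketch rather than a prescription: keep the decision-making logic in plain classes that know nothing about servlets or Activities, and unit-test those directly; only the thin adapter layer needs the framework. The class, its rule, and the JUnit test below are all hypothetical.

```java
import org.junit.Test;
import static org.junit.Assert.assertEquals;

// Plain business logic: no servlet, Activity, or container required.
class DiscountCalculator {
    // Returns the price after a loyalty discount; the rule is illustrative.
    double priceAfterDiscount(double basePrice, int yearsAsCustomer) {
        double rate = yearsAsCustomer >= 5 ? 0.10 : 0.0;
        return basePrice * (1.0 - rate);
    }
}

// Runs in milliseconds with plain JUnit -- no app server, no emulator.
public class DiscountCalculatorTest {
    @Test
    public void longTimeCustomersGetTenPercentOff() {
        DiscountCalculator calc = new DiscountCalculator();
        assertEquals(90.0, calc.priceAfterDiscount(100.0, 7), 0.001);
    }

    @Test
    public void newCustomersPayFullPrice() {
        DiscountCalculator calc = new DiscountCalculator();
        assertEquals(100.0, calc.priceAfterDiscount(100.0, 1), 0.001);
    }
}
```

The servlet or Activity then becomes a thin shell that calls into classes like this one, and the slow, framework-heavy tests only have to cover that shell.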

Social inertia affects software engineers too. It may be hard, if not impossible, to get an old dog to change his or her habits. That leaves the next generation to clean up the mess. But that will take managers who understand the necessity of good coding. Otherwise, they would just pass the old, bad habits on to the next generation.

Start with good managers. Then . . .

While they are still in school, reward students not only for adopting good coding practices, but also for applying sound engineering principles that avoid well-known coding pitfalls. And, quite frankly, students should be punished for not doing so. A failing grade or two will test their mettle and show whether they are up to the real world -- which we adults know is pretty damn harsh. The world is so dependent on computers, and the software they run, that a little bit of sloppiness can, one day, have enormous implications. This is no joke. Software engineering is no different from engineering a bridge -- not anymore.

We have daily meetings, do TDD or at least use a lot of unit tests, and decouple components using boost::signals, pImpl, and interfaces. And you know what? It's heinous. The code base is far more complicated than it has any right to be, and it takes forever to make simple changes.

I love the idea of unit tests to help me know when I've made a mistake, but when they triple development costs without even partially replacing the need for testers, it's not worth the cost unless lives are at risk. Some of these "best practices" come from other languages where the costs are different (especially Java) and just don't make sense in languages like C++.

So, my "barriers to adopting best practice" is mainly that they fail cost/benefit analysis, which is what guides every respectable engineer's choices.

We just don't have enough resources to keep the company afloat and completely adopt state of the art best practice. We have been able to move to a scrum-like development process, and we have on the order of a thousand functional tests, but currently I don't see how we can move to unit tests. It takes an hour or so to build our apps from scratch and 4 plus hours to run all the functional tests. We revived an old product out of retirement (started in mid-80s), and we have 4 programmer types and maybe 13 employees total in the company, and there used to be on the order of at least 10 times that many. Our customers won't allow us to stand still and change over to a modern approach and still generate revenue.

Your build system sounds... wow. I sympathize. You've got to get that under control. Long builds often encourage dirty changes (global variables, cut-and-paste coding, etc.) just to avoid rebuilding the whole system.

Michael Feathers has an excellent book, with many useful tidbits, called Working Effectively with Legacy Code; it's on Amazon and also available on Safari Books Online. It's got some nice advice for both legacy and non-legacy code. A number of its ideas are pretty applicable to C and C++, which aren't always easy to unit test.
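As a rough illustration of the kind of dependency-breaking move that book describes: extract an interface at the point where the code touches something slow or external, so a test can substitute a fake. This is a generic Java sketch, not an excerpt from the book (the same shape works in C++ with an abstract base class), and all the names are invented.

```java
// Before the change, OrderValidator talked straight to the database, so it could
// only be tested with a live connection. Extracting an interface creates a "seam"
// where a test can slide in a substitute.

interface CustomerStore {
    boolean exists(String customerId);
}

class OrderValidator {
    private final CustomerStore store;

    OrderValidator(CustomerStore store) {   // dependency handed in, not hard-wired
        this.store = store;
    }

    boolean isValid(String customerId, int quantity) {
        return quantity > 0 && store.exists(customerId);
    }
}

// A tiny in-memory fake replaces the real database-backed store in tests.
class InMemoryCustomerStore implements CustomerStore {
    private final java.util.Set<String> ids = new java.util.HashSet<>();

    void add(String id) { ids.add(id); }

    public boolean exists(String id) { return ids.contains(id); }
}
```

Once the seam exists, a test can construct OrderValidator with an InMemoryCustomerStore and exercise the validation logic without a database or a long build.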

The original question as worded is too vague; 20 years into my career, I can't come up with a firm list of either 'poor' or 'best' practices that would apply to every project.

But if the original poster wants to know why software generally sucks, here're my thoughts:

1. Time constraints -- management allots too little time to the design and implementation phases. (There's probably not a single software engineer out there who hasn't been on a project that allotted NO time for the design phase.)

2. Market Forces -- At least in the telecom industry, the company which gets their product to market first (bugs be damned) usually wins. Nortel took their lumps in weekly beat-downs from Sprint, but with all the billions Sprint had tied up in Nortel's CDMA lineup, there was no question of who owned whom.

3. Over-Reliance on Process -- Every few years someone comes out with The Grand Solution -- Concurrent Engineering, TQM, ISO-9000, Lean Six-Sigma, Agile, whatever. Each Grand Solution gets enthusiastically adopted by upper management and is pushed out to an (increasingly) cynical engineering department. Not nearly enough time, money or honest commitment is invested to do it right (see #1 above), and before you know it, it's lather-rinse-repeat.

4. Development-By-Committee -- I've been on way too many projects which might have started off with the germ of a good design idea, and watched them get diluted in design reviews which accepted virtually every comment, regardless of effect.

I'm convinced that software engineering is more like making a movie than making a bridge; if you have enough time in the schedule, enough good people on your team, and one person whose job is Director, then you have a chance. If you don't, then it's all just Brownian motion.

Best Practice doesn't just change over time. The very definition of what comprises "Best Practice" differs between departments and levels within an organization. Developers might have their own working definition of Best Practice, but the C-level suits are going to have their own definition, which probably minimizes the importance of things that the developers would put at the top of their list.

Functionally, "Best Practices" is a buzzword used by executives to and managers to legitimize their efforts. In a way, it works to push responsibility off onto the industry at-large. "We were following industry Best Practices!" is a handy excuse for a project falling apart.

Isn't best practice essentially "you do what you can with what's available to you," rather than following someone's or some organization's recommended path? So-called "best practice" in a larger corporate environment is a joke. The biggest hurdle is people.