I've long had an aversion to the phrase "Best Practices". As I started out writing software, I looked to my mentors and industry leaders to inform my decisions about how things should be done. If they shared their "Best Practices", I took them to heart. I wanted to be the best coder I could possibly be.

Is Hungarian Notation a Best Practice?

For years, I used Hungarian Notation. This was a practice established in the late 70s and later promoted by the likes of Steve McConnell in Code Complete, first published in 1993. When writing in FoxPro or versions of BASIC, this proved helpful. The languages had limited concepts of type, and the mnemonic prefixes on variable names helped me be a responsible citizen and steward of the code. I doggedly followed this "Best Practice" and encouraged my co-workers to do the same.

I then started writing more C/C++ code. I continued the practice, knowing it was part of what made me a better developer. But I lost sight of why this practice had previously been of value to me. I was not only explicitly declaring each variable's type, but also encoding that type in its name, in an environment where the compiler already enforced it. The naming convention didn't serve its intended purpose in this context, and yet I continued the practice for years.

In Clean Code, Uncle Bob tells us not to use Hungarian Notation, but Joel Spolsky recommends it. Twenty-plus years later, the debate over whether or not this is a "Best Practice" endures.

Look deeper at the debate. Consider what Steve, Bob, and Joel each say. They provide a set of objectives in a given context and then show how what they've chosen to do is optimal for them in that context. That doesn't indicate a universal best. It's contextual and it's predominantly subjective.

Uncle Bob tells us Hungarian Notation makes code harder to read. Joel tells us it makes code easier to read. These are subjective evaluations. I doubt either of them has run a comprehensive double-blind study on a large population to ascertain whether Hungarian Notation is statistically more or less readable. My apologies to them if they have.

So... Which of them is wrong? Are any of them right? Which pundit's pontification of programming proficiency is predominantly preferred?

These are useless questions.

Find Your Own Best Practices

Here is a useful question: "What could we change for the better?"

I'm not saying we change all the things. And certainly not all at once. But I am saying that every team should be looking at their approach, questioning their perceptions, and putting their "best practices" to the test. And if you decide to implement something because it is a "Best Practice", treat it no differently than some "wild idea" you came up with in a retrospective.

Measure what you're optimizing for. Measure what else is important and might be adversely impacted. Make a change and see what happens. Adjust, adapt, learn, grow.