The main problem with appealing to "best practices" is this: One man's best practice is another man's anti-pattern.

Between semicolon-free folks and JavaScript traditionalists, who gets to play the role of the expert? If Node.js is popular, does that mean that Isaac's comma-first style is "correct"? (https://gist.github.com/357981)

In the absence of a quantitative engineering method with which to evaluate either approach, isn't it a purely personal and political choice?

"Between semicolon-free folks and JavaScript traditionalists, who gets to play the role of the expert?"

You're framing this as a disagreement between equals. It is not. There are degrees of expertise. (Total lines of production code * amount of usage) is a reasonable approximation, and those who rank highest on this scale are overwhelmingly in favor of requiring semicolons.

If Carmack and Torvalds were in agreement on a C best practice while Odesker^W^W^W^W^W^W^W Topcoder user #12571 said to do something else, I wouldn't go with the latter.

You can think about the effect of coding style instead of letting the experts think for you.

If you are doing a Topcoder round, then the random Topcoder user's advice is probably better than Carmack's. Carmack or Torvalds would probably tell you to use a variable name that gives a lot of information about its purpose, so that other people will find it easier to read your code. The Topcoder user might tell you to use a cryptic abbreviation that helps you remember a variable's purpose but tells the outside reader little, since Topcoder users are penalized for allowing others to find their bugs. When implementing an algorithm, Carmack or Torvalds would code it as part of an extensible system with an API so that it can be used for many purposes. The Topcoder user might code it with no potential for reuse, but would do it in the way easiest to code quickly without making a mistake.

"You can think about the effect of coding style instead of letting the experts think for you."

Right, because what could those experts possibly have to teach you? Fuck that intellectual crap, brogrammers are here to crank out some code yo! Just because those dinosaurs have spent "thousands of hours" studying and trying out different programming styles and concepts means I should listen to them? This is the internet age man. Some guy in a github comment told me different and github is cool.

It's like those functional programming professors who try and talk about how continuation passing style was a bad road to go down because of maintenance problems and how difficult refactoring and just reasoning about code becomes in large systems.

Fuck that egghead noise, node.js has callbacks and node is how you get webscale.

You completely and utterly missed the point, in addition to just being needlessly rude.

The point of your parent was "don't let the experts think for you", not "screw the experts" -
that's a very important distinction. You should form your own opinions, not just uncritically
adopt the opinion of some expert. It's unquestionable that expert knowledge and experience is
important - but nothing beats forming your own (hopefully informed) opinion.

Who said "let the experts think for you"? No one. You were arguing against "give the experts' opinions more weight than others, often a lot more".

"You should form your own opinions, not just uncritically adopt the opinion of some expert."

False dichotomy again: there is a lot of ground between "uncritically adopt the opinion of some expert" and "forming your own opinion and hoping it's informed".

You should find that middle ground, it's where all the smart people are.

I didn't miss the point at all, I just completely disagree with you. I don't think it was needlessly rude either, it was sarcastic. And I feel no need to treat people who are promoting anti-intellectualism with kid gloves.

Intellectualism is not about showing off how much you agree with authorities. Intellectualism is about thinking and understanding, until eventually you become an authority yourself.

When discussing coding style, you can quote experts. Or you can understand their arguments, understand the impact and tradeoffs of coding decisions, and realize why they are true. And in the case of Topcoder, about as far as possible from building large systems, you can realize that the optimum style changes and understand why.

I think there's a broad trend to treat code as magic, something where anything can happen. Most people learn coding styles by being given suggestions, and then noticing from experience that they like it better. If you reject this mindset and believe that code is orderly, then this advice changes from Wisdom of the Elders (TM) into an engineering calculation. I specialize in automated restructuring of software architecture, so this mindset is something I try hard to fight against.

I don't see how this applies to me taking issue with the parent's "figure it out yourself instead of letting the experts think for you" statement.

"Intellectualism is not about showing off how much you agree with authorities"

My point was that intellectualism is not about agreeing with authorities but giving proper weight to the opinions of intellectuals in the field when weighed against lay opinions or smaller groups of intellectuals.

You certainly don't seem to be arguing for the post-modern interpretation I was arguing against where expert opinion is considered "just another opinion, equally valid" so I'm confused why you think my stance is almost the opposite of what I was stating.

The opinion of an intellectual is indeed stronger Bayesian evidence than that of a layperson. But if they have an opinion, they should provide an argument to back it up, and that argument is far more important.

My first comment pushed back on the advice to always follow Carmack over the random Topcoder user. It's important to understand what makes a best practice a best practice rather than just listening. A Topcoder round and Quake differ greatly enough that the random Topcoder user is sometimes the greater authority.

You seemed to map the call to critically evaluate expert opinion to a call to ignore it, something I'm sure you've seen too much of. But it's important to understand the difference between critical evaluation and anti-intellectualism. Indeed, the former is the higher form of respect.

I'm not sure how well it applies to a subject like best practices where it's either subjective or at least fuzzy. The arguments are "in my experience this causes problems when working with other people and is more prone to errors long term". There have been no counter examples or analysis I've seen around this issue, just a "that's just what the traditionalists say" type argument. Something along the lines of your topcoder example that applied to the semicolon argument (god i hate this issue) would be very welcome.

Any critical analysis here is hard to do objectively, and there is a consensus of experts (including Eich) on one side. If I'd seen any actual critical analysis rather than a jump to "don't let the experts tell you what to do! be a rockstar!" then I wouldn't have been so fast to use the anti-intellectualism card. But it's so prevalent in the node community (more than Ruby's but less than PHP's) that it's depressing.

The reason for using semicolons in JS is quite simple: It makes it easy to determine the boundaries between statements, both for humans and machines.

With semicolons, the search for the boundary between statements reduces to the search for that single character -- that means fast code skimming.

The boundary is also robust against changes; as long as that semicolon stays in place, nothing that happens within the statement can cause that boundary to be swallowed into a statement's interior.

There is an exception: Strings. This is no trouble for machines, as strings are processed by the lexer. Humans can't instantly take a string and process it as a chunk, but a UI innovation does this work for us: syntax highlighting.

As an example, consider the code:

x = a && b
foo()

If I try to change the "b", I might accidentally turn the code into

x = a &&
foo()

The grammar allows the malformed statement to interact with its neighbor, and thus, instead of getting a parser error, I'll get a program bug.

LR parsers take advantage of this for error recovery, allowing them to find more than one parse error at a time.
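As a runnable sketch of that mistake (the names here are made up for illustration), the result is a silent logic change rather than a parse error:

```javascript
// Hypothetical values; the point is the silent misparse.
var a = true;
var b = false;
function foo() { return 42; }

var x = a &&    // the line ends mid-expression, so ASI does not fire...
foo()           // ...and this call is absorbed: x = a && foo();

// No syntax error is reported; x is now 42 instead of (a && b).
```
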

A more sophisticated view is to think about syntax abstractions. Given a program like "x=y; return x;" we can create a program written in the abstract syntax "statement; statement". (The dual operation is to concretize that back into the set of all two-statement programs. This kind of relationship is known as a Galois connection.) Having semicolons makes the abstract syntax unambiguous, allowing us to efficiently reason at the statement level.

Brendan Eich on twitter (or in comments here) will also say that it is his opinion, and that while it seems to work pretty well for the npm guys, maybe that is just because they are such great devs; he doesn't see it scaling down that well.

And finally, Crockford's reasoning about things like this seems based on the presumption that js developers are drooling idiots. For example, he recommends against using the new keyword for object creation in javascript, even though object factories can often be a huge waste of memory, because dumb javascript developers may forget to use the new keyword at times when they want to instantiate objects.
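The memory trade-off behind that recommendation can be sketched like this (Point is a hypothetical type, not from any library):

```javascript
// Constructor style: one shared method on the prototype, but callers
// must remember "new" (forgetting it clobbered globals in sloppy mode).
function Point(x, y) {
    this.x = x;
    this.y = y;
}
Point.prototype.norm = function () {
    return Math.sqrt(this.x * this.x + this.y * this.y);
};

// Factory style (Crockford-friendly): no "new" to forget, but every
// object carries its own copy of every method.
function makePoint(x, y) {
    return {
        x: x,
        y: y,
        norm: function () { return Math.sqrt(x * x + y * y); }
    };
}
```
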

I am not saying you shouldn't listen to smart people, but I am saying that you should actually learn the reasons they say the things they do, and use your own brain when making choices.

The root problem is that we still don't know how to disagree when two sides look at the same set of data and believe two different things. We allow ourselves to descend into tribalism and assume the worst of each other.

If you look at the language Tom objects to the most, it's language Isaac uses that promotes this tribal attitude that the other side doesn't have your best interests at heart, and questions their motives.

We're just terrible at this, and I'm not sure any human society has ever really cracked it.

I completely disagree. We absolutely know how to resolve these kinds of disputes and it's very accurate (not as accurate as Science, but better than everything else). See my other comment here (http://news.ycombinator.com/item?id=3922527).

The tribal attitude is the enemy of this process for sure, and humans are very prone to that attitude but that doesn't mean you can't still find the answer more likely to be right.

I think using comma-first syntax as a counter example is disingenuous. Whether you're a novice programmer or an expert programmer, using comma-first or comma-last isn't going to bite you in a way that isn't immediately solvable.

Omitting semi-colons, however, can cause novice JavaScript programmers serious grief. Never mind novice JavaScript programmers – they cause me grief, and I put semi-colons everywhere! That's because I'm often called in to debug our customers' websites, and on several such occasions I've had to fix bugs that never would have occurred if that customer used semi-colons properly.

Sure, if our customers understood JavaScript's ASI and how to use semi-colons "properly", I wouldn't have this problem either. Except, as Tom points out, the reality is that almost none of these people will invest in doing so.

Whether you're a novice programmer or an expert programmer, using comma-first or comma-last isn't going to bite you in a way that isn't immediately solvable.

Really?

The comma-first style makes it less likely that your list will end with a trailing comma, which can cause IE-specific bugs. If you're trying to track down the IE breakage a couple of weeks after you wrote the code, it can be fun to hunt for the extra comma. If you don't know that this bug is possible (as I'm guessing you don't), multiply the hair-pulling by 10.

With the comma first style, that bug would never have been introduced. Neat, huh?
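A sketch of the two layouts; old IE's JScript engine counted a stray trailing comma in an array literal as an extra (undefined) element, while modern engines ignore it:

```javascript
// Comma-last: deleting or reordering the final item easily leaves
// a trailing comma, which old IE interpreted as an extra element.
var commaLast = [
    'alpha',
    'beta'    // delete this line and the comma above becomes trailing
];

// Comma-first: the separators lead each line, so no reordering or
// deletion can leave a trailing comma before the closing bracket.
var commaFirst =
    [ 'alpha'
    , 'beta'
    ];
```
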

If you run any syntax checking or minification tools, this will be immediately obvious. If you test your site in IE at all, this will be immediately obvious. If you record JS errors on your site, this will soon be obvious. If you record any kind of usage statistic, this will eventually be obvious.

Take it from someone who has experimented with cutesy formatting in the past[1]: you don't need to use cutesy formatting to solve this problem.

[1] You can end class declarations with _:0} instead of a bare } to avoid the trailing comma issue.

Two of the tools that I mentioned are in the "prevention" category: syntax checkers and minifiers. The full enumeration is just a sanity check. If coding style is the only thing keeping your product from being broken, you're doing something else quite wrong.

Indeed the link I gave demonstrates how the right IDE also solves this problem. The fact that there are many ways of solving the problem does not mean that it isn't a real problem that formatting can help address.

If comma-first confuses a dev for more than half a second or so, they really aren't that good. The first time I saw comma-first was from a DBA about 10 years ago who swore by it, and it has been my go-to technique for writing maintainable code in languages where you end up with huge comma-separated lists.

I freely remove, add, and move things to the bottom of my js classes without ever hitting a trailing comma, or hitting a missing comma. I think that is more maintainable than the alternative.

Visually? Finding the missing semi-colon. It should be missing from at most one line out of a hundred, and at the beginning of it, very easy to spot - it's a visible wart, as opposed to looking for a needle in a haystack.

Stylistically, I side with the no semicolon people. Grammatically they're redundant.

Pragmatically, if you ever find yourself saying "read this enormous, hard-to-read spec just so you can identify the three or four edge cases that will burn you (yet that you remain statistically likely to commit)" you are on the losing side of history.

Basically, it's preposterous to have reasonably smart people seriously argue that the option that is least user friendly and most likely to cause pain down the road is the best one out of a sense of misguided spec puritanism.

"In the absence of a quantitative engineering method with which to evaluate either approach, isn't it a purely personal and political choice?"

No.

Here's one idea: perhaps everyone's opinion is not equal on every subject. Perhaps by looking at the opinions of respected, accomplished people who have garnered the respect of other respected, accomplished people you can begin to see whose opinion should get more weight. This is the process that came up with the very ideas of things like "engineering" and "quantitative".

I know we love the young-upstarts-disrupt-the-old-way-of-thinking meme. That's why you framed it that way, with the folks vs. the traditionalists. You might want to remember that historically the young upstarts have been wrong 99% of the time. That's why it's news when they are right.

"One man's best practice is another man's anti-pattern"

And in the vast majority of cases, when you ask that question about a specific problem: One of those men is wrong. Even if they're both smart. But how to tell? If a huge number of other smart and experienced people have been convinced by one of them ... well I'd go with that guy. You won't be right all the time but it's the best system we've come up with so far for things that don't conform to double blind placebo controlled studies or mathematical proofs. It literally is the best way to be right most often.

>And in the vast majority of cases, when you ask that question about a specific problem: One of those men is wrong. Even if they're both smart. But how to tell? If a huge number of other smart and experienced people have been convinced by one of them ... well I'd go with that guy. You won't be right all the time but it's the best system we've come up with so far for things that don't conform to double blind placebo controlled studies or mathematical proofs. It literally is the best way to be right most often.

This reminds me a lot of abstraction in software systems. For 95% of cases, the abstractions make things quicker in the end. For those 5% of edge cases which the abstractions don't cover properly, you spend inordinate amounts of time trying to get things to fit. The bigger point here is that trying to white-wash anything with a particular methodology is both the right and wrong thing to do. It's right because most of the time it will work, it's wrong because when it doesn't work, you will have a hell of a time dealing with it.

The best system we have come up with so far is actually to take things on a case-by-case basis instead of trying to solve everything in one fell swoop. The problem is that one must be mindful as consistently as possible to achieve this. Most people are not capable of keeping that line of focus for an hour, let alone a day or their whole lives. Funnily enough, the solution to this is meditation, not a set of best practices or expert opinions.

I agree, but you're looking at this at a lower level of abstraction than I was. As you go from case to case you need some kind of methodology or general principles or best practices to guide you, you don't start from "I think therefore I am" for every problem.

And yet, somehow we've created science and a pretty large civilization by being able to argue points rationally and logically and not always being convinced solely by rhetoric. So my counterpoint to your "it does not reliably answer..." is the fact that we are having this argument on the fucking internet.

That's why this system relies on convincing smart, educated people and not just people in general. The need for intellectual discourse to avoid this exact problem and what that looks like is taught as part of being educated.

Separate vars is what I prefer to do, but JSLint actually recommends that you change multiple var statements to a single var with variables separated by commas (and with a semicolon at the end of course!). (And also all var declarations at the top of the function...).
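For concreteness, the two layouts look like this (JSLint of that era preferred the second):

```javascript
// Separate var statements: each declaration stands alone.
function separateVars() {
    var width = 3;
    var height = 4;
    return width * height;
}

// Single var, comma-separated declarations, semicolon at the end,
// hoisted to the top of the function -- the JSLint recommendation.
function singleVar() {
    var width = 3,
        height = 4;
    return width * height;
}
```
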

I use leading commas in any list that is not a one-liner in C++. This way the code looks pretty, and that is way more important than code that reads like English. :) (I shudder from dread every time I look at AppleScript...)

That example gist went a long way towards convincing me of isaacs' comma prefix style.

What convinced you? I read it and saw something wildly different, which is best reserved for a good reason. The reason appears to be visual recognition of delimiter mistakes that the parser will catch anyway.

The gist contains examples of various mistakes in both styles. Most of the errors are syntax errors which will be immediately rejected by a parser. While the comma-first errors are generally more obvious, some are silently bad, unexpectedly returning undefined. The standard "var" example is silently bad by leaking vars into the global scope when chaining initializers on a single "var". My conclusion from these samples is that comma-prefix formatting only practically helps to prevent hidden mistakes in chained "var"s. Putting each variable declaration on a line of its own is far less distracting than reformatting every list and object.

Sample comma first style error. This seems like a strange thing to do, but I'm unfamiliar with this style.

Who gets to play the role of expert? Presumably the "leaders in this language community" who "have given [everyone] lies and fear", according to Isaac's referenced blog post.

In the absence of a true quantitative engineering method, there is a middle ground other than personal choice: common anecdotes. Presumably, those shameful leaders have heard enough stories about the issue to make a reasonable recommendation, sort of like advising against shipping server software with an open default security strategy even though I don't have any peer-reviewed studies saying such is unwise.

"Between semicolon-free folks and JavaScript traditionalists, who gets to play the role of the expert? If Node.js is popular, does that mean that Isaac's comma-first style is "correct"? (https://gist.github.com/357981)"

No. Isaac doesn't define what style node.js is written in. And in an interview when he took on the lead developer role, he affirmed that he would abide by their current coding preference, which wasn't comma-first. That's Isaac in a nutshell: he works with the developers around him.

You may be mistaking Node.js for npm. npm is Isaac's code, and that is comma-first I believe. It's his project, and his rules, and his call as to what coding style he wants. People who participate in his projects either accept that, or walk away. Node.js is an existing project, so the existing coding preferences are in place -- unless, I guess, there's a general consensus that the coding preferences in place are causing an issue.

It's surprising that we haven't managed to create languages/editors which can break code down into atoms which can be presented however the developer working on said code likes. E.g. if a segment of code was tokenized properly and the editor could manipulate those tokens, coding style issues would become a thing of the past.

If you can express your style in a way a code-formatting tool can apply, then you can use checkout and commit hooks in your vcs to apply the formatting. Code in the repo gets a team-standard formatting, and code in your working copy gets your personal formatting. You just have to make sure the whole team uses the commit hook, at least. (Or in most systems you can apply that hook on your central/official repo.)

The only problem with this approach is the diffs between the repo and your working copy.

You can't solve everything with numbers, and I say this as someone who believes very strongly in data-driven decision-making. Even given perfect data as far as what practices are more effective, you still have the problem of interpreting it. This adds its own layer of purely personal and political choices.

My advice? Don't work with people whose best practices are your anti-patterns. Or at least strive to work with people who agree with you on a core set of principles. Or hell, work with people who agree with you on nothing. Just don't complain when you can't work with any of them. :-)

And besides that, what's wrong with "purely personal and political choice[s]"? Being that I'm the person who's writing my code, I'm allowed to make a certain number of purely personal choices. That is, as long as I take into account the political consequences of doing so. Put another way, it's ok to have personal preferences, but you have to take other peoples' feelings into account too.

The problem here is viewing software development as some sort of monolithic phenomenon. "Software development" is a crude moniker that covers an incredibly wide variety of efforts, varying in scale, detail, and significance across a much wider range than anything else we consider to be a single discipline. "Best practices" for development of an iphone game may not be the same as for flight software for an orbital rocket or for an international banking backbone.

Edit: to use an analogy, imagine if all vehicle development was treated with the same terminology. Whether you're building helicopters, spacecraft, mass produced commuter cars, RC toys, nuclear powered submarines, formula 1 race cars, or tunnel boring machines: all of it is just vehicle manufacturing right? Just as anyone would find it ridiculous to put forward advice that could possibly apply across all of those distinctly different disciplines we should view advice about software with the same skepticism.

The problem is that the industry, as a whole, is still extremely young, and simply hasn't had the time to develop a standard set of good, let alone best, practices. Tools and processes are changing at a rapid rate, and it will simply take time for best practices to sort themselves out.

>"Best practices" for development of an iphone game may not be the same as for flight software for an orbital rocket or for an international banking backbone.

I disagree. There are a few things that should be common across all of these, like not storing passwords in plain text.

JavaScript and semicolon controversy aside - I'm always happy when I read a post written in this style. It's clear (at least to me) that the author has taken the time to think about what they're saying, and they present clear arguments. I always hope that more people write this way; in the style of what I feel is real discussion.

First, Isaac Schlueter isn't advocating against best practices with ASI. He's advocating for an alternate best practice that you disagree with. In that very article he suggests an alternate place to put the necessary semicolons.

Second, the CouchDB thing isn't because of a lack of best practices, it's because the wrong best practices were misapplied. And in the end, its use of best practices prevented it from severely hurting CouchDB's reputation.

The best practice is that password data should be stored as a salted hash, so if the database is compromised, the original password can't be retrieved. It's rare that the salted hash is intended to be public information, and that's what the CouchDB people did.

What they should have done is realized that since they're deliberately sharing the hashed passwords, the original best practice doesn't apply, and they need to break it down and reexamine it (probably a good idea anyway for a project of its size). And they should have reached the conclusion that SHA-1 is much too weak and/or that the salted passwords shouldn't be shared. In the new version, CouchDB 1.2, where an effort to correct the problem was made, the passwords aren't public.

But at least, aside from the encryption strength, they got the hashing right, in that they used salts so rainbow tables can't be used. SHA-1 is easy to brute force for simple passwords, but as passwords get longer and contain more than dictionary words, it gets harder to brute force. It's easy to communicate that they screwed up, and easy to communicate that it's no worse than Sony password databases that were compromised, but the truth lies somewhere between the two, and can be seen by carefully considering the details of the case.

> The best practice is that password data should be stored as a salted hash, so if the database is compromised, the original password can't be retrieved.

The best practice is to use a purpose-built password algorithm like bcrypt. Algorithms like SHA-1 and MD5 are designed to run fast; even with salts they can be cracked by massively parallel hardware like a room full of GPUs.

Also relevant: Most best practices exist without explanation. We are supposed to just use them without question. I think that's what Isaac meant with "fears and lies" (not that I agree with his statement). Raymond Chen puts it well:

Good advice comes with a rationale so you can tell when it becomes bad advice. If you don't understand why something should be done, then you've fallen into the trap of cargo cult programming, and you'll keep doing it even when it's no longer necessary or even becomes deleterious.

The real question is why anyone takes obviously wrong artifacts in JavaScript, like omitting semicolons and the flagrantly misdesigned "this" binding, seriously. This isn't an issue of best practices or personal preferences; these are aspects of a ubiquitous language that are simply and unequivocally broken.

This just begs the question: they're broken, because they're broken? And anybody who thinks the feature is neat is wrong?

The reason people talk about omitting semi-colons in JavaScript is because they don't think that feature is unusable--they clearly think it's better than adding semi-colons everywhere! When reasonable people disagree like that, the situation is rarely simple and clear-cut.

Now, perhaps you could reasonably argue that the behavior is unfortunate. But you can also reasonably argue the converse. And, most importantly, it isn't immediately clear who is correct.

It has always been a mystery to me that reasonable people somehow think it is better to omit the semicolons where you can instead of just being 100% consistent by using them to terminate every statement. I have always thought that computers, programming, and logic were all about favouring consistency and simplicity. My mind boggles that otherwise talented people do not see the value in using simpler rules when they work, and in avoiding complex things that seem to offer somewhere between little and no extra value.

I guess I just don't get it because it has always seemed precisely a simple and clear-cut thing to me but somehow it is not.

Yes, they're wrong, because they make certain classes of errors impossible to catch automatically, and place that much heavier a burden on me to use the tool. You can do what you like in private, but forcing me to waste my mental energy because it makes you feel good about yourself to be able to manage lots of complicated trivia? Should I be accepting if you wanted me to use a hammer with a spike on the side of the handle?

I consider myself a beginner in Javascript and did not find Isaac's post the least condescending. I actually learned a few things and am very glad he took the time to explain how things work. Because honestly, I won't read the ECMAScript spec unless I need to.

I'm actually one of the people who never use semi-colons to terminate statements. That's for personal projects. Where I work, we (as a team) decided to use them. And that's fine, but I prefer not to use them.

If I could I would up-vote this post twice at least! And it is not the Java, node.js, where-do-you-put-your-semicolon part of it. I don't understand any of it. No, it is more the view on best practices. And there, I completely agree.

I haven't been part of HN for very long, but what strikes me are the similarities between supply chain management and programming in some aspects. The purely technical issues are out of scope here since I cannot judge them; it's more about the principles behind them. Like in this case here.

Best practices provide a guideline and are condensed experience. They are by no means the only or best solutions, rather ones that can be applied in a lot of cases without screwing up. That said, one should also reach a point where one understands them. One has to in order to know when to use them; most people get at least to that point.

What you actually have to do is reach an understanding of not only when to use best practices but one of knowing why you use them. This is not true mastery, but it will get you far enough.

Ultimately, you need people who know when to use a best practice, when to ignore them, and when to actually break conventional wisdom. What I see a lot lately is people doing the last two bits without having understood the best practice's when and why in the first place.

For me, that's the key message of this post, and this point is valid in every industry, sector or task at hand.

>By couching it in these terms, it implies that anyone who follows best practices has given in to “lies and fear!” Who wants to be swayed by that?

That is some fairly brazen rhetorical sleight of hand. Schlueter is obviously not attacking the following of best practices. He is saying that semicolons-everywhere isn't a best practice and he gives an argument that the most common rationales given for the practice do not hold water. There are a few flaws with his argument, but what it certainly is not is a general condemnation of the following of actual, bonafide best practices.

Pretending otherwise is akin to accusing someone of "siding with the terrorists" because they argue that people should be allowed to take bottled water on airplanes.

Having too many semicolons will never ever cause you a problem. Having too few almost certainly will. Why is this an argument? Don't we have enough bugs to deal with? Just use the semicolons, or get out of my way.

Most arguments about style are almost certainly pointless, but if you are faced with a choice between two styles, one of which may cause a problem sometimes if you don't know what you are doing, and the other style, which will never ever cause you a problem, it seems the choice is kind of obvious, no? Particularly since the guy doesn't really make a case for what his objection to semicolons actually is to begin with. He doesn't like the look of them?

Many best practices work well because they mean you don't ever have to think about uninteresting problems (like ASI) ever again. I know the rules of ASI but I honestly have never had to think about them, because I don't need to: I just put a semicolon at the end of my lines.
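The sort of ASI rule you never have to hold in your head if you just terminate statements yourself; the classic example:

```javascript
// ASI inserts a semicolon immediately after "return" when the line
// ends, so this returns undefined and the braces below parse as a
// block containing a labeled expression -- dead code, no error.
function broken() {
    return
    {
        ok: true
    }
}

// Keeping the brace on the return line (and ending statements with
// semicolons) means never thinking about the rule at all.
function fixed() {
    return {
        ok: true
    };
}
```
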

If you ever have to do network and systems administration you will come across loads of best practices, because needing to memorize every last rule and edge case of all the software on your systems would be truly maddening.

That said, it is always good to know the why behind best practices and not follow them blindly.

Best practice is about a practice that helps create maintainable code. Code is maintained by people. So a best practice is really about getting the best code delivered by the people you have in the most appropriate time possible, and code that can be maintained by that same group of people and by others who may be brought into the project at a later date.

Insisting on semicolons means that the bar to entry for being able to develop and maintain a JavaScript code base is lower. Also, we can spend more time thinking about the purpose of the code rather than its syntax. That helps when it comes to debugging: spending more time thinking about what the code is trying to do rather than losing time figuring out whether the syntax without semicolons is causing an unintended side effect.

The lack-of-semicolons argument reminds me of the Python versus Java argument. Choosing Python gives you a much smaller range of people who could work on your project, but on average the quality of that developer would be better than if you opted for Java.

The advantage of the non-semicolon stance is that they are increasing the minimum level of competency before someone can work on their project. The disadvantage of the non-semicolon stance is that they are increasing the minimum level of competency before someone can work on their project.