JavaScript Doesn’t Need Class

JavaScript is the single most important programming language, and yet, just as it reaches its high point, everyone is complaining about it and there are significant efforts to replace it with something better – Go, Dart, CoffeeScript. Even the people who love it seem to misunderstand it, because they want to add “class”. JavaScript doesn’t need class! And if you think it does, you need to look more carefully at JavaScript.

I often have conversations with JavaScript programmers and make the point that the language needs to be treated as itself rather than as a poor relation to Java or some other language. They usually agree vigorously and then go into details about how some framework or other gives them facilities just like class and inheritance – just like Java or C# or… Well you get my point, or I hope you do.

JavaScript really is a very different language.

It is one of the few mainstream languages that doesn’t simply mimic the C/C++ way of doing things. It is also quite different in its treatment of objects. The problem is that programmers tend not to spend the time to find out what these differences are. They simply notice that it doesn’t have “class” and doesn’t seem to have inheritance, and then they spend lots of time inventing ways to add them or, worse, they propose an upgrade to the language that adds them.
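To see how different the treatment of objects is, here is a minimal sketch of prototype-based objects in plain JavaScript – no class keyword anywhere. The names `animal`, `dog` and `speak` are purely illustrative:

```javascript
// An ordinary object serves as the prototype.
const animal = {
  speak() {
    return this.name + " makes a sound";
  }
};

// Object.create builds a new object that delegates to `animal`.
const dog = Object.create(animal);
dog.name = "Rex";

// `speak` is found on the prototype via delegation, not copied onto `dog`.
console.log(dog.speak()); // "Rex makes a sound"

// The only property `dog` itself owns is `name`:
console.log(Object.getOwnPropertyNames(dog)); // ["name"]
```

There is no class to instantiate and no inheritance hierarchy to declare – one object simply delegates to another, and the prototype can even be changed or extended at run time.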

First off, it isn’t clear that “class” is the best way to work with objects, and it certainly isn’t clear that class-based inheritance is even good, let alone the best.

We are simply following the conventions of what we already know when we identify these omissions in JavaScript. Class-based programming was invented to make compilers work better and to make errors detectable at compile time rather than at run time.

The argument goes that in a strongly typed, class-based language you can’t assign apples to oranges because they are different types. This is fine, but the real question is why any programmer would make the mistake of assigning apples to oranges, and what it is about the way apples behave that makes this even thinkable. Is it that perhaps apples have a method in common with oranges that suggests they can be used in the same way? In which case, perhaps the language should make an attempt to work out what the programmer intended rather than just flagging a type error.