That's because Math.max and Math.min don't return the biggest and smallest numbers possible, as this seems to imply (that's what Number.MAX_VALUE, Number.MAX_SAFE_INTEGER, and their MIN variants are for).

They return the max and min of the numeric arguments you give them, e.g. Math.max(1, 3, 2) === 3. And as anyone who has ever implemented a max function knows, a default value of -Infinity for max actually does make some sense.

Hopefully small numbers make more sense.

js> 0.1+0.2==0.3
false

This will be true in any language in which expressions like 0.1 are floating point numbers. Not unique to JavaScript at all.

The type of not a number is, well, a number. Sensible.

js> typeof NaN
"number"

Again... NaN is a floating point value. This is going to be true in other languages too. For example in Ruby:

> Float::NAN.is_a?(Numeric)
=> true

Yes it's funny that something called Not-a-Number is a well-defined floating point number, but welcome to computer science.

> That's because Math.max and Math.min don't return the biggest and smallest numbers possible as this seems to imply (that's what Number.MAX_VALUE, Number.MAX_SAFE_INTEGER, and its MIN variants are for.)

The minimum of an empty set is defined as +∞ since min(x, +∞) === x for all x; conversely the maximum of an empty set is defined as -∞. If you check you'll confirm that Math.min() returns Number.POSITIVE_INFINITY and Math.max() returns Number.NEGATIVE_INFINITY as expected.

The ECMAScript spec does in fact define numbers as IEEE 754 double-precision values, and in practice IEEE 754 is the overwhelmingly dominant implementation of floating point anyway. Every single engineer and developer in the industry needs to know 754's limitations.

A lot of these are implicit conversions: to string, but also from boolean to number. JavaScript tries to always do something with any expression you give it, even when it doesn't make sense.

However, what I still find confusing is the whole object wrapper thing. I think "string" instanceof String being false is because of that. Note that String("string") instanceof String is still false, since String() without new returns a primitive; only new String("string") instanceof String is true. This one I think is just plain nuts.

Oh, and the NaN === NaN one is just dumb. This is specified in the IEEE standards, and every programming language should give the same outcome. Instead he should have done something with signed zero (I forget what the rules are, but those are maybe the weirdest ones).

JavaScript strings are primitive types and not objects therefore it does not have a prototype that matches String.prototype which is how instanceof works. However, it will readily convert strings into String objects. You can actually capture and output the result of the coercion by adding a function to String.prototype that just console.logs "this".
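A sketch of the primitive-versus-wrapper distinction; whoAmI is a made-up method name for illustration:

```javascript
console.log("string" instanceof String);             // false: a primitive
console.log(new String("string") instanceof String); // true: a wrapper object
console.log(String("string") instanceof String);     // false: without `new`,
                                                     // String() returns a primitive

// Observe the boxing that happens on method calls: in sloppy mode,
// `this` inside the method is a temporary String wrapper object.
String.prototype.whoAmI = function () {
  return typeof this; // "object" in sloppy mode, "string" in strict mode
};
console.log("string".whoAmI());
```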

Most of JavaScript's quirks come from implicit type coercion. Any time you do an operation on two different types, JavaScript will convert them to the same type, leading to strange results if you don't understand how implicit type coercion works in JavaScript.

One notable example given in the article:

> true + true === 2

The + operator on booleans implicitly converts the operands to numbers. true converted to a number is 1, and 1 + 1 === 2. Same with subtraction: true - true === 0. That is fine and reasonable, assuming implicit type coercion.

> true === 1

However, this being false makes sense because === also checks for type equality and does no implicit coercion. A boolean is not a number, so it makes sense for this to be false.

Many other operations, like adding arrays together, result in coercion to a string. Same with a loose-equality comparison between an array and a string: the array gets converted to a comma-separated string.
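All of the cases above run as plain, standard JavaScript:

```javascript
console.log(true + true);     // 2: + converts both operands to numbers
console.log(true === 1);      // false: strict equality never coerces
console.log(true == 1);       // true: loose equality does
console.log([1, 2] + [3, 4]); // "1,23,4": both arrays become strings first
console.log([1, 2] == "1,2"); // true: the array is coerced to "1,2"
```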

So yeah, these things, while not making logical sense, are just an artifact of JavaScript's type coercion system. Once you know and understand the semantics of its type coercions, all these silly quirks make sense, and you can really see how obscenely simple the typing system is: at the end of the day, under the hood, everything coerces down to either a simple string or a float in JavaScript.

He didn't write the correct comparison, which would have been x !== x (which returns false). He wrote x == !x, which is the same as (x) == (!x). He's performing a not operation on the x. Obviously (x) !== (not x) returns true. He's taking a truthy value and negating it before comparing.

He also plays around with === and == which is confusing to a novice but by design.

I stopped reading after that. It seems like he's trying to trick the reader rather than teach fun quirks.

> He didn't write the correct comparison which would have been x!==x, which returns false. He wrote x==!x which is the same as (x) == (!x). He's performing a not operation on the x. Obviously (x) !== (not x) returns true. He's taking a true and then nullifying it twice.

I'm guessing it looks at the !x and decides we're talking about the array instance, because it doesn't make sense for us to do !x when we mean !x[0]; but when we do a comparison, it thinks we want the element, because there's only one, and JavaScript sometimes infers with one-element arrays that you want the element, not the array.

So the array instance is truthy and the 0 is falsy. Negate the truthy and you have falsy == falsy.
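For what it's worth, the loose-equality algorithm in the spec coerces step by step rather than inferring anything about one-element arrays. A sketch of the steps for [] == ![]:

```javascript
const x = [];
console.log(!x);         // false: an array (even an empty one) is truthy
console.log(x == false); // true, via these coercion steps:
//   [] == false  →  [] == 0   (the boolean is converted to a number)
//   [] == 0      →  "" == 0   (the array becomes a primitive via toString)
//   "" == 0      →  0 == 0    (the string is converted to a number) → true
console.log(String(x));  // ""
console.log(Number("")); // 0
```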

I'm gonna hang my head in shame and then read the rest of that article because that's pretty cool.

I am not saying this is the "right" behavior, as different languages/libraries make different choices and we just get used to them.

I just find raising a TypeError odd because no other Array method does: JavaScript will happily let you call slice or join on an empty array, or get its nth element.

Compare with, e.g.:

- Python, which will raise an error if you reduce an empty list, but also if you try to get the nth element
- Ruby, which will return nil if you reduce, but also nil if you access the nth element
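Concretely, in JavaScript the TypeError only occurs when reduce is called on an empty array without an initial value:

```javascript
// With no initial value, there's nothing to seed the accumulator with.
try {
  [].reduce((a, b) => a + b);
} catch (e) {
  console.log(e instanceof TypeError); // true
}

// Supplying an initial value avoids the throw entirely.
console.log([].reduce((a, b) => a + b, 0)); // 0

// And, as noted, element access on an empty array never throws.
console.log([][0]); // undefined
```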

I hate JavaScript. Yes, it runs everywhere and all browsers support it. But man, couldn't they have been a little smarter and added some structure to it? If they had, working with it would be way easier.

I'm a Linux guy, but I think Microsoft's TypeScript is an answer to that. The syntax tracks future JavaScript standards, with modules, imports, static typing, and everything, and out you get normal, readable JavaScript. It works everywhere Node.js works, because the compiler is written in JavaScript. There are add-ons for type completion for VS, Eclipse, Vim...

> But man couldn't they been a little smarter and added some structure to it?

When I learned JavaScript (for reals), I spent literally hours desperately searching for a way to import files. Even CSS lets you do that. I thought I must be doing something really stupid, not using the right words, or something like that. I couldn't believe that there isn't a built-in way to do that.

> When I learned JavaScript (for reals), I spent literally hours desperately searching for a way to import files. Even CSS lets you do that. I thought I must be doing something really stupid, not using the right words, or something like that. I couldn't believe that there isn't a built-in way to do that.

This is why there's such a big clusterfuck of an ecosystem of module systems for JavaScript. People (including myself) are starting to use ES6 modules, which can be compiled down to existing module systems, but that has its issues too.