My argument against JavaScript would be everything regarding “this” and what it points to at any given time.
My gripe with C/C++ would be its lack of a sensible string library, though my biggest gripe is a meta grudge against OpenGL and its render pipeline goodness.

In general, though, I think the intrinsic problem is that these languages are built by engineers/computer scientists, for engineers/computer scientists. Being in a field that demands very end-user-friendly systems, coding in these overly convoluted languages full of little “gotchas” and edge rules has given me no small dose of irony.

JavaScript is a b*tch for maths indeed. That error over there is what ACTUALLY HAPPENS, it’s so frustrating xD
Not sure about the PHP “not a bug” one over there though; never came across that thing.
And I still haven’t learned Java.
I’m so proud of myself for understanding enough to laugh some more at this strip.

Or F#. Admittedly, it has occasional insanity of its own, but that’s usually a deliberate trade-off to remain highly compatible and connected with the CLI and its libraries, or mix in imperative code where the compiler’s ability to optimize declarative code doesn’t suffice yet.

That Ruby code doesn’t compile. In fact, that joke isn’t really funny, considering most Ruby developers are all about writing elegant, beautiful code. (Perl, on the other hand… looks the same both encrypted and unencrypted.)

If you’re going to make fun of Ruby, you’re better off finding the bitchfests on GitHub about whether to use inject or reduce (two names for the same method).
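
For context, `inject` and `reduce` really are aliases for exactly the same Enumerable method, so the whole argument is about naming and nothing else:

```ruby
[1, 2, 3, 4].inject(:+)                   # => 10
[1, 2, 3, 4].reduce(:+)                   # => 10
# Identical with an explicit block, too:
[1, 2, 3, 4].inject { |sum, n| sum + n }  # => 10
```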

@ bloodycelt:
Ruby code does compile, does what it is intended to do (yes, I’ve just checked), does look out-of-this-world, and will cause your PM to haunt you till the end of time if you commit anything like this to the project repository. Check this gist: https://gist.github.com/qrohlf/7045823

There are many other valid reasons to be mad at JavaScript, but that ain’t really one. You might rather ask why the coder is using a string for their calculations in the first place (i.e. didn’t make sure they were dealing with numbers, relying on implicit casting instead).
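
A sketch of the difference, with a made-up variable name:

```javascript
const raw = '5';    // e.g. a value straight out of a form field
raw - 3             // => 2, but only because '-' happens to coerce
raw + 3             // => '53', because '+' concatenates instead
Number(raw) + 3     // => 8: convert first, then do the maths
```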

Everything wrong with Haskell in a single sentence: “Functional programming combines the flexibility and power of abstract mathematics with the intuitive clarity of abstract mathematics.” ;-P ( https://xkcd.com/1312/ )

PHP:
Learned that back in the day. It is fun.
Until you realise that we invented strong typing for some very good reasons.
And that a language for dynamic websites without proper Unicode support is pretty useless.
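
A couple of the classics behind that realisation, all reproducible in a PHP shell (the Unicode one needs the mbstring extension):

```php
<?php
// Loose '==' compares numeric-looking strings as numbers:
var_dump('1' == '01');       // bool(true)
var_dump('10' == '1e1');     // bool(true)  (scientific notation!)
var_dump('0e12' == '0e34');  // bool(true)  (both parse as zero)
var_dump('1' === '01');      // bool(false) (strict comparison behaves)

// And the Unicode gripe: core string functions count bytes, not characters.
var_dump(strlen('héllo'));     // int(6) in a UTF-8 file
var_dump(mb_strlen('héllo'));  // int(5), via the mbstring extension
```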

Javascript:
I’ve not used it before, but I’ll give it a try.
My first thought was ASCII codes (for the record, ‘5’ is 53 and ‘3’ is 51), but that turns out to be a red herring.
What actually happens is implicit type coercion: `-` only works on numbers, so `'5' - 3` converts the string to a number and gives 2, while `+` doubles as string concatenation, so `'5' + 3` converts the number to a string instead and gives `'53'`.
And if the strip’s 50 came from something like `'5' + 3 - 3`, that fits the same pattern: concatenate first (`'53'`), then subtract.
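
A quick console check of all of the above:

```javascript
'5' - 3            // => 2    ('-' is numeric-only, so the string is coerced)
'5' + 3            // => '53' ('+' doubles as concatenation)
'5' + 3 - 3        // => 50   (concatenate to '53' first, then subtract)
'5'.charCodeAt(0)  // => 53   (the ASCII codes play no part in any of this)
```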

Ruby:
Again something I have not used before.
I think it’s creating an array of integers, then mapping them to characters and joining them together?
Indeed, that looks like you translated the Brainfuck “Hello World” program to Ruby.
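
That guess is right. Stripped of the obfuscated variable names, the trick boils down to something like this (my reconstruction, not the gist’s actual code):

```ruby
codes = [72, 101, 108, 108, 111]  # character codes for "Hello"
puts codes.map(&:chr).join        # prints "Hello"
```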

Java:
I am rusty with Java, but C#/.NET programming is pretty similar in the end.
I know the Factory pattern and the singleton pattern.
I know that every singleton must use a factory method to hand out its instance (but not vice versa); see the sketch after this comment.
Abstract/Superclass is pretty clear.
Proxy in this context I have to look up.
And I have no idea what JavaBeans are or why they need any of the above.
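
To make the singleton-needs-a-factory point concrete, here is the textbook Java shape (the class name is made up for illustration): the constructor is private, so the static factory method is the only door to the instance.

```java
public final class Settings {
    // The one and only instance, created once when the class loads.
    private static final Settings INSTANCE = new Settings();

    // Private constructor: nobody outside can call 'new Settings()'.
    private Settings() { }

    // The static factory method every singleton carries:
    // callers write Settings.getInstance(), never 'new Settings()'.
    public static Settings getInstance() {
        return INSTANCE;
    }
}
```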


Yeah, `this` in JavaScript is pretty weird, but even if it’s not intuitive, at least it’s easy to understand. `this` within a function, unless explicitly bound to something else, points to whatever object you called the function on (call it bare, and `this` falls back to the global object, or `undefined` in strict mode). So if you call `object.func()`, `this` inside `func` will point to `object`. That’s it; simple.
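
A sketch of those rules, with made-up names:

```javascript
'use strict';

const counter = {
  count: 41,
  next() { return this.count + 1; }
};

console.log(counter.next());    // 42: called on counter, so this === counter

const f = counter.next;
// f();                         // TypeError: this is undefined in strict mode
                                //  (the global object in sloppy mode)

console.log(f.call(counter));   // 42: this supplied explicitly
console.log(f.bind(counter)()); // 42: this fixed once and for all
```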

In contrast, the rules for automatic type conversion in JavaScript are completely insane and not easy to learn without memorizing a whole list of rules for precedence and the like.
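
A few real console results that show how deep that rabbit hole goes:

```javascript
[] + []      // => ''                  (both arrays coerce to strings)
[] + {}      // => '[object Object]'
1 + '2'      // => '12'  ('+' prefers concatenation)
1 - '2'      // => -1    ('-' forces numbers)
[] == false  // => true  ([] -> '' -> 0, and false -> 0)
'\t' == 0    // => true  (whitespace-only strings coerce to 0)
```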


She’s using non-printable characters as variable names in that Ruby gist. Ruby supports full Unicode in its source code. Honestly, this isn’t a real problem with Ruby, just with the way she’s using it. She probably could have come up with a better reason to hate Ruby.

@ Oleg Oshmyan (Chortos‑2):
Not if you have a Unicode non-breaking space as a variable name.

Ah! I did copy the program from the gist, but I should’ve known better than to select all and copy from the Web browser.

I feel a bit ashamed; this isn’t my first time seeing non-breaking spaces abused, and I have abused them myself (not in my source code, mind), and I generally use them (and all sorts of Unicode characters) in my writing.