Posted
by
timothy
on Saturday May 07, 2011 @06:44AM
from the browser-as-universe dept.

mikejuk writes "JavaScript is currently an important language — possibly the most important of all the languages at this point in time. So an impromptu talk at JSConf given by the creator of JavaScript, Brendan Eich, is not something to ignore. He seems to be worried about the way committees define languages and wants ordinary JavaScript programmers to get involved."

possibly the most important of all the languages at this point in time

Not so sure I'd agree with that summary - I don't doubt the importance of JavaScript to the modern internet but I'd be more inclined to consider the C's of this world as the main foundation of the industry.

Just wait until you've had to fix your first Node.js and MongoDB disaster. I'm working with one client to get rid of such a system. It is by far one of the worst gigs I've ever had, and I've had to clean up a whole lot of stupid shit before.

JavaScript barely works as a client-side scripting language, and even then the experience is totally shitty for developers and users alike. Slashdot is a really good example of how JavaScript can absolutely fuck up a site unnecessarily.

But it has absolutely no place for server-side development. It's just not up to the task in any way. It's missing basic language features necessary for large-scale server-side development. Its development tools are atrocious. Its runtime performance is horrible. Node.js is fucking stupid, and that's putting it nicely. Using it to query a data store is an extremely idiotic idea. All in all, it's a massive failure.

JavaScript "programmers" have put together some of the worst and most broken systems that I've ever dealt with, and I've been dealing with horrible systems written using languages like PHP, Visual Basic, PowerBuilder and Perl. JavaScript may be one of the biggest computing disasters of all time.

I'm not sure if you're going for funny or not -- Just to clarify, CGI was traditionally done via C. Apache is written in C. To this day, I still write processor-intensive server-side code in C or C++ (with a few C libs to support cross-platform code & CGI) -- Even dinky hosting services like 1&1 offer remote SSH, have C/C++ compilers installed (G++, GCC), as well as Git.

I wouldn't develop on any system that doesn't at least support this minimal setup -- for web development or otherwise...

It's not cross platform -- The amount of conditional cruft you have to add to ATTEMPT a cross-browser solution is ridiculous, so much so that there are entire libraries and frameworks for client side JS just to get most of the way there, and even then, some browsers are left behind.

It's not sand-boxed -- Modern browsers compile JS to machine code and run that... Because the language requires features that make it slow, to do it any other way (bytecode in a VM), is terribly slow.

I use JS, but it's not all it's cracked up to be... Most devs I know only use it as a client side language because it's available -- not because the language is so great.

It's not cross platform -- The amount of conditional cruft you have to add to ATTEMPT a cross-browser solution is ridiculous

That's not the problem of javascript - that's a problem of implementation. Do you really think it would be any better if someone had invented SomeBetterScript back then - and then MS made EvenBettererScript, which would be almost, but not completely, unlike SomeBetterScript, and then Mozilla added their own extensions, and then other browsers implemented those extensions in incompatible ways, adding some of their own in the process, and...

You see what I'm talking about? JS by itself is quite a nice language - the web client bindings for JS are awful thanks to all the implementors.

And yes, libraries and frameworks are a good thing and they do make browser JS cross-platform - think about how AWT/Swing/SWT makes Java cross-platform and what would happen if you had to have your own bindings and workarounds.

It's not sand-boxed -- Modern browsers compile JS to machine code and run that.

And that's pure bullshit. "Compile to machine code and run that" has nothing to do with sandboxing - that's what all the languages that give a bit of concern about performance do, after all. Please come back when you learn the difference between "sandboxing vs non-sandboxing" and "interpreting vs JIT compiling".

I disagree. The web browser has to parse all of that JavaScript every time it loads a page that uses it. In most cases it's not even really necessary, as the JavaScript only uses a couple of library functions that could easily be written yourself (like XmlHttpRequest).

Virtually every device has substantial amounts of code written in C or C++. Javascript would be useless on the microcontroller I write C code for. If C and C++ were to vanish overnight we'd be back in the stone age. I won't comment on whether C and C++ belong in the stone age, but it's great that many programmers don't have to think at the lower levels of machine abstraction.

Virtually every device has substantial amounts of code written in C or C++. Javascript would be useless on the microcontroller I write C code for. If C and C++ were to vanish overnight we'd be back in the stone age. I won't comment on whether C and C++ belong in the stone age, but it's great that many programmers don't have to think at the lower levels of machine abstraction.

Well, think about this for a second.

Of course most compilers for embedded systems and small microcontrollers are C

However, I wouldn't put it past someone to take a subset (or just core JS and some metadata) and write a compiler. And I think it would be very good.

Of course, the processor would have to have at least some hundreds of KBs of memory, but I think it would be amazing.

Think about this: JS is already very similar to C. Add arrays, maps and first-class functions.

JavaScript isn't even that important to the modern Internet. It's pretty isolated to the Web, and even there it's only seriously used by a small number of sites. It just gets a lot of undeserved hype.

Indeed, C and its derivatives and related languages are in fact the main foundation of virtually all software. For every line of JavaScript in a given web site, there will be hundreds, if not thousands, of lines of C or C++ code doing the real work within the JavaScript interpreter, the web browser, the client'

JavaScript isn't even that important to the modern Internet. It's pretty isolated to the Web,

Yup, and the web isn't very important to the modern Internet at all.

and even there it's only seriously used by a small number of sites.

Just a few tiny, insignificant ones like Slashdot, Google (Docs/Maps/GMail) and any other website that contains anything more interactive than a form submit button. Except the ones that use Flash (but then the ActionScript language used by Flash developers is a superset of ECMAScript.) - or Java, which really is "only used seriously by a small number of sites" (for given values of "small" and "seriously").

It's also the only game in town if you want to target iOS, Android and desktop browsers with the same codebase. Meanwhile, Java's star seems to be falling, and .NET/C#/VB (however well respected) are effectively Microsoft-only.

For every line of JavaScript in a given web site, there will be hundreds, if not thousands, of lines of C or C++ code doing the real work within the JavaScript interpreter

Well, yes, that will be true of any "scripting" language.

The statement in TFA that Javascript is "possibly the most important of all the languages" is flamebait, but your position is equally absurd.

The "contest" is probably Javascript vs. Python/Ruby/Perl/PHP. ("CoffeeScript", mentioned in TFA seems to be an effort to make JavaScript look more like the first three of those to appease the haters of curly brackets - where's the campaign to make Javascript look more like PHP, I ask !?:-) ).

...and many people locate the BitTorrent they want to use by searching on the web.

Then there are more traditional uses like FTP

...and many people locate the file they want to download by FTP by following a link on a web site (assuming they don't download it using HTTP).

and email.

Which many people now access via a webmail application such as Gmail or Outlook Web Access - and while they aren't going to supplant email anytime soon, people are increasingly using social networking sites like Facebook and Twitter to communicate.

Voice and video teleconferencing are becoming ever more prevalent. Then there's also gaming.

...and people don't use the web at all to locate people, find game servers, find out about games or even play them on line?

Don't forget DNS. And there are many other more technical uses that I know you won't be familiar with.

Actually, I've been using the Internet since before the web existed, and I've even written POP and SMTP clients in lovingly hand-crafted C so cut the patronising crap - I do actually know the difference between the web and the Internet. The argument was whether the Web was an important part of the Internet - not whether it was the only use.

Sites that use Flash and/or JavaScript heavily tend to be rather useless. Slashdot has gotten progressively worse to use as more JavaScript has been introduced.

OTOH, sites like Google Maps and Docs use it to great effect. I'd agree that Slashdot is a less than stellar example (and I'm not quite sure why it needs so much scripting to do what it does).

Likewise, Flash doesn't really add anything useful to the table.

Vector graphics and object-based animation that scale nicely without having to be coded from scratch? It's particularly suitable for things like online tests and educational applets. Again, it can be abused by using it for things that could/should be done in plain old HTML - and its use for animated/interactive ads may be annoying, but that doesn't make it insignificant. Plus, all the people flaming iOS because it doesn't support Flash presumably think it's good for something. For my money, it ought to be replaced by HTML5+SVG+DOM+CSS+AJAX+Javascript in the long term, but the development tools aren't there yet.

We could download and play games long before Flash existed.

In a format that would run unmodified on Windows, Mac, Linux and some mobile devices? Well, yes, there is Java - although I've found Flash to be more consistent cross-platform and easier to deliver (the plug-in is a simple download which most people already have, and it's trivial to package Flash as stand-alone .exe or .app files that run without plugins) and Flash's graphics engine is perfect for simple 2D games. Java may be better for complex stuff like Minecraft, but if I wanted to write a poker app I'd choose Flash (until/unless SVG is properly supported across browsers). Plus, Flash is biggest in "on line" games like Farmville, which are tied to web-based social networking.

We could stream videos using RealPlayer and other technologies long before YouTube existed. In fact, those real applications are often much more effective to use than the Flash- or JavaScript-based "equivalents".

Sometimes the issue is not just technical. Macromedia/Adobe give away the player plug-in, make their money selling tools to content creators and only bug users when an update to the player is available. Real was continually trying to push its premium media player software and content on users. You could tell users to go install Flash player without them coming back and asking if they had to pay (because Real had made the "Free Playe

Why measure importance based on bits transferred, what about time spent?... I think human minutes is a better measure of importance than bits transferred.

Well, maybe, but we should be careful with such comparisons. It strikes me as similar to the common practice among management of measuring programmer productivity by counting the number of lines of code produced. It's hard to imagine any worse measure than lines of code, but "time spent" could be a good challenger. Do we really want to encourage management to measure our productivity by time spent?

After using JS on any number of projects, I'd have to say that the time I spent versus the useful results

Your issue there isn't with JavaScript. It's with the lack of a standardised DOM.

Anyway, all the arguing about how good JS is is irrelevant. You're right. As programming experiences go, developing web apps is pretty shit compared to desktop apps. But the real discussion here should be about its claimed importance, not its merits.

"At this point in time" however, the Cs are just doing what they have been doing for decades, whereas JavaScript is becoming a more and more important part of rich, highly cross-platform applications. C is good for that too, of course, but it tends to just be part of the background implementation just now. It is generally not a driving force or limiting factor in how we choose to implement high level applications, whereas JavaScript is.

Already other comments are streaming in (dynamically via JavaScript!) pointing out how basically all devices have software written in C at some level. I know this, the submitter probably knows it, and it doesn't change which is more important right now. For example, JavaScript has done more for making Linux viable on the desktop than C or Java ever has. So many apps these days can be written as web apps, and run on any OS and any hardware, as long as they have a decent web browser. It is currently changing, and will continue to change how we use our computing devices.

JavaScript has done more for making Linux viable on the desktop than C or Java ever has. So many apps these days can be written as web apps, and run on any OS and any hardware, as long as they have a decent web browser.

And that's great for them, but the fact that so much of our time is now spent on web based services is what is making it easier for people to move away from Windows if they wish, and JavaScript is the thing that is making these sites pleasant to use (perhaps not pleasant to create of course, but at least you only have to maintain one version rather than many apps), when done correctly. I like that Facebook works the same on Windows, Ubuntu and my Android devices. I much prefer the desktop version of Facebo

Since your comment was so entertaining, perhaps you will tell us why you won't subscribe, if it doesn't come down to simple incompetence.

It wasn't my comment that you replied to, but I can explain why I don't subscribe to a lot of sites that I read regularly. One of the main reasons is that my file of sites/logins/passwords has grown to over 200 entries, and I'm starting to consider this a major security issue. Someone who gets their hands on this file could become a real PITA in my life. So I'm looking for ways to minimize the possible damage from this incoherent pile of security data.

mod u up. Must have been a web developer who summarized the article. It's so irritating to me that, with the advent of Web 2.0, everyone is so focused on the web. Not the internet, the web. As a result I feel like innovation and creativity have been lost in a sea of AJAX, PHP scripts and social networks.
I'm very sad to see the commercialization of the internet and seeing us so focused on just this one aspect of computing - which is really just the GUI to the net. I'm hoping things like the Kinect will stir

C (and its derivatives) power almost all web servers in use today, and without servers, there's no web.
Most browsers are written in C++ today, JS saw the light of day working inside browsers.
A lot of programming languages have their main implementations written in C (Ruby, Python, PHP).
So, yes, C/C++ are still important in the days of the web.

While the JavaScript language, development environments and implementations are absolutely terrible, as I see you're well aware, those are not the worst parts of it all. By far, the community is the most atrocious thing related to JavaScript. The people are generally nice enough, but my gosh, are they ever ignorant when it comes to computing.

JavaScript tends to drive away everyone who is even remotely a good programmer, as such people can usually see just how flawed JavaScript is, and they want nothing to do with it.

I doubt that web browser developers are very inclined to put massive work into coding a Python interpreter and optimizing it for the web. The power of JS comes from the interpreters that web browsers have today. A Python framework in the browser would be a new start, and the browsers' interpreters would give you more trouble than the JS language gives you now. Of course, in the long run it would be better. But JS does the job, so why should devs bother if there is no apparent need for other scripting languages.

I know this may be considered radical and groundbreaking for those who design the language, but perhaps putting in some way of letting the developer decide whether he/she wants to copy an object or just create a new reference to it when doing assignment?

Yes, I just re-read what I've been posting here and I'm not sure what exactly I was trying to argue. I'm going to blame my hangover and the fact that I spent several days last week battling datetime handling in JavaScript (which I will maintain is an abomination simply on the grounds of not supporting ISO-8601 formatted dates in a world where everything else uses ISO-8601).

I'm actually baffled by the fact that I managed to crank out two posts before realizing that I was rambling incoherently. Now I'm going to go drink some water and try to make my headache go away...

You're arguing for value semantics (as in C++), as opposed to reference semantics (as in Java, Python, and JavaScript). In the latter languages, what many programmers think of as objects are really references to objects. In your code, myObj and newObj are two references that point to the same single object. Don't feel bad -- I talk to many Java programmers who still don't quite grasp this concept. Good old pointer diagrams make it clear; get a book that shows references as boxes with arrows that point to the objects they refer to.
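A minimal sketch of the distinction, reusing the myObj/newObj names from above (the original code being discussed isn't shown here, so this is a reconstruction):

```javascript
// Reference semantics: assignment copies the reference, not the object.
var myObj = { n: 1 };
var newObj = myObj;    // newObj now points at the SAME object
newObj.n = 2;
console.log(myObj.n);  // 2 -- the change is visible through both names

// If a copy was wanted, it has to be made explicitly (shallow copy here):
var copyObj = { n: myObj.n };
copyObj.n = 3;
console.log(myObj.n);  // still 2 -- the original is untouched
```

Drawn as boxes and arrows: myObj and newObj are two boxes whose arrows point at one object; copyObj's arrow points at a second, independent object.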

Yes, but (speaking as a language designer and implementer, and this problem is a big one) you more or less admit that the program doesn't stand on its own ("Good old pointer diagrams make it clear"). One problem with programming languages is that they are often designed by compiler writers or interpreter writers, for whom such pointy data structures are ever-so-useful, but in the so-called real world, by-value is much more often the case. "Value" is also much more useful when you set out to do things in n

It's similar to the situation with floating point arithmetic. Who hasn't been caught off guard when 0.1 + 0.2 != 0.3? It's just something you need to know when you write programs -- floating point numbers are approximations of real numbers. Another thing programmers need to know is that pointers (or references) point to other entities.

How sure are you that you really do need to know that? 0.1 and 0.2 are rational numbers, we can represent them exactly, and make the identity work. That's one way around the problem. Another way around the problem is to always emit a warning when the == and != operations are seen applied to floats (this won't solve all problems -- is 0.1 + 0.2 greater than 0.3?).
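The canonical example of this pitfall, sketched in JavaScript (whose numbers are IEEE-754 doubles):

```javascript
// Neither 0.1 nor 0.2 has an exact binary representation,
// so the rounded sum misses 0.3 by one unit in the last place.
console.log(0.1 + 0.2);          // 0.30000000000000004
console.log(0.1 + 0.2 === 0.3);  // false

// The usual workaround: compare against a tolerance instead of using ===.
var epsilon = 1e-9;
console.log(Math.abs((0.1 + 0.2) - 0.3) < epsilon); // true
```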

People get extraordinarily comfortable with their status quo, and assume not only that that is how the world works, but that is how the world MUST work.

Yeah, but if you're designing a language, people like that are going to be using it. If your language design can help confused people (i.e., most people) avoid "stupid" (i.e., human, common) errors, that's a win.

Because otherwise, it devolves down to one of those stupid "real men spray III's and V's onto Si" pissing contests.

The problem with javascript is that it is one of the WORST languages and environments. I dare say Brendan owes the whole industry a great big apology. If he were Japanese, there is a traditional act he should perform. Javascript doesn't have types to speak of, doesn't handle numbers very well, I mean seriously "+" appends two numbers? No scope to speak of. It looks object oriented, but has no real notion of classes. No inheritance. All of the features that have made languages "safer" and "easier" to program

It looks object oriented, but has no real notion of classes. No inheritance.

I agree with most of your points, but not this one. Class based and object oriented are orthogonal. Simula was class based, but not object oriented. JavaScript and Self are object oriented, but not class based. And JavaScript does have inheritance, a reduced form of the same differential inheritance that Self has (only one parent, can only be assigned at construction time). New objects inherit from the object in the prototype field of the constructor object.
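A minimal sketch of that constructor-prototype mechanism (the Animal/Dog names are invented for illustration):

```javascript
// Differential inheritance: a new object delegates lookups to the object
// found in its constructor's prototype field.
function Animal() {}
Animal.prototype.legs = 4;
Animal.prototype.speak = function () { return "..."; };

function Dog() {}
Dog.prototype = new Animal();          // Dog instances inherit from an Animal
Dog.prototype.speak = function () {    // override, Self-style
    return "woof";
};

var rex = new Dog();
console.log(rex.legs);              // 4 -- found up the prototype chain
console.log(rex.speak());           // "woof" -- found on Dog.prototype
console.log(rex instanceof Animal); // true
```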

They are "object oriented" only because they insisted on using that buzzword. It might be reasonable to call them something else.

But whatever you call it, it doesn't work well. Prototype-based "OOP" seems conceptually simpler, but it ends up being more complicated and harder to maintain in practice. It was a good thing to try, but it's an experiment that has failed.

I really don't think you understand Javascript quite enough to be commenting this strongly on it.

It does have types, to speak of and to use. They are in the spec, they are in the language, they work. http://bclary.com/2004/11/07/#a- [bclary.com] Since when has "+" appended 2 numbers? When they are strings, I would imagine, which is exactly what most languages do. You might need to get your head around javascript types to stop this happening. Javascript has scope -- it's quite well defined. Object oriented does not mean classes

Two objects that have always contained numerical values, were assigned numerical values, are treated as strings. You always have to explicitly cast them as numbers. Which is bogus. Just the typing alone hurts my fingers.

Javascript has scope -- it's quite well defined

Yes, it's well defined as being almost pointless, but, yes, you are right, it has "scope."

Object oriented does not mean classes

Umm, yes it does. You may call them what you wish, but "object oriented" has a definition and means something. Inheritance, polymorphism, etc. Of which, javascript has none.

a programming paradigm using "objects" - data structures consisting of data fields and methods together with their interactions - to design applications and computer programs.

I know you probably only went to a school which taught Java, and so don't understand that object oriented programming can look like something other than Java, but, in fact, it can. Classical inheritance, polymorphism, etc., are good companions but they do not define the paradigm.

It's not. It's part of definition of class-based OO, but there's also prototype-based OO (which JS belongs to), and other more exotic schemes.

In fact Java is not seen as pure OO because it disallows multiple inheritance.

I've no idea where you've got that one from. Java is not seen as pure OO for the sole reason that not all of its values are objects - it also has primitive types such as "int" and "float", values of which aren't objects, which do not participate in type relations etc.

In contrast, in e.g. Python every int is an object (implementation-wise it's optimized, of course, so

Two objects that have always contained numerical values, were assigned numerical values, are treated as strings. You always have to explicitly cast them as numbers. Which is bogus. Just the typing alone hurts my fingers.

person.name - this is 'Bob'
person.gender - this is 'male'
person.toString() - this is 'Bob male'

No inheritance

- well, there is the keyword "inherits" and it does allow an object to be extended and you can use the 'prototype' to have multiple inheritance.

--I am not saying this language is wonderful, whatever, but saying it is lacking various features that it clearly has, even though they look different from other languages... is disingenuous.

As to the question whether this language has anything that others do not, again, how about on the fly reflection via evaluation of strings into objects? When I first saw that over a decade ago, I thought it was a neat concept then, I still think it's a neat concept today.
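A sketch of that string-to-object evaluation (eval is widely discouraged these days, and JSON.parse covers the pure-data case, but it illustrates the idea):

```javascript
// Evaluate a string into a live object at runtime -- "on the fly reflection".
// The surrounding parentheses force expression (not block) parsing.
var source = '({ name: "Bob", greet: function () { return "hi " + this.name; } })';
var obj = eval(source);
console.log(obj.greet());  // "hi Bob"

// Property access by string works without eval at all:
console.log(obj["name"]);  // "Bob"
```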

well, it's not true really. In the following example x will have global scope and y will be local to its function:

x=2;
function test() {
y = x + 3;
}

--

The declaration of the variable 'y' in the example is missing its keyword 'var' and will unintentionally create a globally scoped variable. Should be:

var x = 2;
function test() {
var y;
y = x + 3; // could also place the var statement inline with the assignment operator
}
console.log(x); // prints 2 to the JS console
console.log(typeof y); // prints 'undefined' to the JS console

- well, there is the keyword "inherits" and it does allow an object to be extended and you can use the 'prototype' to have multiple inheritance.

The prototype inheritance pattern doesn't allow for true multiple inheritance (like what C++ has). However, you can fake it by munging function

well, it's not true really. In the following example x will have global scope and y will be local to its function:

x=2;
function test() {
y = x + 3;
}

Have to correct you here: In JavaScript, ALL variables that are not explicitly declared with var are declared global. It's without question the worst "feature" of JavaScript. In your example, both x and y are global. The correct example code is:

x = 2;
function test() {
var y = x + 3;
}

However, like almost all problems with JavaScript, running your scripts through Douglas Crockford's JSLint [jslint.com] (and strictly adhering to it) pretty much eliminates that issue. It can be run on the command line as part

No, you did not "make a syntax error", and that is the problem: your code was wrong, but it was syntactically correct. You made a mistake that led to a variable having the wrong scope. Those can lead to extremely hard to find bugs.

JavaScript's choice that the default scope is global is bad language design. The two choices that actually work are: default local scope or requiring an explicit declaration for all variables.
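For what it's worth, ES5's strict mode takes the second of those routes; a minimal sketch (assuming an ES5-capable engine):

```javascript
// In strict mode, assigning to an undeclared variable is a runtime
// ReferenceError instead of silently creating a global.
function test() {
    "use strict";
    y = 5; // no var: throws here
}

var threw = false;
try {
    test();
} catch (e) {
    threw = e instanceof ReferenceError;
}
console.log(threw); // true
```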

It's not a syntax error. A genuine syntax error would have saved you early. Instead it's a bug waiting to happen. And the only reason it's a bug is that javascript is so effed up. Having global be the default scope within a function is just plain crazy.

and here is what I told the other guy [slashdot.org], who mentioned the same thing... and the other guy... and there will be another... /. is great for this reason alone - it's like a place where the memories are gone instantly... the stories are reposted, but so are the same comments, over and over. It's like an obsessive-compulsive schizo: it's clear the comments are unnecessary, but he can't help himself.

It may have some of these features, but they are a trap. People talk about it being object oriented and so on, and there are some features that make it look that way, it is a trap that leads you into programming hell.

Then the fact that every implementation of the language is some variant or another and you need browser specific code in real world Javascript.

And all of these libraries? They sound nice and all, but when you go to use them you quickly find yourself dropping to low level

Then the fact that every implementation of the language is some variant or another and you need browser specific code in real world Javascript.

This is in no way a fault of JavaScript or even the browser's implementation of JavaScript. This tends to be because of the libraries you might be using through JavaScript, such as the DOM library or other built-in browser functions. Those have nothing to do with the language that is JavaScript, though people confuse them a lot. This would happen with any language, as it's up to the browser implementor to implement the libraries. Try using JavaScript in a pure JS interpreter without calling out to other

Javascript doesn't have types to speak of, doesn't handle numbers very well, I mean seriously "+" appends two numbers? No scope to speak of. It looks object oriented, but has no real notion of classes. No inheritance.

You clearly know nothing about javascript. JavaScript has types. These types are not statically declared but can be auto-cast using clearly defined rules. JavaScript is fully object oriented. All types are objects, even functions. JavaScript has very clearly defined scope (you use curly brackets to define all scope in JavaScript). Without scope there would be no possibility of closures. And of course JavaScript handles numbers just fine (excusing the standard IEEE oddities that all IEEE-compliant languages have).
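A small sketch of the closures mentioned above (with the caveat that pre-ES6 JavaScript scope is actually per function, not per curly bracket):

```javascript
// A closure: the inner function keeps the outer function's variables alive.
function makeCounter() {
    var n = 0;                          // private to each counter
    return function () { return ++n; };
}

var next = makeCounter();
console.log(next());  // 1
console.log(next());  // 2

var other = makeCounter();
console.log(other()); // 1 -- an independent closure with its own n
```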

""+" doesn't append _two numbers_, but it can append _number to string_ - which you can have in any language with operator overloading."

function foo(x,y) { return x + y; }
foo("5",6) == "56"

In every other language I've seen, the CORRECTly expected result is 11 or error. Perl, C++, etc. The point is that you can never trust your input if you are expecting numeric.

You must guard the inputs with explicit (and thus inefficient/unreadable) casts. If you're using a 3rd party library, you'll be pulling your hair out trying to figure out what went wrong.
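A sketch of that guarding, with hypothetical add/addNumeric helpers:

```javascript
function add(x, y) { return x + y; }
console.log(add(5, 6));    // 11
console.log(add("5", 6));  // "56" -- the string wins, so + concatenates

// Guarding the inputs with explicit numeric casts:
function addNumeric(x, y) {
    return Number(x) + Number(y);
}
console.log(addNumeric("5", 6));  // 11
console.log(addNumeric("5x", 6)); // NaN -- at least the failure is visible
```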

The language is full of such wtf's. While you can happily redefine most core operations (e.g. how jQuery fixes IE), you can't overload the + operator. Call me cynical, but I don't like languages that let you corrupt basic building blocks.

That being said, javascript is excellent at what it was designed for - and passable for what it's currently used for, but I fear for the future if it's the basis of future industrial strength applications.

One place I WOULD like to see it extended is DBs. CouchDB has a very nice JavaScript-based map-reduce framework - it leads to concise and expressive code (that's really NASTY if PL/SQL etc. are used).

Basically javascript is excellent fragment-code. But HORRIBLE for modular libraries - having to write an entire library (like jquery) in a scoped wrapper then assigned to a mutable/corruptible symbol is sick. (Especially since library A will mutate library B without permission - hello 1970s!!)

What's weird about that? It clearly produces a pointer to that '2' in the middle of the quoted string. Isn't that what any sane programmer would expect? ;-)

One of my favorite C examples is a variant on the above:

i = 257 & 0x0F;
c = "0123456789ABCDEF"[i];

Actually, my main comment about such examples is that, if you can't instantly explain what each is doing (and why the second is safe), I'd be nervous about hiring you for a C project. Unfortunately, most of the people doing such hiring can't explain either of these examples, which explains a

""+" doesn't append _two numbers_, but it can append _number to string_
In every other language I've seen, the CORRECTly expected result is 11 or error. Perl, C++, etc. The point is that you can never trust your input if you are expecting numeric.

It's also wrong in Python, in an even worse way. The problem began with using "+" for concatenation. Concatenation works on sequences, so [1,2] + [3,4] in Python is [1,2,3,4], which is not what one might expect. Worse, Python has both sequences and numeric arrays (from the de facto standard NumPy math package), and numeric arrays have different semantics. Adding numeric arrays with "+" has numerical semantics - you get a vector addition. Still worse, mixed mode addition between sequences and numeric arrays
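For comparison, a sketch of what "+" does to arrays in JavaScript itself, which is arguably stranger still:

```javascript
// "+" on arrays: both operands are converted to strings
// (via Array.prototype.toString, i.e. join(",")) and then concatenated.
console.log([1, 2] + [3, 4]);          // "1,23,4"
console.log(typeof ([1, 2] + [3, 4])); // "string"

// Actual concatenation needs an explicit method call:
console.log([1, 2].concat([3, 4]));    // [1, 2, 3, 4]
```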

They're merely hacks to get around some serious deficiencies with JavaScript...

Much like libc is just a hack to get around some serious deficiencies of C.

jQuery, for instance, wouldn't even be useful if the DOM weren't so horribly fucked up

And what does that have to do with JavaScript? You are saying the language is bad because one of the libraries is bad. Might as well go around complaining about every language at that point, because they all have some screwed up libraries.

I don't see where anyone has yet mentioned Doug Crockford's excellent videos on JavaScript. These are all on YUI theater. http://developer.yahoo.com/yui/theater/ [yahoo.com] All the criticisms mentioned here are discussed in depth. Crockford deals with the good and bad parts of JavaScript from the perspective of years of detailed research on it. And like it or no, JS is available in a useful, common subset on all modern browsers. The whole HTML, CSS, DOM, JavaScript ball of wax is a kludge that happened by the ch

The C++ standards committee has been lost in template la-la land for the last decade. They've focused on features understood by few and used correctly in production code by fewer. Since the discovery that the C++ template system could be abused as a term-rewriting system to perform arbitrary computations at compile time, that concept has received far too much attention. It's an ugly way to program, but it's "l33t". On the other hand, they've been unable to fix any of the fundamental safety problems in the language. C++ is unique among mainstream languages in providing hiding ("abstraction") without memory safety. (C has neither; Simula, Pascal, Ada, Java, Delphi, Erlang, Haskell, Go, and all the "scripting languages" have both.) So there's an example of a committee screwing up.

On the Python side, we have van Rossum. The problem there is that he likes features that are easy to implement in his CPython implementation, which is a naive interpreter, even if they inhibit most attempts at optimization. As a result, Python isn't much faster than it was a decade ago, and is still about 60x slower than C. Attempts to speed it up have either failed or resulted in insanely complex, yet still sluggish, implementations. So that's the "guru" approach.

Most of what is bad about JavaScript isn't so much the language, but the environment. People still drag their IE6 onto the internet and expect things to work, and it seems many website builders try to oblige them, working around the problems in that 10-year-old software. Many things have changed since then.

I do not know what you do, but let's say you are a Java programmer: when you code, do you always keep in the back of your mind that it should still work on the 1.3 runtimes from 10 years ago? And do you

If you are a C programmer developing programs which run on Linux, do you always create 2 code paths? One which compiles and runs on Linux 2.2.x/gcc 2.95.x? And some #ifdef where you take advantage of the newer features which Linux and other modern operating systems offer you?

Yes. You know that "./configure" command which is the first step in building 99% of the source code packages out there? Guess what that's doing. Setting up a bazillion different #defines to adapt code to different platforms, architec

I could also ask you something else: do you think you could have created something better in 10 days? That is the time it took Brendan Eich to create something from scratch: 10 days from language design to a working and shipping version.