
snydeq writes "Fatal Exception's Neil McAllister discusses the proliferation of programming languages and what separates the successful ones from obscurity. 'Some people say we don't need any more programming languages at all. I disagree. But it seems clear that the mainstream won't accept just any language. To be successful, a new language has to be both familiar and innovative — and it shouldn't try to bite off more than it can chew. ... At least part of the formula for success seems to be pure luck, like a band getting its big break. But it also seems much easier for a language to shoot itself in the foot than to skyrocket to stardom.'"

There are plenty of women in the history of science, and in computer science. They're all... well, they're not beauty queens. In fact there are very few beautiful women in science at all, and of course very few beautiful men (or so I hear from...).

And this is good. It means that they actually did something important, were actually capable of doing interesting things.

Master Po: Patience, Grace Hopper. If a man dwells on the past, then he robs the present. But if a man ignores the past, he may rob the future. The seeds of our destiny are nurtured by the roots of our past.

What you say is utter bupkis and bullshit. There are many relatively modern languages that have become very popular without having any sort of real compatibility with C, C++, or any other programming language.

Just look at Perl, Python, Ruby, Java and C# for some examples. Those have all arisen in the last 20 to 25 years, well after C was extremely well established. While they can call out to external C code with varying degrees of difficulty, they aren't code-compatible with C in any way.

Yeah, him saying "call out to external C code with varying degrees of difficulty" is purely disingenuous. There is NO difficulty; it may be slightly tedious, but really, no, it's not a big deal. It's quite easy to get a Java/Ruby/v8 (node.js) application bound to an existing C/C++ code-base.

In fact an awful lot of the power of Python comes from the ease in which you can string together a bunch of stuff that uses the sophisticated underlying C libs in a RAD style. It's part of the beauty of the whole thing, and part of the major appeal of python.

That's the great thing about the JVM... you can try out different paradigms, but you can always reuse the code, no matter if it is written in Java, Jython, JRuby, or any of the more experimental languages I don't even know about.

First of all, "Write once, run anywhere" was never meant to apply to all versions. It meant to apply to different platforms. By your logic, no language could ever grow, as backwards and forwards compatibility would have to be maintained at all times. Second, for the most part, Java is fairly backwards compatible. I don't know what kind of code you are writing, but programs I wrote 10 years ago still compile and run.

Yes, or at the very least, make it very easy to link to and call other languages. Matlab/Mathematica can call C libs rather easily. C can call Fortran and Pascal. But some won't mix and that's a big show stopper.
Another issue I think is with languages that give you too much leeway. If you can redefine the syntax, the operators, how they work, etc., then another programmer, even familiar with the language, will have no idea what you wrote. Case in point, C++, where you can overload '+' to operate in 30 different ways.

One would hope that a particular library/framework that is being used would be familiar to the people who maintain the code. If that's the case, then I don't see how operator overloading would be considered a drawback. Alas, IMHO C++ operator overloading is, to an extent, a crutch needed because the language is not very expressive. For example, in C++ the most concise way of setting up a constant type-safe matrix might be like this (this syntax is from Eigen): Matrix<double, 3, 3> m; m << 1, 2, 3, 4, 5, 6, 7, 8, 9;

It seems nice when you first read about it but ends up a maintenance nightmare.

No, it doesn't. Anyone who understands C++ reads operators just like functions.

You see +, and translate it in your head to a call to a function called "plus" (technically operator+()).

Well, guess what, no language prevents you from giving functions stupid names. You can write a function in any language called "add" which doesn't add things. Just as in C++ you can write a function called operator+() which doesn't add things. There is *no* difference.

Operator overloading when done well vastly decreases the maintenance required, because the code becomes much, much simpler to read.

The limits that would impose on syntax and underlying data models would essentially stop all real evolution in this space. Thank your lucky stars this attitude is not pervasive, or we'd all be working with slightly improved BASIC, FORTRAN, COBOL, and Ada environments today. The lucky ones would be maintaining code bases comprised of horrible COBOL-to-C conversions. If you have been in the retail or finance industries, you have encountered some COBOL at some point.

Which is why COBOL is still alive, too much legacy code and too many libraries to re-implement. That doesn't really say much about new languages though, C to C++ is more the exception to the rule here in that it was an extension to C but C itself kept living its own life. Most other languages just grow and grow with one new feature here and one there and a few things deprecated but never really gone. And that's really why most new languages appear, to get rid of all the crud. To get rid of all the legacy code. To get rid of unsafe methods, stupid interfaces, stupid syntax, stupid keywords, inheritance systems, constructors/initialization and whatnot. I program in Qt/C++... but I'd just love to redo it without all the C-isms and take the best from Java and C#, it'd just be a helluva job. Many of the popular languages have had a huge corporation backing them - Sun for Java (now Oracle), Microsoft for C#, it's not just a language but today I'd also expect a fairly complete standard library (which is why I said Qt/C++, I'd not do plain C++) and that's a lot of work.

I'm not sure you can say that. I went to college back in the late 80's and early 90's and was subjected, among other things, to Ada. The military apparently liked Ada mostly for its extra-strong typing. You really don't want to find out that your programmer made a type error when the missile is already in flight, I guess. I've done a lot more with C than C++, but have already had C++ catch enough obscure type errors with the code I'm using to experiment that I'm pretty much sold on the language. Enough to add it to my toolbox anyway. It's hard to say whether I'd do that if I didn't have the advantage of 20 years of programming in a similar language. I certainly didn't with Ada. But I think C++ can stand on its own. There are fairly few things you really need to pick up to be able to program well with it.

The main one is: Avoid template fuckery. You know that book by Alexandrescu? Don't do that! Well unless you're writing Boost. It's too easy to abuse for no real gain, like the Singleton pattern. Most programming teams will refuse to allow you to use a singleton now, even when it's justifiable, because of the abuses to the pattern. If you're factoring primes with recursive templates at compile time, you're creating a maintenance nightmare and future generations of programmers will curse your name. Though I still think it'd be a fun exercise to do a compile time matrix multiply library...

Not that you can completely avoid using templates in C++. But if you avoid it for as long as you reasonably can, your chances of actually using them well when the time comes are much better. You'll probably still get burned by odd template rules, but at least most of your code won't have that problem.

I suspect that the "Java failed" post was a sarcastic counter example of a language that stood on its own.

Java is unusual because it had a billion-dollar marketing push explaining how it would change everything. Managers were taking Java classes to learn how it would change everything. None of this was aimed at the enterprise. It failed to be adopted in most places it was aimed at, and somehow backed into the enterprise area due to the lack of competition among alternatives to Microsoft.

While I do worry about skills becoming obsolete, it seems like we C and C++ programmers are never far from a job, because there are so few of us around compared to web dev, Java and other higher-level developers.

It sounds like sarcasm to me. After all, Java is, by any measure, a successful language and platform. A considerable number of respectable higher-education institutions have adopted Java as the programming language for both OO courses and intro to programming, and any job search returns a high number of jobs which explicitly require proficiency in Java programming.

If that wasn't enough, there's Android development. Nowadays, if you want to develop software for a smartphone you are basically forced to write Java.

Java is very much alive and thriving on web/enterprise application servers and on mobile devices. What do you think Android app developers program with? What about sites that make use of JSPs, Struts and Spring? Here's a partial list of sites built with Struts:
http://wiki.apache.org/struts/PoweredBy [apache.org]

Objective-C came out at nearly the same time as C++. You had to buy Objective-C from its vendor or from Apple, whereas C++ was free. Objective-C was ahead of C++ in quality for many years, but languished due to its price. I used Objective-C as a NeXT developer.

There is no magic formula. But there are some simple things that I find help me that have NOTHING to do with the language itself, or its technical advantages/disadvantages. Strangely, they correlate in no way to the popularity of the languages.

You need a single document to sell me your new language. If you can't explain the concepts, basically, to a programmer in a page or two (enough that if you try to sell EVERYTHING I get bored reading the document as a whole), then it won't wash. If I can't understand why I should use your language, I won't. (Spreading it across a Wiki doesn't count, unless that Wiki has a complete copy available as a PDF or something readable.)

Your documentation should also help when I have a "how the hell do I do X?" question.

You shouldn't just assume that your way is the best. Ever. Just don't. It'll annoy me.

You shouldn't just assume that I'm happy to spend a year learning the quirks of your language.

I should be able to knock up a quick sample program, that uses one of your new features, and understand it in a matter of minutes. Literally. Minutes. Including downloading and installing your compiler / interpreter and getting it running.

Google sort of understood this with Go: http://golang.org/ [golang.org] They have all of the above, and even an online "compiler". They fail a tiny bit with "what's new" and selling the language, really, which is a bit of a shame, but they do a good job.

Ruby does okay too.

But PHP, one of the most popular languages, has a web-site that doubles as a bomb-site. It's hideous and has always put me off, even if they do have some of this information hidden away. It's not selling the language at all (presumably because they're "big enough" for everyone to just know about it). It's like reading a security/release mailing blog sometimes.

C# doesn't sell the language at all, anywhere, online as far as I can tell. The first hit is Wikipedia. The next few are resource sites.

As far as I can see, C# succeeded because it was backed by a big company. By contrast, Go is still pretty obscure (which shows you there is no magic formula - Go aces a lot of the checklists but still lingers in the background). PHP succeeded because it was quick, simple, powerful and "came first" in terms of web scripting. It also created one of the web's largest security nightmares, rivaling the very thing it was supposed to replace (Perl CGI).

C was popular because it was unique at the time, and powerful. C++ was popular basically because C was (that doesn't mean it didn't have advantages too, but it got popular by riding along - not by its own merit at first, but that's what HAS kept it in place ever since).

There's no way to predict a success. Ruby / Rails came out of nowhere as far as I'm concerned and Ruby's been around since the 90's (Has it? Really? Bloody hell! Where was that hiding?). But things like Haskell were around too in that time and have never really caught on.

It seems the criteria are "ready, while being in the right place at the right time", and almost the inverse of what you'd expect from a look at how hard they try to ease programmers in. It seems that if you want to stand a good chance of being the next big thing, make an awful website, don't put up examples, make the simplest thing complicated or impossible, make a horrendous security mess, and then put it online. Then find the next fad, say your language is perfect for it, and push it everywhere you can.

But things like Haskell were around too in that time and have never really caught on.

Huh? When Haskell came out it was a replacement for Gopher, a language not even terribly popular with the functional community. Haskell is now by far the #2 most popular functional language, passing even languages like Mathematica. Haskell has become the primary language of compiler design, with ideas from Haskell leaking into most compilers, including most importantly the Visual Studio compilers. Ideas from Haskell have led to whole new classes of languages like Scala and Clojure. Ideas like lazy data structures are becoming standard approaches in many languages.

Further Haskell has completely altered the entire way people think of functional programming. Monadic methods are now standard in most functional languages.

In what sense is Haskell not a huge success? Sure it isn't the mainstream language of choice, but then again a language that isn't good with interactive I/O is going to be unsuited for most day to day programming problems.

I think you mean Gofer [wikipedia.org] - which was an implementation of (an early draft of) the Haskell language. NB most of the innovations you mention (e.g. lazy data structures) were around before Haskell came on the scene:)

I was making a similar point here [slashdot.org] in a similar recent discussion [slashdot.org], saying that syntax isn't enough to capture the success of a language. You need to look at how accessible it is.

For me, there are three important points to discuss programming languages:

1. Syntax
2. Access
3. Community

ad 1) We know all about and can analyse the syntax. Fine. All the discussion happens here.
ad 2) But what good does the finest Haskell do me if I can't access a CD, Bluetooth or an XMPP server?

To be honest, I think C# was a big success partly, as you say, because Microsoft rammed it down everyone's throat with .NET, and partly because it made Java programmers not wet themselves with fear.

I'd personally say it's 50/50 between those two. I know a lot of people who learned Java first that are absolutely petrified of ever having to actually understand how a computer works. C# appeals to that in a big way.

Well, one can program .NET in lots of languages -- I've done it in Ada and Python, for example -- so MS didn't exactly ram C# down anybody's throat. But it does integrate particularly well with .NET, and .NET did make life a lot easier for MS Windows programmers. So yes, being backed by a big corporation helped, but J# was backed too and it went nowhere. I reckon it also succeeded by making it easy (relatively familiar syntax for C++ and Java programmers) to do a job that enough people wanted to do (program MS Windows).

I HATE language selling sites. Many a new language or framework has a VERY nice presentation site telling you that THIS is the answer to your problems, this language will screw your gf, kill your dog and set your house on fire, letting you concentrate on the essentials of life. Coding.

PHP is totally different, it doesn't sell itself, it has no slick presentation, just every function described in clear plain English (and many heathen tongues like Dutch, for those who were not blessed by god to be born in an English-speaking country).

This proves my point, some people care about getting things done, about being able to find out how to get things done... and others about the font used for the logo.

Who cares about the function names? I can hit a dozen developers from my desk and every single one of them will have a different idea about the best naming schematic and formatting rules. I often use 3rd-party libraries, I just adjust. Go ahead and work with something else if you want, but don't complain when thousands who work in less exalted positions get things done anyway.

Whether or not a programming language succeeds has a lot to do with how available the tools are. The language must have a good IDE, quality debugger and profilers. If it doesn't have these tools, it's not much use to serious projects. Nobody wants to write a serious application without the use of a modern debugger. If the tools aren't available, are difficult to set up, or cost too much, people won't start using your language. There are plenty of free, really good languages with great tooling out there; you'd have to come up with something pretty extraordinary to succeed without a proper toolset around your language. Oh, that and a big API that does a lot of the work for you. Nobody wants to write all their own libraries for doing things that should be included in the API.

I don't see any evidence that's the case. The whole debugger/IDE culture was built around a small subset of languages; essentially Algol syntax with static typing, like C++, C#, Objective-C, Java and Visual Basic.

On the other hand the major languages that have become popular in the last 15 years are often dynamically typed: Perl, Python, Ruby, PHP, Java Script. They don't have good debugger / IDE's as the technology doesn't exist yet.

Certainly the wealth of wonderful debuggers has helped the static languages. But they aren't a necessary condition.

Lack of proper debuggers are what keep me away from languages. Javascript without a debugger is fine if you want to write simple event handlers, but if you want to do a fully javascript driven site like GMail, then you're going to need the use of debuggers. Also, I'm pretty sure debuggers exist for all the languages you mentioned, so I'm not even sure what your exact point is.

Some level of debugging exists and there are semi-GUIs. But the kind of integrated syntax checking, debugging IDE that exists for static languages doesn't exist for any dynamic language. No one has figured out how to make them.

The concept might not even make sense. OTOH, the dynamic languages are much better at supporting an interactive model of development: try some things out at an interactive prompt, and cut-n-paste the stuff that works into a script file. Dress it up a little and you're good to go. It's a fast way to make something that works.

No debugging/syntax checking IDE for dynamic languages? Not if you count Smalltalk as dynamic. Smalltalk works arguably better for debugging, syntax checking, and more than static languages because of the VM concept. There can be a slight delay, especially if connecting to a VM remotely (Gemstone, for instance), but Smalltalk's IDE even lets you program in the debugger. You haven't seen real TDD unless you've seen someone writing an entire app in their Smalltalk of choice's debugger. It also supports things like live object inspection.

Point well taken about Smalltalk. I'd agree that Smalltalk was a huge innovator on IDEs/debuggers. In general, too many things resolve at runtime for debugging in the classic sense to work well. So I give the static languages the credit for GUI/debugger.

That being said, there is no static language (except possibly Haskell) I'd rather write in than Smalltalk.

PHP has a built-in debugger, which NetBeans integrates with fine. Firebug has a debugger for javascript that works quite well, although you need to understand how functional languages work to really use it well. Syntax checking works as well as it does for any language. Can you actually point out something specific that Eclipse C++ does that Eclipse Javascript doesn't do? Or something that NetBeans Java does that NetBeans PHP doesn't do?

Javascript and PHP both have decent debuggers, not outstanding but they work. PHP has about a dozen IDEs ranging from ok to great. Dynamic typing doesn't really impact a debugger, all variables still have a type, it just isn't determined until the variable is set and it can change during execution. No different than a base pointer in C++ or a variable of type Object in Java.

Also, never write Javascript as Java Script. It confuses people who may think that Javascript is related to Java, and shows your ignorance.

Bingo! And for those reasons you state, Haskell fails. Haskell is a wonderful language until you attempt to do anything. The problem with Haskell is that it is essentially an academic's language with no support environment save some silly command line crap.

Seriously tho' - python, for example, is successful without having a good IDE. There are some IDEs that some people would argue are good - but most of the people writing python are using emacs or vi.

I'm also rather sceptical about the need for a good debugger. Most of the time I find writing a couple of simple unit tests and putting in a couple of diagnostic prints is fine for figuring out what's going on (and you have the tests forever, which means that changes are less likely to introduce bugs in existing code).

The tools don't necessarily have to be written in the language itself, and while the bar for a language has been raised, developing an IDE has also become much easier. Nowadays it's trivial to write an Eclipse plugin for a language.

To the people posting hypotheses of what makes a language successful: if your hypothesis predicts that Modula is extremely popular and PHP is essentially unknown, maybe you should revise it instead of blindly posting it just because you'd like it to be true.

A language will succeed if it is pragmatic, scratches an itch, is more productive than what exists already, is well supported preferably by multiple vendors, is cross platform, is simple to learn and offers familiarity with what has gone before. The further away from these ideals a language is the less likely it will be to succeed.

Could you expand on the Pascal limitations thing? I used Pascal when I was a student, and even in industry (petroleum refinery), and I never encountered any kind of limitation.

Now I use Python more frequently, for 3 main reasons:
* Availability of powerful libraries
* For the same functionality, a program takes 2 to 10 times fewer lines of code than Pascal, Java or C/C++
* It covers my old Pascal needs (writing applications), plus writing quick & dirty little scripts to help here and there

Carnegie-Mellon has dropped OOP from their CS requirements because they felt that the OOP model was not appropriate for modern needs. Linus Torvalds says "C++ is a horrible language." In the January issue of IEEE Computer there is an article "The Java Tree Withers - The java report card: infrastructure gets a D, code reuse gets an F".

Programming languages drive devices. I'm doing heterogeneous parallel processing in C and CUDA. Multicore and massively parallel concurrency is absolutely the future.

I think it is a crime to let people out of CS school without knowing Java/C++ in this day and age. OOP has issues (I still think OO is a cult:) but you have to realize that there are a lot of practical and theoretical work done in OOP.

Also, I was taught some functional programming in C. You don't really need a special language. Teaching someone Standard ML is a waste of their time.

I personally think that programming languages are a lot like medicine.

Your new language doesn't just have to solve a problem or two that you see with programming languages, it has to do it better than existing languages while having fewer "side-effects" (quirks, difficulty, weaknesses).

If the advantages it provides aren't big enough to cover the costs (like learning a new language, using a new compiler, writing plugins to syntax-highlighting, etc. etc.) then they simply don't matter.

Make a programming language that's readable by humans, and have the language map concepts to code in the way humans think about things and it will be wildly popular (PHP, VB-anything, C#, Javascript/DOM). Make the programming language terse, efficient and mathematically consistent and it will be wildly popular - among mathematicians and gradually abandoned by most of the rest of humanity (e.g. Fortran, Powershell, C).

In the former cases, the machine does the heavy cognitive lifting. In the latter, you're expected to do it all. Guess why PHP is more popular than C++? Yes, PHP is sloppy, inconsistent and as random as the people who use it. That's the majority of folks who have to get some work done. As awful as the language is, it rules the web along with vb/asp and javascript.

Java succeeded because Sun 1) gave it away, and 2) threw money at giving it away. Remember "applets"? Java was supposed to be the programming language of the Web. That didn't work out. It ended up being the new COBOL, which was not Sun's intent.

Some languages fail, or get stuck, because the designer is in love with their own implementation. That happened to Pascal and Python. Wirth's own Pascal implementation was a cute little recursive-descent compiler that generated RPN byte codes, like a Java compiler. Wirth resisted changes to the language that would allow programming in the large. ISO Pascal reflects his biases. So Pascal became stuck in an educational niche. The original Macintosh software was all written in an extended Pascal, as was much '80s software. But everybody had a different dialect - there was Turbo Pascal, Clascal, and a few others. They never merged.

Modula, Wirth's second try, was also crippled in certain ways. Modula 2 was better. Modula 3 was good enough to be used to write an operating system kernel. Unfortunately, Modula 3 was only used at DEC, which died after being acquired by Compaq.

Python has some of the same problems. The feature set of Python reflects what's easy to implement in a naive interpreter, like van Rossum's CPython. Internally, everything is an object, even integers and floats, and object access involves dictionary lookups. This makes CPython slow. Every attempt to speed up Python substantially has hit a wall, including Google's "Unladen Swallow" effort. (PyPy is making progress, but it's taken a decade and requires an incredibly complex internal combination of interpreters and compilers.)

The biggest disappointment to me has been that we're still stuck with C. C has two killer bad design decisions - the language doesn't know how big arrays are, and the "pointer = array" thing lies to the language. Both reflect how things are done in assembler, and the fact that the original compiler had to fit in a 128K PDP-11. Most of the millions of buffer overflows and crashes that occur daily can be traced to those two design decisions. (C++, as I point out occasionally, tries to paper over these problems with collection classes. But the mold usually seeps through the wallpaper, since most operating system and library calls want raw C pointers.)

Judging by languages that have succeeded over the past 20 years, I would say that the main factor in success is a large company pushing the language. It seems that the average programmer is swayed by marketing just as much as anyone else. Then, beyond a certain threshold, network effects kick in. If you want to interoperate with another project, life is easier if you use the same programming language.

The average programmer doesn't get much say in the matter- it is the company that hires him that does.

However, yes, it comes down to marketing. Big companies like Microsoft have a much better chance of convincing a CIO that they need to be using their language.

Let's assume a theoretical company, Megasoft, produced a language Db - D Flat is much better than C Sharp - it is easier to learn, faster to compile, produces smaller .exes, runs much faster. It even puts the kettle on and makes you a cup of tea whilst you program (coffee if you prefer).

Which company do you think the CIO is going to go with - Microsoft with their flashy brochures, or Megasoft, that no-one has heard of, with their awesome product?

Right, the CIO will insist that all coding be done with Microsoft. Microsoft will no doubt have given him a t-shirt at the last trade booth. Thus, they are the obvious choice.

it is easier to learn, faster to compile, produces smaller .exes, runs much faster

OK, let's talk about the debugging tools. Those are all nice things to have, and relatively important, but also probably not the main concern any longer, or at least not my main concern. Ever-evolving hardware has managed to alleviate what may have been considered major feature points back in the day. I think ultimately, when I'm "language shopping", debugging facilities would not be my primary concern. Nobody is going to care about a language unless developers can produce working code in it.

If I were to choose between a language with a shiny debugger and a language with well-chosen features that allow the programmer to write clean, readable code that works right the first time, I'd choose the latter. Additionally, I wouldn't need the debugger afterwards.

To each his own. I'm well past the illusion of "not needing a debugger", that is pure myth when dealing with significant complexity. I've had to deal with enough strange code other people have written. And when it's not your code then "works right the first time" doesn't mean a damn thing anyway. Congrats on getting to an autonomous position where you write (or rewrite?) everything.

Then check out Gambit-C debugger (an implementation of Scheme).

The garbage I've had to deal with lately is the Altera and Xilinx debuggers, which are both hacked-up versions of the GNU toolchain.

Judging by languages that have succeeded over the past 20 years, I would say that the main factor in success is a large company pushing the language.

I'm having trouble thinking of any such languages other than Java and C#. I don't recall C (if you go back a bit more than 20 years), Perl, PHP, Python, or JavaScript becoming widely adopted because they were pushed by large companies (though I admit that JavaScript is debatable).

AT&T's C language tromped the competition by being nearly free at universities in the 1970s and 1980s. They just charged distribution costs. Ditto C++ over Objective-C. Objective-C had the advantage of commercial support and more Smalltalk-like roots than C++. But only Apple uses it.
In most operating systems Java is free.

I really can't agree with that. Which large companies pushed perl, ruby, or python? Those who pushed were not large by the standards of the global IT market. It was the fact that many smaller companies and development houses got on board, seeding the market with programmers who knew the new languages, that made them successful.

And even the "successful" ones have had limited success. For example, show me a non-web application that was developed with Ruby and not using Rails. Now granted, the libraries and frameworks of a language (like JEE) have a great deal to do with their acceptance by the industry, but I think it speaks volumes about the supposed benefits of some of these languages that they went nowhere until someone was fanatical enough to write framework libraries using them.

In a sense, the role of the language itself seems to have shifted to the lower levels of the machine, almost assembly-like. In the meantime, the application framework has become the new "programming API" of the language library, rather than the boring and basic string and math functions that used to comprise language libraries. People now drop out of the framework into custom code only when they are forced to, with the bulk of the coding being more in the use of annotations and tags to tie pieces of the application together automagically rather than having to be expressly coded with multiple lines of low-level code.

Of course this all comes at a price. The more you rely on things like tags and annotations, the more your code is relying on introspection and adaptive code, which is inherently slower than code which was written specifically for the attribute accessors and data types being manipulated.

Worse, some of the framework libraries I've seen make the horrendous mistake of completely ignoring the protocols and communications styles used by legacy code. If you're going to succeed in the business arena (where most coding is done), you HAVE to deal with those old systems, and that means making it easy to deal with EDI transforms as well as XML based IOs.

By no means am I arguing that we don't need specialized languages for special purposes in the overall application stack. Tools like Ruby on Rails are needed to simplify work in their slice of the system pie. But I can't see there being another "big thing" like Java or C# any time in the near future, but rather the continued evolution of those languages.

Another factor is that people get tired of playing with new languages when they don't take off, and that speaks volumes to their fitness for a purpose. Languages like C++ took over a decade to really catch on, but their ideas were novel enough that the early adopters stuck with them and kept using them while momentum built. Nowadays if you don't have significant mindshare within a few years, people seem to give up and move on to something else/better. Were these languages really a significant improvement, their fans would stick with them and promote their use despite their unpopularity.

Really, had Lisp been more widely taught, we would be talking about (incf Lisp) rather than C++ when we argue about programming languages. What is taught in school affects programmers' choices about languages and designs more than anything else. Most schools today teach C++ and Java; is it any surprise that these languages or very similar languages are commonly chosen for new projects, even where there are other equally valid choices (say, Clojure instead of Java, or OCaml instead of C++)?

It works both ways though. Many schools try to teach what's relevant in the workplace. Although there are some more "academic" institutions that will focus a lot on things like Scala, Lisp, Haskell, and others, many schools will try to throw in a few courses where you're using "industry" languages, because they want their students to be able to get jobs afterwards.