Posted by Soulskill on Wednesday August 08, 2012 @04:35PM
from the zero-to-standard-in-5.4-months dept.

theodp writes "Back in the day, getting traction for a new programming language was next to impossible. First, one needed a textbook publishing deal. Then, one needed a critical mass of CS profs across the country to convince their departments that your language was worth teaching at the university level. And after that, one still needed a critical mass of students to agree it was worth spending their time and tuition to learn your language. Which probably meant that one needed a critical mass of corporations to agree they wanted their employees to use your language. It was a tall order that took years if one was lucky, and only some languages — FORTRAN, PL/I, C, Java, and Python come to mind — managed to succeed on all of these fronts. But that was then, this is now. Whip up some online materials, and you can kiss your textbook publishing worries goodbye. Manage to convince just one of the new Super Profs at Udacity or Coursera to teach your programming language, and they can reach 160,000 students with just one free, not-for-credit course. And even if the elite Profs turn up their nose at your creation, upstarts like Khan Academy or Code Academy can also deliver staggering numbers of students in a short time. In theory, widespread adoption of a new programming language could be achieved in weeks instead of years or decades, piquing employers' interest. So, could we be on the verge of a programming language renaissance? Or will the status quo somehow manage to triumph?"

I had a lame True BASIC course, which I just skipped until the final week. The prof was nice enough to let me just turn in all the coursework and pass the final. I thought that was cool. True BASIC, not so much.

I'm plowing through the python course on Udacity now, despite being pretty comfortable in python. You never know what you've been missing, and it's really well done. Working on a useful project over the whole course is good. Working at your own pace is even better. Oh, and I enjoy the little c

Not only that, but Lisp was widely taught in universities and remained in the AI grad student backwater for decades. In fact, it wasn't until after universities decided to drop Lisp in favor of Java that it started to enjoy a renaissance, though the timing is probably coincidental.

There are a whole host of reasons companies decide to use one language over another. What's being taught in universities isn't even on the list.

I see three things that can drive new languages into the forefront, and none of them are universities, which are always behind the curve or off center entirely (I still remember my courses taught in Ada. Ugh.)

These three things are:

1) A robust implementation that offers ease of use for features that have historically been tricky to use, e.g. real-time, concurrency.
2) A language home-grown in a particular ethnic region, such that the user base primarily speaks the local non-English language and educational

There may have been some places that taught BASIC, other than Dartmouth, but at the university that I went to the choice for a first language was FORTRAN.

For a follow on you could choose Algol-60 (pretty much, it was a local implementation) or Snobol. Assembler was also available, but not recommended...and you had to take it as an Electrical Engineering class.

OTOH, I was in the Math department. The business school might have had a class in Cobol. If so I never heard of it. Electrical Engineering was

Projects use languages, projects need employees, and employees need proven credentials. Inertia will continue to be a huge component of language selection for decades to come. Ruby is the last language to make progress without an already big tech name pushing it and it's already more than a decade old.

That's the problem with IT. If HR did chemistry hiring like HR does IT hiring we'd hear stories about people being underqualified because they used 50 ml beakers at school instead of 75 ml beakers at $job. Or "You used 2-propanol? Sorry we only hire people who use isopropyl in that synthesis."

Except the company uses exclusively isopropyl throughout their entire chemical processing chain and switching to 2-propanol is a multi million or billion dollar effort that would take years to complete and many more years to show a return on the investment. Not to mention the environmental impact studies that would need to be performed to determine if any byproducts would harm the local environment and also testing to make sure the output did not have any adverse effects.

Using the latest and greatest process, tool, widget, language, whatever, is not always a good idea. Sometimes you should live by the motto "if it's not broke, don't fix it."

The problem isn't with a company wanting to do things a specific way, it's with the assumption that people who've done something that's very similar but not exactly the same can't adapt.

Ummm... Small point, but 2-propanol IS isopropyl alcohol. So, no billion dollars to make the switch

And therein lies the joke. I (a non-chemistry buff) can quickly Google 2-propanol and see that they are one and the same, yet a normal automated HR screening process will kick one and accept the other. Kinda sad when you have a human resources checklist without humans in it, eh?

The real problem is libraries. A language could get a critical mass of followers without ever involving those who are planning ahead for a career in business, but you've got to be able to do things.

For this reason, I give Vala a decent chance if they can ever get to a 1.0 version...and if Gnome doesn't backstab them by changing all the libraries. (This wouldn't keep Vala from using them, but would so fragment the userbase that it could never get a large enough community.)

Scala does, indeed, have its fans. So do many other minor languages. I don't think one could call it a successful language, however.

For that matter, I'd be as likely to favor Clojure as Scala. But it's also a minor language with a following. Or Groovy. Note that I'm mentioning languages that can piggy-back on the Java library system. This is an important commonality. I'm unwilling to invest time or effort in them, because I don't trust Oracle, but that *is* a significant strength, unless it turns in

Universities start teaching their students languages AFTER they become popular. Java was well established in industry and universities were still teaching Pascal as a first language (an excellent choice), then C. THEN they switched to teaching Java as an intro language. The students who first learned it wouldn't have had an effect on industry for another two to four years after that.

Languages get adopted by individuals, then get used in industry, THEN get taught to students.

My experience has been education is always a generation or two behind.

When I was a young pup, we all wrote C or Pascal but the schools taught BASIC and FORTRAN. Later on we were writing in Java, but the schools taught C++. I slogged thru "Deitel and Deitel C++" or something like that. Pink cover as I remember. School taught 68HC11 assembly language, which is a great education, but poor training, as supposedly everyone does microcontrollers in C, or at least the people that talk loudly do; I dunno what people who actually write code do.

Java is slowly being adopted and has met great resistance at my college. I have seen that colleges are at least ten years behind in just about everything. Some things are even further behind by 15 - 20 years. Sometimes they have no idea that the tech they are behind in even exists.

Adopting Java over what? If it's C they should stick with it; at this point C is the mother of most modern programming languages. University level should not be a trade school; C teaches all the constructs and all of the power. Java is a subset of C at best and hides a lot of complexity that a programmer needs to understand.

Yes. And I think it's a good thing. Universities are supposed to be about education, not training. If you want training, go to a tech school.

Universities teaching something that's not the latest hot industry language means that students will learn at least a couple of languages and hopefully in the process learn how to learn languages, rather than being a trained drone.

In undergrad I learned (officially) Pascal, C, C++, Java, Prolog, x86 assembler, Motorola assembler, a couple varieties of Motorola microcontroller assembler, VB, Perl, PHP, Javascript and a bunch of things some people might call programming languages like HTML, XML, SQL, etc. Oh, and built and programmed machines (using both wires and simulation) that ran on my own machine code and assembler definition.

Now I hear people complaining bitterly about having to learn a new language.

Try to get a wet-behind-the-ears 20-something to learn x86 assembler well enough to do inline assembly optimization better than the compiler. Most cry like a stuck pig. Far too many programmers think that throwing some business logic around a pile of libraries constitutes coding.

Which is why we need to stop the training-ification of universities. The average code monkey probably doesn't need to know assembly, and shouldn't be forced to learn it. But computer scientists and more skilled computer related positions benefit from knowing how computers actually work.

Yes, and the other day I was reading that we shouldn't bother to learn algebra, because c'mon! - how often do you use it? Well, if you don't know it I'm sure you won't be using it at all, so Q.E.D.

Having at least passing familiarity with an assembler language is valuable whether you use it directly or not, because it gives you a sense of what's going on "under the hood" in CPUs, GPUs, devices, etc.

As a hiring manager, I wouldn't even look at you.

As a hiring manager, I might indeed look at you as a code monkey for simple tasks, but your obvious disinterest

School taught 68HC11 assembly language, which is a great education, but poor training, as supposedly everyone does microcontrollers in C, or at least the people that talk loudly do; I dunno what people who actually write code do.

When I learned about microcontrollers in my EE course, we were first taught how to code in assembly language, and only later taught C. I think the idea is that learning assembly language can help the students think about the inner workings of the chip (i.e. moving values into registers etc.), so teaching assembly language is a good first step before moving on to C.

While that's basically correct, I feel that modern assemblers are too complex. The M68000 was the last decent assembler to learn on, but even that was too much. 7094 assembler was a much better choice, but honestly, probably the best choice is MIX. Or possibly Parrot.

Remember, you aren't learning the assembler to write productive programs; you're learning it to learn how the mechanics work, and to get an idea of efficiencies. For that, simpler languages have real advantages. MIX has the additional v

Sure. But not because they're taught. Like my first day of grad school when my supervisor told me "I heard about this language called Python. It sounds cool. You should learn it and then teach the rest of us."

Universities start teaching their students languages AFTER they become popular. Java was well established in industry and universities were still teaching Pascal as a first language (an excellent choice), then C. THEN they switched to teaching Java as an intro language. The students who first learned it wouldn't have had an effect on industry for another two to four years after that.

Languages get adopted by individuals, then get used in industry, THEN get taught to students.

As someone who took CS 101 in '98, I should tell you that Java was the language taught freshman year. I only stuck with CS for about 2-3 semesters (they were electives FYI:) -- but I'm still programming for work and leisure. Anyway, Java may have been "popular" but it was still 1.0-1.1 before Swing came out, and by that, and hindsight, I mean it was a total mess of shit that you couldn't get real work done in.

I still have nightmares about how I spent HOURS trying to figure out that a "deprecated" warning

universities were still teaching Pascal as a first language (an excellent choice)

I'm sorry, this is a little off-topic, but one of these days someone is going to have to explain to me why Pascal is such a great first language. In my experience, it's far too strict (strongly typed, rigid structures, etc.) for beginners. On top of that, it's not nearly powerful enough for experts. Why it caught on at all is completely beyond me.

Personally, I think BASIC was a far better beginner's language. You only had to wo

Universities do not and should not be teaching programming languages. They teach programming, the general practice. They teach the theory behind programming. They teach math. And they may teach "Programming Languages" as the study of the languages themselves with examples of real languages. But they don't teach "Python 101" or "Introduction to Haskell." A CS student is expected to be able to pick up whatever language needed given instruction in that general type of language (broadly imperative, functional, and logical). A given professor may require a specific language because it's convenient to have everyone working in the same language and easier to grade that way, but that need not be what the text uses for the same topics. Indeed, the majority of texts use pseudocode that isn't in any "real" programming language.

Good luck on that. Programming has become very fashion conscious in the last decade or two. Programmers have also become more technician like in that they want high demand skills only that get them a short term job quickly.

It's almost like that except they teach data structures, object-oriented programming, and other idioms that are useful to both academics and industry. They don't exactly teach you "programming languages" except maybe when you're taking compilers, and in that case it's more than the language itself, it's how to design, grammatically specify, and "compile" one language into another language (I'm taking educated guesses; I haven't taken the course but I've studied gcc).
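To make that concrete, here's a minimal sketch of the "compile one language into another" idea from a compilers course: a toy recursive-descent compiler that turns infix arithmetic into instructions for an imaginary stack machine. The grammar, function names, and instruction names are all invented for the illustration.

```python
# Toy compiler: infix arithmetic -> stack-machine instructions.
# Grammar:
#   expr   := term (('+'|'-') term)*
#   term   := factor (('*'|'/') factor)*
#   factor := NUMBER | '(' expr ')'
import re

def tokenize(src):
    # Split the source into numbers, operators, and parentheses.
    return re.findall(r"\d+|[-+*/()]", src)

def compile_expr(tokens):
    code = []
    pos = 0

    def factor():
        nonlocal pos
        tok = tokens[pos]
        pos += 1
        if tok == "(":
            expr()
            pos += 1  # consume ')'
        else:
            code.append(f"PUSH {tok}")

    def term():
        nonlocal pos
        factor()
        while pos < len(tokens) and tokens[pos] in "*/":
            op = tokens[pos]
            pos += 1
            factor()
            code.append("MUL" if op == "*" else "DIV")

    def expr():
        nonlocal pos
        term()
        while pos < len(tokens) and tokens[pos] in "+-":
            op = tokens[pos]
            pos += 1
            term()
            code.append("ADD" if op == "+" else "SUB")

    expr()
    return code

print(compile_expr(tokenize("2*(3+4)")))
# ['PUSH 2', 'PUSH 3', 'PUSH 4', 'ADD', 'MUL']
```

A real course would add a tokenizer with error handling, an explicit AST, and a formal grammar specification; the point here is just that "compiling" is a mechanical translation driven by the grammar.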

Whether gender in natural languages is a bug or a feature depends on context. In many traditional contexts it's a feature. In others it acts to reduce the error rate of a noisy signal. But there are times when it just increases the amount of information that must be transmitted, without increasing the amount of useful information. In that case it's a bug.

And also remember that the information that others deem significant in a particular context may not match that which you deem significant. So you may

This is what I don't think the "you don't need a degree" crowd gets. University instruction is teaching how to be a good programmer, not how to write code. If you think college was worthless because you didn't learn the DirectX API, consider that your university experience should have been about everything that happens before the first character of code is typed; and then some.

I agree that the "you don't need a degree" crowd often doesn't understand the foundations or theory. However, the "you need a degree" crowd goes around saying things like:

University instruction is teaching how to be a good programmer

When many of us have met countless folks with CS degrees who are horrible programmers. And don't misconstrue what I'm saying here, the foundations and theory are extremely important, I am not speaking of them with any form of sarcasm. Those who do have great comprehension of theory are all the best developers I've known.

Especially ADVANCED degrees. When I was hiring, we never found a Masters or PhD that was worth hiring. Too much theory, no real world. One person with a masters didn't know what the Start button on Windows 95 did. Another couldn't add one column to a csv in 8 hours!

True story: a friend of mine was pursuing his PhD in CS. During that time he was a TA in an object-oriented programming course. He was responsible for grading lab work (done in Java) and was telling me about various assignments he would have to mark. He told me that many of these students would find rather creative ways to complete the assignment without using any object-oriented principles whatsoever. What he was handed was akin to scripts inside a single main() method.

Uni instruction teaches you shit-all about being a good programmer, in my experience. Data structures don't take 3 years to learn. We've got a guy who's been doing part-time uni working on some of our systems; last week he learned how to use constructors. It's August. Note that I said *use*, not *write*. Not to mention when he comes back with some bullshit a lecturer seems to have just made up and we have to tell him to just forget it after the exam.

I don't even know if that is true. Of course, I took CS courses something like 15 years ago, but back then you didn't really even learn to program, you mostly learned what went into making computers work. Good coding practices or even solving issues with coding itself wasn't even really the point. You learned how computers might solve the problems, and you might then translate the solution into code using the coding language of the professor's choice.

I completely agree with this. When I went to school we were introduced a topic and a language to demonstrate the topic. The purpose of the course was to teach programming concepts, not languages. I can still remember some of my course titles (Structured Programming with Pascal, Data Structures with Pascal, Systems Programming with C, Object Orientation with C++, GUI Programming with Delphi, AI Programming with Lisp and Prolog, and so on). The languages weren't always ideal for the course (Smalltalk woul

Universities do not and should not be teaching foreign languages. They teach reading, the general practice. They teach the theory behind writing. They teach grammar. But they don't teach "French" or "German".... oh wait, no, the opposite is true, I wonder why that is?

Yeah, they told me even in CS 1 that we were only using C++ because they had to use 1 language so that everybody could use the same book and have equivalent grades. We were also told to try everything out in other languages on our own time if we could.

It's far better to teach someone, even a biologist, basic concepts and then point them towards a {$PROGRAMMING_LANGUAGE} tutorial or text than it is to teach them a particular language. I teach a Programming for Science Graduate Students (mostly neuroscience) class that does just that. I strongly encourage them to check out Python, but lots of them end up using MATLAB.

I am an electrical engineer. I learned C++ in a single course in college (my only CS course), but I later "taught" myself Java (or just enough Java to get the job done, and haven't needed it since). It is true that the fundamental skills in programming are understanding what instructions you are trying to give to the machine, avoiding typos, and debugging your code when it doesn't work. The basic premises don't change much, such as IF-THEN statements or calling out subroutines. A key skill is to define w

In theory, you should be able to give your flow chart or pseudocode to any programmer of just about any language and they should be able to write the code for you - or vice versa, if someone gives you pseudocode you should be able to code it in the language you use.
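As a sketch of that claim, here's a hypothetical scrap of pseudocode and its direct transcription into Python (the pseudocode, the function name, and the sample data are all made up for the example):

```python
# Pseudocode handed to a programmer:
#
#   set best to the first item of the list
#   for each item in the list:
#       if item is greater than best, set best to item
#   return best
#
# The same steps, transcribed line for line into Python:
def find_max(items):
    best = items[0]
    for item in items:
        if item > best:
            best = item
    return best

print(find_max([3, 41, 2, 17]))  # 41
```

The transcription into C, Java, or Pascal would look nearly identical, which is the point: the pseudocode, not the language, carries the design.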

Please show me a flow chart for a kernel, or a modern 3D game engine complete with AI. Now write it in BASIC...

Don't count on it. Most people are like me in selecting a course. They want relevant skills. If a course that might otherwise tickle my fancy requires learning B+- or Anchovy_Paste.net I'll keep looking. There's a lot of selection out there now and I have little time for picking up languages on speculation.

Yeah, like nobody ever learned LISP, PASCAL, BASIC, Eiffel, Erlang, Haskell, LOGO, or Scheme before there was an internet... Plenty of languages have flourished in academia without having broad industry support. Some exist primarily as teaching languages, others are most appropriate for domains where there's not a lot of practical economic application yet.

For fucks sake, stop with the thinly veiled advertising. We're talking about a huge penetration of languages like C, C++, Java and Perl and the like which are still going to require people capable of coding in them. This fucking online Khan Academy crap isn't going to change that, and I'll wager you dollars to donuts the whole fucking thing will collapse under the weight of insanely over-hyped promises and gimmicks.

Non programmers need to understand that the language isn't the problem. Certain autistic persons have issues formulating sentences to communicate properly to those that are well versed in communication. It doesn't matter if they learn 10 languages, if they can't convey their thoughts in one language, they aren't going to do it in another language.

Likewise, with programming, if you can't speak the language of logic, then you can't program. If you can't have the forethought to see holes in logic, then you can't program. Sure, you can write up some stuff that works. But it still isn't coherent in the grand scheme of things. The government, Universities, and corporate management seem to be stuck thinking that we just need more people that know certain programming languages.

When will they learn that programming is a shift in the thought process that a large segment of our population just can't make? Or they won't make unless we start teaching people to be logical and non-ambiguous in life...

When will they learn that programming is a shi&t in the tho#ght process that a large segment o& o#r pop#lation j#st can't ma@e??

Most people can realize what can be automated. That's why most people don't like repetitive tasks, that they know could be automated. That's how a lot of so called progress has happened.

But you're forgetting that most things don't need to be perfect to work. If an automatic sorting machine just does an OK job, that might be enough, considering that a human might not be able to do any better at judging whether some apple is red enough or not. That's a problem case where there is no definite logic on what's passable.

Khan Academy is great, but I agree with you that it isn't going to change the landscape of programming languages. Online language courses are everywhere, and have been for as long as the internet has been available to the masses.

"Most people can realize what can be automated. That's why most people don't like repetitive tasks, that they know could be automated. That's how a lot of so called progress has happened."

My experience is exactly the opposite. When I'm in an intro computer course and say, "If you ever find yourself doing a repetitive task on a computer, then you're using it wrong; try to find a hotkey, or a script, or a batch process, or think if you can program something to do it instead", they look at me like I have two heads.
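For what it's worth, a minimal sketch of the "script it instead" advice: zero-padding a pile of scan files so they sort correctly, instead of renaming them one by one in a file manager. The file names and directory are hypothetical; only the standard library is used.

```python
import os
import tempfile

def zero_pad_scans(workdir):
    """Rename scanN.txt to scan00N.txt in workdir; return the new listing."""
    for name in sorted(os.listdir(workdir)):
        if name.startswith("scan") and name.endswith(".txt"):
            number = name[len("scan"):-len(".txt")]
            os.rename(os.path.join(workdir, name),
                      os.path.join(workdir, f"scan{int(number):03d}.txt"))
    return sorted(os.listdir(workdir))

# Demo on a throwaway directory holding three fake scans:
with tempfile.TemporaryDirectory() as d:
    for i in range(1, 4):
        open(os.path.join(d, f"scan{i}.txt"), "w").close()
    print(zero_pad_scans(d))  # ['scan001.txt', 'scan002.txt', 'scan003.txt']
```

Ten files or ten thousand, the script costs the same five minutes, which is exactly the intuition the intro-course students haven't built yet.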

"getting traction for a new programming language was next to impossible. First, one needed a textbook publishing deal. Then, one needed a critical mass of CS profs across the country to convince their departments that your language was worth teaching at the university level. And after that, one still needed a critical mass of students to agree it was worth spending their time and tuition to learn your language."

That is not the way it was. I've been programming professionally since the 1970s. We didn't go to school to learn a programming language. If you took classes it was to learn techniques and concepts. Picking up a new language is a trivial thing. Taking a course on a language does not make you a programmer. Language is merely a way to communicate with the computer. New languages and development environments come and go. Good programmers persist and pick up new languages easily to do the tasks needed.

Nobody will learn a new language unless it offers a big advantage over the existing popular languages. In the last 2 decades, that has meant having a particularly useful library or framework (such as CGI for Perl or Rails for Ruby). Why else would anybody invest the time? New languages are a dime a dozen (actually, that's too generous).

I'd add new paradigms as a third reason. OO seems to have driven a lot of language choice. I'm betting functional is going to make a splash in the future; after all you can't spell functional without "fun". Either that or massive parallelism is going to force Erlang down our throats like it or not.

If a new paradigm doesn't hurt, you're doing it wrong. I admit that years ago when I was just starting ruby my code looked a whole hell of a lot like perl, because I was doin it wrong. I'm old enough to remember the annoying switch to, and then mostly away from, OO.

Nobody will learn a new language unless it offers a big advantage over the existing popular languages.

I don't think that's true. It just has to be different enough.

The thing is after you use a language for a while you know it's flaws. It's at that juncture that some other language can come along and capture your fancy, all it has to do is address those flaws you find most annoying in a saner way.

The frameworks are kind of a precondition these days though, if you try to work a string over and encounter pai

With Facebook seemingly half-populated by bots, are these numbers thrown around by these "online universities" really a reliable source? And how many "certified" IT people have you dealt with who were totally clueless?

A person who learned to program a computer should be able to use, after a time, any language that is currently in the top 5, that is C, Objective-C, Java, C++, C#. All of these share a common underlying philosophy and basic technique. C is by far the simplest of these, but also the most basic. Of course most of the time the language is not the thing. Rather, the API is where the heavy lifting is done, and in fact over the past 10 years or so it has become a barrier to entry. One has to know how to use .NET, or the interfaces for iOS or Android. 30 years ago the APIs were not this arbitrarily complex, but also we were not doing threading or complex UI. Specifically, when using an API one has to define the solution to the problem in terms of the API, not the language. These languages account for perhaps 2/3 of the computer applications.

The second-tier stuff is most useful for RAD: Visual Basic, Python, Perl, PHP, Ruby. These are mostly scripting languages, and require a slightly different approach. The solution is defined in terms of the capability of the language and the available scripts. This is particularly true with Ruby. These are languages that meet specific requirements for specific purposes. For instance, PHP and Ruby are what one uses to write a website. Python is quite popular for home-grown science applications.

Which is to say that anyone trying to promote a language because it is what they know, rather than because it is what is used to solve a particular problem, is like a person trying to get their boss to buy a lathe for the server room because they really need a lathe for home projects. I would not try to script a website with C. I would not try to write a data analysis program in assembly. The computers are simply too fast, and we have had 40 years of development of tools, which means we do not need to spend a quarter of a million dollars rewriting a GUI. This has always been true. In the '80s we used FORTRAN for number crunching because that was the only language supported by IMSL. We used C for everything else because it ran on everything else.

So online learning is only going to teach students how to use useless tools. Yes, I would like to teach people how to use Forth, but what is the point? We can teach students how to use Shakespeare, and it would teach them techniques they need to know and would be very motivating for certain students, but where would they use it? Once a student is proficient at programming and understands the basic concepts, time needs to be spent on learning how to efficiently acquire API knowledge.

Some people like to talk about computing without knowing its history. How did this make it to the /. front page?

"Back in the day, getting traction for a new programming language was next to impossible. First, one needed a textbook publishing deal.

Yeah, because COBOL and FORTRAN only took off after a mass of publishers got on it. Riiiiight.

Then, one needed a critical mass of CS profs across the country to convince their departments that your language was worth teaching at the university level.

Counter-examples: COBOL, FORTRAN, C, Java (the latter two only took off after the industry was using them aplenty.)

And after that, one still needed a critical mass of students to agree it was worth spending their time and tuition to learn your language. Which probably meant that one needed a critical mass of corporations to agree they wanted their employees to use your language.

Where the hell do you get this stuff? Are you still in school or something?

It was a tall order that took years if one was lucky, and only some languages — FORTRAN, PL/I, C, Java, and Python come to mind — managed to succeed on all of these fronts.

FORTRAN took off because it was the best thing at the time for programming (much better than COBOL.) Java took off without the need of publishers or academia. It was simply taken by the industry. Python hasn't taken off (I love the language, but its usage is nowhere near Java or C#.)

But that was then, this is now.

You don't know what was "then". I doubt you know what it is "now".

Whip up some online materials, and you can kiss your textbook publishing worries goodbye.

What does this even mean?

Manage to convince just one of the new Super Profs at Udacity or Coursera to teach your programming language, and they can reach 160,000 students with just one free, not-for-credit course.

Yeah, because it will be as easy as it was before, right, right, right? Let's build a pyramid of hypotheticals!!!!

And even if the elite Profs turn up their nose at your creation, upstarts like Khan Academy or Code Academy can also deliver staggering numbers of students in a short time.

Yeah, because if up-start elite professors at Udacity or Coursera turn up their noses at your pet project, Khan will surely pick it up. Khan!!!!!!!!

In theory, widespread adoption of a new programming language could be achieved in weeks instead of years or decades, piquing employers' interest.

Because businesses rely on internet popularity and nothing else when investing in effective technology.

So, could we be on the verge of a programming language renaissance?

I didn't know we were in a programming language dark age.

Or will the status quo somehow manage to triumph?"

Somehow this reminds me of Dora the Explorer when she stares at the audience waiting for an answer.

Unless the language adds something revolutionary or is very domain-specific, we don't really need any more widely used programming languages. What we do need is more libraries, frameworks, and APIs for existing languages. Preferably, they would be open source or at least have open specifications so that an open source version can be made. Also, not all problem domains warrant their own language.

Good grief man! One of the more popular languages around these days is Objective-C! Would you have thought THAT possible ten years ago?

Look at StackOverflow, brimming with questions about Ruby or Python or PHP or Scala.

Look at alternative databases in wide use today that do not use SQL.

Your renaissance has already arrived, any language that has some good practical use does not need a course to gain adoption, just a tag in StackOverflow and a handful of fervent believers to evangelize the use of it.

On a side note, it's depressing how many dour replies you got right out of the gate. There was a time when futurists were a healthy part of Slashdot; now we are scorned and ridiculed. It hardly matters, though, since we are generally right in the end, so keep the spirits up.

No. Programming languages need two things to become mainstream. First they need a very extensive library of support such as windowing, network, and about 50 other topics. Second they need a compelling reason to use the language itself. The compelling reason could be that the language is so nifty or elegant that it is worth the effort. In procedural languages it is hard to imagine anything better than what we have. In non-procedural languages there may be some new ideas yet to be thought of. Another com

Sure, online sources mean a lot of people will get to write "Hello World\n" in many different languages - but so what?

Most of those languages will wither on the vine: there is no widespread support for them, no major pieces of software are written in them, and the skills base is so dilute (10 million "users" spread across 7 billion people? Sounds like homeopathic programming, even if they are all connected on the internet) that it's in no employer's interest to invest in them.

Programming languages become popular when they come attached to something else that is already popular, and for reasons independent of the languages themselves. In a nutshell: connecting to operating system facilities which are connected to popular hardware.

If Apple iPhones were programmed in object COBOL, the language would be popular. And after all, few people used Objective-C outside NeXTSTEP/MacOS/iOS. If A

Seriously. Considering the amount of bitching, griping, moaning and whining I've seen about businesses failing to move to new operating systems and carrying around large amounts of legacy code, it doesn't appear that there's a pent-up demand for brand-new languages. The OP seems to be operating under the assumption that "if you build it, they will come" when it comes to programming languages, but the real world seems to be of a different opinion.

We can only hope. Aren't we already getting a language a week, some FORTRAN with capricious syntax changes, others FORTRAN with horrific kluges grafted on? (actually, all with capricious syntax changes).

I don't know. Maybe if there were some languages that broke new ground in terms of data abstractions, control flow, basic concepts of how to program, etc., there might be some reason to adopt them.

The last twenty years of language design have simply been a rehash of the twenty years before that. There hasn't been anything interesting out of the programming language world since CLOS, with its multi-methods and MOP, back in the early eighties. Maybe Erlang's process model from the mid-eighties. And the academic p

...one still needed a critical mass of students to agree it was worth spending their time [and tuition] to learn your language.

Okay, we can remove all barriers except this one. How can you convince a critical mass of people to agree that it is worth spending their time to learn your language? Investing time is really more critical than all the others put together.

Why doesn't a critical mass of people spend time learning Tuvan throat singing? Because they aren't interested and/or they don't think it's worth their time. Why do many people spend their time learning English? Because they perceive that it is worth their time.

If the survival of the relational database in the face of better alternatives is any indication, people are simply stuck in their old habits, and none more so than corporations. Of course, corporations also want to know their technology will be supportable in the decades to come.

Do we actually need a renaissance in new programming languages? We've already got a ton of interesting ones which have never been widely used: Scheme, Erlang, Haskell, Sather, and who knows how many more. When we start seeing these widel

First, because it has never been this easy to publish a book. On Amazon and Lulu, one can submit pretty much anything and sell it, no matter how crap it is. Among more traditional publishers, houses like Versita and De Gruyter have options for publishing peer-reviewed, high-quality books essentially for free (taking payment from the sales instead of from the author in advance). On Versita, one can even make the PDFs freely accessible while charging for a printed copy.