Thursday, May 7, 2009

1801 - Joseph Marie Jacquard uses punch cards to instruct a loom to weave "hello, world" into a tapestry. Redditors of the time are not impressed due to the lack of tail call recursion, concurrency, or proper capitalization.

1842 - Ada Lovelace writes the first program. She is hampered in her efforts by the minor inconvenience that she doesn't have any actual computers to run her code. Enterprise architects will later relearn her techniques in order to program in UML.

1936 - Alan Turing invents every programming language that will ever be but is shanghaied by British Intelligence to be 007 before he can patent them.

1936 - Alonzo Church also invents every language that will ever be but does it better. His lambda calculus is ignored because it is insufficiently C-like. This criticism occurs in spite of the fact that C has not yet been invented.
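(Editor's aside: for the skeptics, here's a minimal sketch — not from the original post — of Church numerals rendered in Python, just to show that the lambda calculus really can compute with nothing but functions. All names here are illustrative, not standard library API.)

```python
# Church numerals: numbers encoded as "apply f this many times".
zero = lambda f: lambda x: x                       # apply f zero times
succ = lambda n: lambda f: lambda x: f(n(f)(x))    # apply f one more time
add = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))  # m + n applications

def to_int(n):
    """Convert a Church numeral back to an ordinary integer."""
    return n(lambda k: k + 1)(0)

two = succ(succ(zero))
three = succ(two)
print(to_int(add(two)(three)))  # prints 5
```

Insufficiently C-like, perhaps, but it runs.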

1940s - Various "computers" are "programmed" using direct wiring and switches. Engineers do this in order to avoid the tabs vs spaces debate.

1957 - John Backus and IBM create FORTRAN. There's nothing funny about IBM or FORTRAN. It is a syntax error to write FORTRAN while not wearing a blue tie.

1958 - John McCarthy and Paul Graham invent LISP. Due to high costs caused by a post-war depletion of the strategic parentheses reserve LISP never becomes popular[1]. In spite of its lack of popularity, LISP (now "Lisp" or sometimes "Arc") remains an influential language in "key algorithmic techniques such as recursion and condescension"[2].

1970 - Guy Steele and Gerald Sussman create Scheme. Their work leads to a series of "Lambda the Ultimate" papers culminating in "Lambda the Ultimate Kitchen Utensil." This paper becomes the basis for a long-running but ultimately unsuccessful run of late-night infomercials. Lambdas are relegated to relative obscurity until Java makes them popular by not having them.

1972 - Dennis Ritchie invents a powerful gun that shoots both forward and backward simultaneously. Not satisfied with the number of deaths and permanent maimings from that invention he invents C and Unix.

1972 - Alain Colmerauer designs the logic language Prolog. His goal is to create a language with the intelligence of a two year old. He proves he has reached his goal by showing a Prolog session that says "No." to every query.

1973 - Robin Milner creates ML, a language based on the M&M type theory. ML begets SML which has a formally specified semantics. When asked for a formal semantics of the formal semantics Milner's head explodes. Other well known languages in the ML family include OCaml, F#, and Visual Basic.

1980 - Alan Kay creates Smalltalk and invents the term "object oriented." When asked what that means he replies, "Smalltalk programs are just objects." When asked what objects are made of he replies, "objects." When asked again he says "look, it's all objects all the way down. Until you reach turtles."

1983 - In honor of Ada Lovelace's ability to create programs that never ran, Jean Ichbiah and the US Department of Defense create the Ada programming language. In spite of the lack of evidence that any significant Ada program is ever completed historians believe Ada to be a successful public works project that keeps several thousand roving defense contractors out of gangs.

1983 - Bjarne Stroustrup bolts everything he's ever heard of onto C to create C++. The resulting language is so complex that programs must be sent to the future to be compiled by the Skynet artificial intelligence. Build times suffer. Skynet's motives for performing the service remain unclear but spokespeople from the future say "there is nothing to be concerned about, baby," in an Austrian-accented monotone. There is some speculation that Skynet is nothing more than a pretentious buffer overrun.

1986 - Brad Cox and Tom Love create Objective-C, announcing "this language has all the memory safety of C combined with all the blazing speed of Smalltalk." Modern historians suspect the two were dyslexic.

1987 - Larry Wall falls asleep and hits Larry Wall's forehead on the keyboard. Upon waking Larry Wall decides that the string of characters on Larry Wall's monitor isn't random but an example program in a programming language that God wants His prophet, Larry Wall, to design. Perl is born.

1990 - A committee formed by Simon Peyton-Jones, Paul Hudak, Philip Wadler, Ashton Kutcher, and People for the Ethical Treatment of Animals creates Haskell, a pure, non-strict, functional language. Haskell gets some resistance due to the complexity of using monads to control side effects. Wadler tries to appease critics by explaining that "a monad is a monoid in the category of endofunctors, what's the problem?"
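(Editor's aside: a rough sketch — not from the original post — of what the monad fuss is about, approximated in Python with a toy Maybe type. `unit` and `bind` are illustrative names for the two monad operations; `bind` is the "programmable semicolon" that sequences steps and short-circuits on failure, which is roughly how Haskell tames side effects.)

```python
def unit(x):
    """Wrap a plain value into a successful Maybe."""
    return ("Just", x)

def bind(m, f):
    """Feed a Maybe into the next step; Nothing aborts the whole chain."""
    return f(m[1]) if m[0] == "Just" else ("Nothing",)

def safe_div(a, b):
    """A step that can fail: division guarded against zero."""
    return unit(a / b) if b != 0 else ("Nothing",)

# (10 / 2) succeeds; dividing the result by 0 fails cleanly
# instead of raising ZeroDivisionError.
ok = bind(unit(10), lambda x: safe_div(x, 2))   # ("Just", 5.0)
bad = bind(ok, lambda x: safe_div(x, 0))        # ("Nothing",)
print(ok, bad)
```

What's the problem, indeed.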

1991 - Dutch programmer Guido van Rossum travels to Argentina for a mysterious operation. He returns with a large cranial scar, invents Python, is declared Dictator for Life by legions of followers, and announces to the world that "There Is Only One Way to Do It." Poland becomes nervous.

1995 - At a neighborhood Italian restaurant Rasmus Lerdorf realizes that his plate of spaghetti is an excellent model for understanding the World Wide Web and that web applications should mimic their medium. On the back of his napkin he designs Programmable Hyperlinked Pasta (PHP). PHP documentation remains on that napkin to this day.

1995 - Yukihiro "Mad Matz" Matsumoto creates Ruby to avert some vaguely unspecified apocalypse that will leave Australia a desert run by mohawked warriors and Tina Turner. The language is later renamed Ruby on Rails by its real inventor, David Heinemeier Hansson. [The bit about Matsumoto inventing a language called Ruby never happened and better be removed in the next revision of this article - DHH].

1995 - Brendan Eich reads up on every mistake ever made in designing a programming language, invents a few more, and creates LiveScript. Later, in an effort to cash in on the popularity of Java the language is renamed JavaScript. Later still, in an effort to cash in on the popularity of skin diseases the language is renamed ECMAScript.

2003 - A drunken Martin Odersky sees a Reese's Peanut Butter Cup ad featuring somebody's peanut butter getting on somebody else's chocolate and has an idea. He creates Scala, a language that unifies constructs from both object oriented and functional languages. This pisses off both groups and each promptly declares jihad.

Objective C is nice but it's not considered an influential language. It's only used on the Apple platform and nowhere else. It doesn't have anything that could inspire a new language designer, because in the end its object model is just a light Smalltalk layer over C.

1968: Chuck Moore gets a sweet gig at the National Radio Astronomy Observatory. Realising his poor hygiene and complete inability to speak any human language would inevitably lead to unemployment, he invents Forth, a language interpreter with no features at all and even less syntax than Lisp, in which he rewrites all the observatory's software in a week so that it runs ten times as fast and is totally unmaintainable by anyone but him. Geeks surprise him by taking the language and running with it, making N squared incompatible implementations where N = the number of users. They do this because Star Trek: The Next Generation is still ten years in the future and nobody has invented the Klingon language yet.

PHP: Began as a simple way to include a few elements in a web page. Syntax slowly evolved using a sophisticated voting system by the masses and also by tacking features on in a way described as "willy nilly". Later, Perl developers flocked to PHP, dissatisfied with the complexity of Perl. They solved this problem by adding millions of functions that all followed different naming conventions and did not always do what the name suggested. For instance: serve_up_a_flickr_website(); actually created a Facebook-like site. Objects were introduced late in the game but go largely unused because there are functions that'll do just about anything in PHP.

I think we should punish the Smalltalk guy for inventing an inferior and disturbing paradigm. We should also punish Stroustrup and Gosling for popularizing it. Because of them, millions of programmers have suffered and thousands have killed themselves.

Should've had something about Java being invented because "C++ is just too HARD, and omigod, did you see Billy Wilson, he is SOOOO cute and dreamy!!!" (All said while cracking gum and twirling that lock of hair with an index finger....)

Awesome! But there's a bunch of easy pickings for real stupidity you missed: RPG, XML, CSS (which should be banned), Awk, Bash/sh, Delphi, sendmail.cf (which in itself might qualify as a language), and of course GROOVY! Which is just itchin' to be poked at. C'MON, the 'eggs' in Python are cracked, Ruby gems don't sparkle unless they run on the JVM, and what about assembly? Didn't we all write for the Motorola 6800?

The comment on PHP is lacking the beginning: In 1994, a guy named Rasmus Lerdorf, who hates programming, printed out an HTML page and shook the printout so hard that pieces of it fell off (this is known as the "shaken page syndrome"). Then he let his dog chew on the detached pieces. He glued those pieces back and typed the result into the computer.

Previously mentioned: you missed Forth, which has led many to a Fifth to help forget about it, and to taking the Fifth to deny all knowledge. Only to be perverted into PostScript and then PDF. Also, it is of particular celestial importance, controlling telescopes and the Sun (the Sun monitor is Forth).

James Gosling invented Java in the spring of 1991, the general idea being that he disliked C++ and everyone else disliked his previous effort, Object-Postscript [from NeWS], because of the RPN syntax, so he rewrote the NeWS interpreter with a C++-like syntax.

If you haven't seen it yet, Dick Gabriel and Guy Steele have a talk called "50 in 50", where they discuss 50 programming languages in 50 minutes. I think they originally wrote it for the History of Programming Languages conference, but they've since done it at OOPSLA and JAOO. I found a video for the JAOO one here: http://blog.jaoo.dk/2008/11/21/art-and-code-obscure-or-beautiful-code/

Anonymous wrote:
> Objective C is nice but it's not considered an influential language.

Unless you consider that *millions* of iPhones are running it, and you can literally make hundreds of thousands of dollars in a few weeks if you use Objective C to write the next popular app in the iPhone Store.

But aside from the millions of people using Objective C on their iPhones every day around the world, nobody really uses it.

1986 - Joe Armstrong implements Erlang which is later open sourced. Erlang is the first language to properly execute a bullet-proof thread safe hot updating seamlessly distributed super-scalable VM which is destined to rule the web, but it is the very last language to achieve x86 SMP support. Unfortunately this is not a joke.

1967 - IBM forges PL/I in the halls of Armonk to reinforce their stranglehold on the computing industry. Despite its promising future, the language is banned under the Strategic Arms Limitation Treaty of 1972.

1986 - Bertrand Meyer founds the first Church of Eiffel based on the principles of object-orientation, design by contract, and flagellation.

0000 - and GOD said let there be Light, and Assembly language sprang forth from the blackness and GOD looked upon what he had created and it was good. And then the darkness spat forth the curse of high level languages to entangle the good and drag it into the pit.

1999 - Dissatisfied with the lack of portability and abstraction in hardware design, a consortium of Electronic Design Automation companies create SystemC, allowing software engineers to design hardware and EDA companies to make lots of money selling separate compilers for CMOS and TTL based designs. Manufacturers later complain of reduced yields because of debug symbols not being stripped, increasing die sizes.

What about occam? Invented in the early 1980s for parallel programming. Named after William of Ockham, a 14th century monk famous for Ockham's Razor (at that time razors hadn't been invented, so a razor was a witty saying): "entia non sunt multiplicanda praeter necessitatem" - in English, this is "entities should not be multiplied beyond necessity" and in American, KISS (Keep It Simple Stupid). KISS was exactly what the language was, so you could actually get things done in it. For most programmers, delivering a result was far too innovative an idea, so occam failed to catch on. If it had, we would have Windows 8 by now and it would be running like greased lightning on today's 4 core processors and would need 1G RAM to boot up.

1988: In a daring escape from Hungary, John Ousterhout defects to the West. He brings with him the Soviets' own version of Lisp. They used the cheaper square bracket [ instead of parens (, and everything is a string and there are no macros, but it's not too bad for a commie knockoff.

Let's not forget the precursors to Scheme, besides LISP and Lambda Calculus. Without the need to teach computers to understand children's stories and play with children's blocks, Hewitt, Winograd, Sussman, and so on would not have first needed to invent PLANNER, then CONNIVER, ultimately leading to what rightly should have been called SCHEMER. PLANNER was especially good at writing programs that enabled robots to efficiently fail in all possible ways, without repeating themselves (unlike humans attempting the same tasks, who typically repeat the same failing strategies at random and ad nauseam, showing that robots are indeed smarter than humans).

Late 1960's, early 1970's - Seymour Papert at MIT, with assistance from Wally Feurzeig at BBN as well as various grad students, invents LOGO, on the belief children could be tricked into actually thinking correctly if they learned LISP but it was disguised to look more like BASIC or FORTRAN syntax, given the world-wide shortage of parentheses and the need to reassure mathematics teachers that precedence of operators was something students should guess about. (Larry Tesler also tried this strategy on grownup programmers, by disguising Scheme as Dylan.)

(This isn't really a fair characterization of any of the above, of course!) -- One of the co-conspirator grad students

2009: Simon Jackson suggests that all instance variables should be method local to make assignment focused in one method. All languages decide this is a bad idea, and snoop bus logic blooms to exceed the size of a silicon wafer.

1990 - Ken Iverson and Roger Hui decide that APL is too verbose and by removing all variables create tacit J. Attempts to further reduce programming keystrokes may have been successful, but vanish into the empty spaces between words in blog posts.

Great write-up :-) ... but I think you're a few years late on the Java one. Even though it was distributed outside of Sun for the first time in 1995, I believe it was around for multiple years beforehand.

2003 - Giancarlo Niccolai decided there were not enough programming languages around, so he invented Falcon (from an earlier language named HASTE -- Haste Advanced Simple Text Evaluator). Instead of focusing on one paradigm, Falcon was built on the idea of being a poser. Niccolai created Falcon after he tried to market a more advanced version of HASTE called the Haste Advanced Text Evaluator (HATE), which didn't prove popular with many IT shops.

I am pressing my thighs together in happiness. All my years of obsessive, insular study, and the sacrifice of a well-rounded personality in favour of a detailed grasp of programming paradigms, suddenly became worthwhile.

> There is some speculation that Skynet is nothing more than a pretentious buffer overrun.

A correction and additions to this entry: Skynet is a buffer overrun that began to learn at a geometric rate. It became self-aware at 2:14 a.m. Eastern time, August 29th, 1997. Thankfully, the Windows 98 host went BSOD a millisecond later. Unthankfully, in Redmond, Washington, USA, deranged COM-oriented mutants survived the fallout. Their unnecessarily long reign of terror was enforced by their ability to cause derangement and confusion in the human populace.

So much brilliant stuff in this, hats off to you, sir. The C++ entry alone... And the novelty of C#.

By the way, notice how the C# spec claims C++ as an influence but doesn't mention Java, and the Java spec claims Smalltalk as an influence but not C++, and C++ claims Simula as an influence but not Smalltalk. Every language[n] seems to be influenced by that great language[n-2] but swears it has never heard of that pathetic language[n-1].

Please cover Intentional Programming, Eiffel, Forth!

Regarding Alan Turing, British Intelligence not only made him 007, they also gave him boobs, so if James Bond movies are accurate, he probably never had to leave the house.

Wouldn't the "Larry Wall falls asleep and hits Larry Wall's forehead on the keyboard. Upon waking Larry Wall decides that the string of characters on Larry Wall's monitor isn't random but an example program in a programming language that God..." one more accurately describe APL?

June 2005 - Reddit is launched, leading to the imminent creation of RedditScript. RedditScript is a statically-dynamically typed, type-inferred, purely-functional, stateful language, and the first in the line of temporo-morphic languages. Each week, RedditScript changes its features, nom de guerre, and implementation, leading to new arguments regarding the validity of it for production code and the nature of the productivity gains its proponents claim. Recent and oft-repeated noms de guerre include: Haskell, Erlang, Clojure, Scala, and lisps of all kinds, depending on the week. The Truth is that RedditScript is ALWAYS the best language for any job (for evidence, just see http://programming.reddit.com)

1995 - Rasmus Lerdorf is born, fulfilling an ancient apocalyptic prophecy. As it is written:

He that letteth is taken out of the way, and yet we do not realize that the Antichrist is near. Behold, for PHP is born amongst you! "Woe unto Perl", the prophet Larry Wall cries, for savage tribes in countless numbers shall overrun all parts of this God-fearing land. Lo, for these tribes of PHP shall grow in multitude and shall lay waste upon the land of the Internet. The end is nigh!

great! BTW, object-oriented programming was invented in 1967. In an attempt to liberate the workers, Kristen Nygaard and Ole-Johan Dahl created Simula 67, a Java-like language with Algol syntax. It is not known if the first Simula program had yet finished execution by the time Alan Kay and Bjarne Stroustrup decided this was almost exactly what they wanted, a decade or so later.

I think you forgot Plankalkül by Konrad Zuse, which was published in 1948 but designed between 1943 and 1945. The design was all the more difficult because he had no other invented languages to look at. Zuse also designed the language for his first computer, which only adds to the achievement.

So in the end he made the first high-level language at least 9 years earlier than Fortran. Maybe it would be good to put him on your list :-)

I believe you all are confusing scripting languages with interpreted programming languages. A scripting language is typically used to write extensions to and/or control specific programs. Programming languages like Perl and PHP are very much general-purpose programming languages even if they don't result in a native compiled executable. Perl at one time might have been a scripting language, but it grew beyond that a long while ago. Off the top of my head, things like BASH or Javascript would be good examples of scripting languages... though Javascript is getting developed to the point where it is beginning to be arguable.

From Wikipedia: "The Plankalkül was eventually more comprehensively published in 1972 and the first compiler for it was implemented in 1998."

In other words Plankalkül drowned in obscurity for close to 30 years and wasn't implemented for over 50. While it may still technically be a milestone, seeing as it had zero impact on computer science I'm not sure that I can agree that it belongs on the list -- other than to make fun of it as perhaps the ultimate fail.

Actually Dijkstra hand-wrote most of his later articles -- abandoning even the typewriter.

One omission from here is CPL (date "early 60's"?), a theoretical language invented by Christopher Strachey and which gave birth to BCPL (Basic CPL), which presaged C (via B -- oh well). The first compiler for CPL was never written, since it was decided to start the implementation after the formal specification of CPL was completed and proven correct.