Yes, discoverer. Lisp is programming. And programming is math. Math is all around us... in the tree, the rock. Math surrounds us and binds us all together. Does this mean Lisp obeys the programmer? Partially, but the will of the math works through the programmer as well.

For instance, to describe how the real numbers are somehow "complete" and contain not only algebraically calculable numbers but also transcendental numbers, there were different ideas floating around, which led different mathematicians to invent different approaches to describe this "completeness". We have Bolzano's and Weierstrass's approach (bounded sequences and convergent subsequences), we have Cauchy sequences, and we have Dedekind cuts. All three were not discovered, but invented.

How about "author" of Lisp? One could even say "creator". "Inventor" implies that Lisp is an invention which could be protected under patent law, while "author" or "creator" implies that it could have been protected under copyright law.

I think you mean creator or inventor. It's not like the Lisp programming language was just sat out in the wilds of Chile under a rock waiting to be found by an archaeologist.

He was an old time computer scientist, publications with titles like "A basis for a mathematical theory of computation". Hard core math.

Philosophically, you don't "create" or "invent" math; you discover it. Logical concepts exist independently of who wrote a paper about them first. Take two 256-bit random prime numbers, multiply them, and you have not "created" or "invented" the result but merely discovered it, or, rephrased, discovered its two factors.
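The point can be sketched in a few lines of Python, with toy-sized primes standing in for the 256-bit ones (the `factor` routine is a naive trial division written purely for illustration):

```python
# Two small primes stand in for the 256-bit ones in the argument above.
p, q = 61, 53
n = p * q  # n's factorization existed the moment n did

def factor(n):
    """Naive trial division: 'discover' the smallest prime factor."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return n, 1

print(factor(n))  # (53, 61) -- rediscovered, not created
```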

Maybe, but I do not think that Euclid sat down and invented his postulates. More likely he sat down and observed various shapes and geometric properties in the world around him, and then formalized what he had observed (or perhaps many people had done so over a period of time, and Euclid wrote down the formal notions that had developed).

It is true that some areas of mathematics overlap pretty well with observation, and many of those areas had a body of math that (historically speaking) was definitely discovered empirically attached to them long before some more abstract axiomatic system special-cased them. We only have the historical record needed to prove it for some of them; but it seems overwhelmingly likely that people were using math well before they would have recognized the notion of "math", and limited capabilities for processing cer

Fair enough, but the post I was replying to seemed to claim as a matter of fact that mathematics is discovered. I was merely pointing out that the debate is still active, and that defending the "McCarthy invented Lisp!" statement needs to be backed up by more than "Philosophers say that math is discovered!"

Not quite... just looking at Wikipedia for interpretations of quantum physics will give you an idea of how much disagreement there can be between scientists, or of how small the world really is: I have an acquaintance whose wet dream is busting string theory, something he says is "delusional". They're all scientists with scientific knowledge, yet they all disagree :).

Take two 256-bit random prime numbers, multiply them, and you have not "created" or "invented" the result but merely discovered it, or, rephrased, discovered its two factors.

Ah, but if you come up with a novel way to generate those random numbers, along with a novel way to store their representations for future use, then you've invented something and not merely discovered it.

Ah, but if you come up with a novel way to generate those random numbers, along with a novel way to store their representations for future use, then you've invented something and not merely discovered it.

Consider, as a counterexample, the FFT multiplication algorithm. It is based on the observation that integer multiplication involves computing a convolution, and that the pointwise product in the frequency domain is equal to convolution in the "time" domain. The algorithm is only an "invention" if the mathematics that underlie it were "invented", and so this just returns to the question of whether or not mathematics itself is invented or discovered.
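For the curious, the observation can be sketched in Python. This toy version does the digit convolution directly in O(n²); the FFT algorithm computes exactly the same convolution via pointwise products in the frequency domain, in O(n log n). Function names here are mine, for illustration:

```python
def digits(n):
    """Little-endian base-10 digits of n."""
    return [int(c) for c in reversed(str(n))]

def convolve(a, b):
    """Direct convolution; FFT multiplication produces this same
    sequence via an inverse transform of a pointwise product."""
    out = [0] * (len(a) + len(b) - 1)
    for i, x in enumerate(a):
        for j, y in enumerate(b):
            out[i + j] += x * y
    return out

def conv_multiply(x, y):
    # carrying the convolution back into base 10 yields the product
    total, place = 0, 1
    for d in convolve(digits(x), digits(y)):
        total += d * place
        place *= 10
    return total

print(conv_multiply(1234, 5678) == 1234 * 5678)  # True
```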

Surely some parts of mathematics are discovered. I don't think e^(iπ) = -1 was invented, for example. Axioms and the proofs might be invented, but the truths of those axioms and theorems are discovered. At least for any reasonable definition of discovery.
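(A quick numerical sanity check in Python, for anyone who wants to watch the identity hold to floating-point precision:)

```python
import cmath

z = cmath.exp(1j * cmath.pi)  # e^(i*pi)
# floating point leaves only a ~1e-16 imaginary residue
assert abs(z - (-1)) < 1e-12
print(z)
```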

If there is any "discoverer" involved in this, it's Alonzo Church when he invented (or you could argue, found) the Lambda Calculus [wikipedia.org]. That's hardcore math. All of the functional languages, including Lisp, are inventions based upon his work.

I think you mean creator or inventor. It's not like the Lisp programming language was just sat out in the wilds of Chile under a rock waiting to be found by an archaeologist.

Actually Lisp is just one of the many languages heavily influenced by Lambda calculus [wikipedia.org], which was introduced by Alonzo Church back in the 1930s and 40s. Back then Lambda calculus was just another system in mathematical logic that only a few mathematicians and logicians knew or cared about. So in a sense John McCarthy did find it under a rock, although not in the wilds of Chile but rather in a scientific paper.

The idea is that he discovered Lisp could be assembled from seven primitive operators [paulgraham.com], from which the rest of the language could be built. Though I agree that "discoverer" is a bit of a stretch.
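Those seven operators are quote, atom, eq, car, cdr, cons, and cond. A rough Python sketch of five of them (quote and cond are special forms, so they only make sense inside an evaluator; modeling lists with Python lists is my own simplification):

```python
def atom(x):
    # in classic Lisp the empty list counts as an atom
    return not isinstance(x, list) or x == []

def eq(a, b):
    return atom(a) and atom(b) and a == b

def car(x):  return x[0]
def cdr(x):  return x[1:]
def cons(a, d):  return [a] + d

# derived operators fall out of the primitives, e.g. null:
def null(x):
    return eq(x, [])

print(car(cons('a', ['b', 'c'])))  # a
```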

I think you mean creator or inventor. It's not like the Lisp programming language was just sat out in the wilds of Chile under a rock waiting to be found by an archaeologist.

Actually it was found in a cave in the Pyrenees. LISP originally stood for Lost In Spanish Passageways. It was used by early cave men for catching fish. They drew it on the walls, carefully concealing the syntax in pictures of Aurochs, and it remained totally undeciphered for approximately 200,000 years. John McCarthy wandered into a cave after having eaten some soup made from a prehistoric fungus that grows in the area. He was found days later practising tai chi in a nearby stream and went on to write the first modern day LISP interpreter.

Didn't you know? Lisp came about much the same way the Queen's English came about in Great Britain. Soon after, everyone was mandated to take a course on Lisp, further separating us from those people who argue against its usefulness in the computing world.

Lisp can be viewed as a fancy variant of combinator logic, which is a mathematical model of computation. If you believe that mathematics is discovered, then in some sense Lisp was discovered. This may seem a bit contrived, since one could argue that a C program is a fancy way of expressing a Turing machine, although Lisp is a little closer to its theoretical underpinnings than C is.
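Combinator logic itself fits in three lines; here is a sketch in Python, where the S and K combinators are enough to build everything else, including the identity combinator I:

```python
# The S and K combinators; all of combinator logic derives from them.
S = lambda f: lambda g: lambda x: f(x)(g(x))
K = lambda x: lambda y: x

I = S(K)(K)  # identity, built without writing a lambda of its own
print(I(42))  # 42
```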

Some other blog also pointed out that one of the big "modern" features tons of people rely on was invented in 1959: Garbage collection.

Say what you will about Lisp (and I'll say lots of good things about it), but practical GC has had tremendous impact. Now, we just have to wait for everything else to catch up to all the other 1960s feature sets (both software & hardware). :-)

To be fair to modern garbage collectors, the algorithm that early Lisp implementations used (mark-and-sweep) was pretty crappy. Basically: stop the program, mark everything reachable, then walk the entire heap and delete everything else. It introduced long pauses at semi-random intervals and did nothing to avoid memory fragmentation. Generational and incremental collectors made GC generally usable. I think both of these showed up in Lisp before any other language, but neither was present in 1959.
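The scheme described above can be sketched in a few lines of Python (a toy stop-the-world mark-and-sweep; all names are illustrative):

```python
class Cell:
    def __init__(self, car=None, cdr=None):
        self.car, self.cdr, self.marked = car, cdr, False

heap = []

def alloc(car=None, cdr=None):
    heap.append(Cell(car, cdr))
    return heap[-1]

def mark(x):
    if isinstance(x, Cell) and not x.marked:
        x.marked = True
        mark(x.car)
        mark(x.cdr)

def collect(roots):
    # "stop the program": mark everything reachable from the roots...
    for r in roots:
        mark(r)
    # ...then walk the entire heap and delete everything else
    heap[:] = [c for c in heap if c.marked]
    for c in heap:
        c.marked = False

a = alloc('a')
b = alloc('b', a)   # b keeps a alive
alloc('garbage')    # unreachable
collect([b])
print(len(heap))  # 2
```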

Lisp is a fascinating language with an honored history in AI, but let me ask you this: is it used now in some important applications? Does modern AI software use Lisp a lot? I am under the impression that it is used more in theory than in applications.

It might seem antiquated and weird to us nowadays, but emacs-lisp is actually fairly typical of the dialects of its day [stanford.edu]. Its day was just the late-70s/early-80s. Scheme and Common Lisp did a lot to modernize Lisp, and they just happened to be the first popular dialects on commodity machines, so it's easy to forget that Lisp predates all computing paradigms and has given them all a shot at one point or another.

Thanks, but I would be more impressed by a list of things familiar to everyone, like somebody pointed out in the other comment - EMACS.

EMACS and AutoCAD. I find it interesting that in the latter, Lisp is offered as a language of extension and customization. Is this the common trend of Lisp usage: a language of extension and customization?

Is this the common trend of Lisp usage: a language of extension and customization?

No, Lua, Scheme, and probably also JavaScript have become more popular for that purpose over the years. LISP is mostly Common Lisp nowadays. It's very complete and standardized, and some CL implementations like SBCL are very fast, but CL is not very well-suited for extension and customization (at least not for lightweight ones). It depends on how you define it, of course; if you include all Scheme dialects and non-standard LISPs out there, LISP is definitely alive and used a lot.

It's not just the language that is important, it's the contributions Lisp made to programming language theory: "if", higher order functions, garbage collection to name a few things. See here [paulgraham.com] for a list of things that the language pioneered.

That is what I meant by honored history. It turns out it's not only AI. "if" - this is truly striking. Do you have any reference about this? http://en.wikipedia.org/wiki/Conditional_(programming)#If-then.28-else.29 [wikipedia.org] does not provide much insight on the history. Do you mean using "if" as the term for a conditional construct? Searching for "if" does not work very well...

Lisp prehistory [stanford.edu] details its invention of the logical IF expression, which conditionally evaluates one side or the other depending on an evaluated result. Fortran featured computed gotos, but they were awkward to use by comparison.
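To make the distinction concrete: the IF McCarthy introduced is an expression that yields a value and evaluates only the chosen branch, a feature most modern languages have since absorbed (Python shown here for illustration):

```python
# A conditional *expression*: it produces a value, and only the
# selected branch is evaluated -- no jumping to numbered statements.
def sign(x):
    return -1 if x < 0 else (0 if x == 0 else 1)

print([sign(v) for v in (-5, 0, 7)])  # [-1, 0, 1]
```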

Lisp is a fascinating language with an honored history in AI, but let me ask you this: is it used now in some important applications? Does modern AI software use Lisp a lot? I am under the impression that it is used more in theory than in applications.

Autodesk's AutoCAD [wikipedia.org] relies on AutoLisp [wikipedia.org] for a lot of its features, and also employs it as a scripting language.

As for AutoCAD being considered an "important application", it is the de facto standard for CAD work in engineering, particularly in civil/structural engineering.

Clojure [clojure.org] is a modern LISP -- I have a former employer using it for real-time analytics work, where its transactional memory model made it easy to scale to very, very parallel machines (the older version of the software, written with traditional lock-based concurrency, fell down at a fraction of the production load we needed to handle, with most CPU cores sitting around waiting for locks).

The biggest thing that interests me, though -- programming in a LISP lends itself to what Rich Hickey calls "hammock-driven development".

Even if you were the creator of BASIC itself, you're in no danger of an imminent natural death. However, you may still want to go into hiding. I hear there are a lot of angry former BASIC programmers out there.

Probably because they were all young and fit when the first programming languages were (invented|discovered|[A-Za-z0-9]+) in the 1950s. That was a long time ago, and people do not live forever. Also, if you are wondering why so many pioneers are dying, it is because the field was new back then; soon programming-language researchers and [A-Za-z0-9]+s will be dying left and right, but nobody will care, because the most well-known work happened decades ago.

I finally decided to buy an iPad and Steve Jobs dies. I started a new project using C and Dennis Ritchie kicks the bucket. Then I started Stanford's AI Course and now John McCarthy is pining for the fjords.

That's it. It's definitive. I'm a God of Death, so I shall use my recently discovered powers for the good of humanity. I'm going out to buy an Oracle DB and learn how to use it. See you at Larry Ellison's funeral next week.

PS: Also, I suspect I'm the God of Rain too, since every time I wash my car it rains the next day.

Nothing will reduce your life expectancy more than doing template metaprogramming in C++. LISP is the king of all computer languages. Its influence is still being felt 50 years after its creation, and people are rediscovering features that good ol' Lisp has had since the beginning.

Calm down. It took overnight for the news of Ritchie's death to make the front page too. If anything, it needs to be verified first. Long-time Slashdot readers may remember some of the hoaxes over people's deaths that made it to the front page immediately, but then had to be retracted. Jamie W. Zawinski dying in a motorcycle crash in the late 90s was one of them. If you want immediate unverified news, use social media.

TechCrunch posted a story based on a single tweet that was linked from Hacker News. It was closer to eight p.m. EDT before there was solid evidence that it was, indeed, quite true. At that point... it seemed more respectful to hold off until the morning rather than posting immediately.

I was serious, although I might've worded it somewhat provocatively. The DART example is impressive, and so is SHINE. I don't, however, find anything detailing why LISP was chosen for any of the projects. Was it because LISP has some inherent advantage for the specific applications, or because the projects were started during a time when everyone was told that LISP was the holy grail of programming and it was just the obvious thing to do?

I don't, however, find anything detailing why LISP was chosen for any of the projects. Was it because LISP has some inherent advantage for the specific applications, or because the projects were started during a time when everyone was told that LISP was the holy grail of programming and it was just the obvious thing to do?

Well, there are a few reasons that come to mind:

AI projects often involve AI researchers, and Lisp was a popular language in AI research.

Lisp macros have the full power of the language -- this allows programmers to do a lot of interesting things with macros, like extending the syntax and lexicon of the language to meet arbitrary needs.

Until recently, Lisp had features that were very rare in more commonly taught programming languages: lambda expressions, lexical closures, multiple dispatch, etc. This is

I know zip about other projects, but I was hacking on Maxima for use in my robotics assignments, and something is to be said for the conciseness of Lisp's way of dealing with data structures. Something more is to be said for macros: the programmatic generation of code (they are nothing like C macros). Of course you can generate code in C, but it's a shitty experience, and you have to roll it all yourself. The C/C++ languages do not come with any sort of data structure to express themselves. Even Python has an ast module.

I've found that programmatic generation of code is a big win in the embedded world, especially on small microcontrollers (RAM in the single kilobytes, etc.). Most platform libraries become quite bloated if you want to truly, fully support all peripherals, even if a typical application only uses a small subset of the functionality. The compilers are usually too stupid to properly optimize it, even when a fairly rudimentary constant propagation would show that 90% of the library is dead code. With macros you can easily generate just the code you need. Macros can easily and cleanly replace external tools like lexer and parser generators. They are also great for implementing extra language features. You don't need hacks like Duff's device [wikipedia.org] or coroutine horkage [greenend.org.uk]. LISP is powerful enough that you can have features like yield implemented in a library [cliki.net].
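Since Python's ast module came up: here is a minimal sketch of the kind of "expansion-time" specialization a Lisp macro gives you for free, done the long way with ast (the register-read routine and all names are hypothetical, purely for illustration):

```python
import ast

class Inline(ast.NodeTransformer):
    """Replace a named constant with a literal at generation time,
    roughly what a Lisp macro would do during expansion."""
    def __init__(self, name, value):
        self.name, self.value = name, value

    def visit_Name(self, node):
        if node.id == self.name:
            return ast.copy_location(ast.Constant(self.value), node)
        return node

# hypothetical peripheral-access routine, specialized before compiling
src = "def read_reg(base):\n    return base + OFFSET\n"
tree = Inline("OFFSET", 0x40).visit(ast.parse(src))
ast.fix_missing_locations(tree)

ns = {}
exec(compile(tree, "<generated>", "exec"), ns)
print(hex(ns["read_reg"](0x1000)))  # 0x1040
```

A Lisp macro does the same transformation with the language's own list syntax instead of a separate tree API, which is the ease-of-use point being made above.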

In the end, it's all about ease of use. Even though I do a lot in C and C++, I detest their verbosity. I mean, come on, the ML family has had type inference for three decades! Heck, I have worked with a structured BASIC running on CP/M on a Z80 that had rudimentary type inference (although it didn't have algebraic types). You didn't have to assign types to your variables, and if you tried adding an integer to a string it would balk -- not at runtime, but before it'd accept the new or modified line of the program! Variables were assigned types at first use, and if you had a function returning a value (yes, it had functions, but sadly no tuples), it knew what type it'd be based on the code inside the function. That was in the late 80s! Then you come to C++ and get to experience template metaprogramming -- sure it's powerful, but it feels about as expressive as programming a Turing machine directly. And metaprograms are interpreted by the compiler, in a very inefficient way.

LISP however is a nightmarish construct made to entertain academics with academic constructs, which it may do exceedingly well, but for practical real-world applications the usefulness of LISP is long gone if it ever existed beyond a rudimentary level.

I steadfastly held the same view: that only academic weenies had any use for Lisp, or even worse, Scheme. It took me some 15 years to see the light, but now I work exclusively in Lisp and Scheme.

Languages are NOT created equal, and the challenges we face now need more powerful languages. That is where Lisp and Scheme come into their own. Next, I think I'll look into Haskell -- another language that I previously wouldn't touch with a ten-foot pole.

All Turing-complete languages are equally capable, eh? You can create abominations and masterpieces in nearly any language.

Programmers, however, tend to work best in a language that suits their unique preferences and abilities. No language is inherently best.