It seems that in the world of programming, something comes along every few years that many believe to be the "silver bullet".

The problem is that we had one. The shift from programming in assembly language to programming with high level languages produced an order-of-magnitude increase in programmer productivity. And since it happened once, there's always the vague hope that it might happen again.

Let's just say that if a new "silver bullet" is on its way now, I sure hope it will be FP - it has marvelous lessons to teach any programmer.

Sorry. We've already been using FP long enough to know that it isn't the next silver bullet. There is a well-defined body of problems which FP handles really well, but there's also a well-known body of problems FP doesn't handle so well.

More to the point, FP hasn't lived up to the vision John Backus stated in 1978, of creating a sort of algebra for software.. a world where we could build ever more complicated programs by hooking existing programs together in well-defined ways. There seems to be an upper limit on how much stuff we can integrate before the result becomes too domain specific to be useful for further integration. There's also a problem with each new level of integration leaving us handcuffed to a new form of data representation, which also limits further combination. They're basically the same problems that kept OOP from being the next silver bullet. There may be a way to crack those barriers, but neither FP nor OOP has done it so far.
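To make the "hooking programs together" idea concrete, here's a toy sketch in Python. The combinator names here are my own choices, not Backus's actual FP notation - just an illustration of building a new program purely by combining existing ones:

```python
# Sketch of an "algebra of software": new programs are built only by
# applying well-defined combinators to existing programs.

def compose(f, g):
    """Combinator: feed g's output into f."""
    return lambda x: f(g(x))

def fmap(f):
    """Combinator: lift f so it acts on every element of a list."""
    return lambda xs: [f(x) for x in xs]

# Two existing "programs"
double = lambda n: n * 2
total = sum

# A new program, defined without writing any new logic of its own
sum_of_doubles = compose(total, fmap(double))

print(sum_of_doubles([1, 2, 3]))  # 12
```

The catch the post describes shows up quickly in practice: `fmap` here is welded to Python lists, so the moment the data representation changes, the combinators have to change with it.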

Doesn't mean they're useless.. far from it. They just aren't the conceptual nuke that will boost us to the next 10x level of productivity.

I forgot to bring your attention to the fact that Church's thesis certainly doesn't say that "all computer languages are equivalent". A more or less formal definition would be "any real-world computation can be translated into an equivalent computation involving a Turing machine".

Okay, I admit to doing a hand-wave with regard to Church's thesis. What actually happened is this:

In 1931, Kurt Gödel defined the theory of mu-recursive functions. It's an elegant theory that explains how to create new functions by combining existing ones. It uses three basic functions: zero(), next() (the successor function), and project() (selecting a specific item from an N-tuple), and three operations: function composition, primitive recursion, and mu-recursion. The set of functions that can be produced by applying those operations to the base functions is called 'the set of general recursive functions'. In mathematical terms, this was basically a Theory of Everything.
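The scheme is concrete enough to sketch directly. Here's a hypothetical Python rendering (the function names and the 0-based tuple indexing are my choices, not Gödel's notation):

```python
# The three base functions
def zero(*args):
    return 0

def succ(n):  # Godel's "next"
    return n + 1

def project(i):
    """project(i) selects the i-th item of an N-tuple (0-based here)."""
    return lambda *args: args[i]

# The three combining operations
def compose(f, *gs):
    """Function composition: f applied to the results of the g's."""
    return lambda *args: f(*(g(*args) for g in gs))

def prim_rec(base, step):
    """Primitive recursion: h(0, x..) = base(x..);
    h(n+1, x..) = step(n, h(n, x..), x..)."""
    def h(n, *args):
        acc = base(*args)
        for i in range(n):
            acc = step(i, acc, *args)
        return acc
    return h

def mu(p):
    """Mu-recursion: the least n for which p(n, x..) returns 0."""
    def h(*args):
        n = 0
        while p(n, *args) != 0:
            n += 1
        return n
    return h

# Addition, built from nothing but the base functions:
# add(0, x) = x; add(n+1, x) = succ(add(n, x))
add = prim_rec(project(0), compose(succ, project(1)))
print(add(3, 4))  # 7

# Mu-recursion example: the least n with n*n >= x
isqrt_up = mu(lambda n, x: 0 if n * n >= x else 1)
print(isqrt_up(10))  # 4
```

The mu operator is the one that introduces unbounded search - and with it, the possibility of non-termination, which is exactly what separates the general recursive functions from the primitive recursive ones.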

In 1933, Alonzo Church published the rules for lambda calculus. In '36, he published a paper{1} where he claimed, though such a claim can't strictly be proved, that the set of functions generated by lambda calculus was actually the set of general recursive functions. About the same time, Kleene{2} and Post{3} published papers approaching the same problem from different angles, and Alan Turing came out with a paper{4} that proved the Turing machine was computationally equivalent to lambda calculus.. or more precisely, the two are interdefinable, meaning you can define either one in terms of the other. That's the standard mathematical way of showing that two systems are fundamentally the same, despite any apparent differences in their crunchy candy shells.

{1} - with the whizzer of a title, _A note on the Entscheidungsproblem_

{2} - _General recursive functions of natural numbers_

{3} - _Finite combinatory processes - formulation 1_

{4} - _On computable numbers, with an application to the Entscheidungsproblem_ -- I think they just liked saying that word.

Church is the one who defined what he called 'effective calculability' (which we call 'computability' today) and linked lambda calculus to the mu-recursive functions. Turing was the one who actually linked Turing machines to lambda calculus. His paper served as the template for bringing all the pieces together into a unified theory of computation.
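To give a feel for how lambda calculus computes anything at all, here's the standard Church-numeral trick, sketched with Python lambdas standing in for raw lambda terms (the decoder function is my addition for readability):

```python
# Church numerals: a number n is represented as the pure function
# "apply f to x, n times". No integers inside, just lambdas.

zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))

# Addition: apply f m times, then n more times
add = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

def to_int(n):
    """Decode a Church numeral by counting how many times f is applied."""
    return n(lambda k: k + 1)(0)

three = succ(succ(succ(zero)))
print(to_int(three))              # 3
print(to_int(add(three)(three)))  # 6
```

Everything here is function application and nothing else, which is the point: arithmetic, and ultimately any general recursive function, can be encoded this way.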

So.. it was wrong of me to say that Church's thesis 'explicitly' links lambda calculus to Turing machines. I might've gotten away with it if I'd said 'Church-Turing thesis', though. "The theory of computability says ..." would definitely have worked.

Ada Lovelace for the palindrome
Albert Einstein for having smelly feet
Alfred Nobel for his contribution to battlefield science
Burkhard Heim for providing the missing link between science and mysticism
Claude Shannon for riding a unicycle at night at MIT
Donald Knuth for being such a great organist
Edward Teller for being the template for Dr. Strangelove
Edwin Hubble for pretending to be a pipe-smoking English gentleman
Erwin Schrödinger for cruelty to cats
Hedy Lamarr for weaponizing pianos
Hugh Everett for immortality, especially for cats
Isaac Newton for his occult studies
Kikunae Ikeda for discovering the secrets of soy sauce
Larry Wall for his website
Louis Camille Maillard for discovering why steaks taste good
Marie Curie for the shiny stuff
Nikola Tesla for the cool cars
Paul Dirac for speaking one word per hour when socializing
Richard Feynman for his bongo skills
Robert Oppenheimer for his in-depth knowledge of the Bhagavad Gita
Rusi P Taleyarkhan for Cold Fusion
Sigmund Freud for his ménage à trois
Theodor W Adorno for his contribution to the reception of jazz
Wilhelm Röntgen for the foundations of body scanners
Yulii Borisovich Khariton for the Tsar Bomba
Other (please explain why)