I was wondering what unique features I can learn from Scheme that would help me become a better programmer?

I have a lot of experience in mainstream languages, and I am looking to expand my horizons and learn about functional aspects that are missing from other languages. I am familiar with closures from JavaScript and lambda expressions from C#, so aside from the Lisp syntax, everything I have seen so far I have already encountered elsewhere. What should I focus on that is genuinely lacking in other languages?

Lisp is worth learning for the profound enlightenment experience you will have when you finally get it; that experience will make you a better programmer for the rest of your days, even if you never actually use Lisp itself a lot. -- Eric Raymond
– Robert Harvey Dec 29 '10 at 22:54

4 Answers

Perhaps the most important defining characteristic of Lisp is "Code as Data." You won't get that experience in quite the same way with any other language. In C#, the closest analogue is expression trees.

It is that quality that makes Lisp an excellent language for parsing. It's also the quality that motivated Paul Graham to say of Lisp: "The unusual thing about Lisp-- in fact, the defining quality of Lisp-- is that it can be written in itself." Although self-hosting compilers are nothing new, no language does it quite as elegantly as Lisp does.
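To make "code as data" concrete, here is a small Scheme sketch. (The second argument to `eval` varies between Scheme standards; `(interaction-environment)` is the R5RS form, and some implementations accept other environment specifiers.)

```scheme
;; An expression is just a list, built with ordinary list operations...
(define expr (list '+ 1 2))      ; the list (+ 1 2), plain data
(car expr)                       ; => the symbol +
(length expr)                    ; => 3

;; ...and that same list can be handed to eval and run as code.
(eval expr (interaction-environment))  ; => 3
```

The same list that `car` and `length` inspect as data is what `eval` executes as code; there is no separate "expression tree" type standing between the two views.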

Metaprogramming, another area in which Lisp excels, is also a worthwhile thing to learn.
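In Scheme, the everyday vehicle for metaprogramming is the macro system: code that rewrites code before it runs. A minimal `syntax-rules` sketch (`my-unless` is a name chosen here purely for illustration):

```scheme
;; my-unless rewrites each of its uses into an `if` at expansion time:
(define-syntax my-unless
  (syntax-rules ()
    ((_ test body ...)
     (if test #f (begin body ...)))))

;; (my-unless (> 1 2) 'yes) expands to (if (> 1 2) #f (begin 'yes))
;; and evaluates to the symbol yes.
```

Because the macro operates on the program as data, you are effectively extending the language itself rather than merely defining a function.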

In computer science and programming, a continuation is an abstract
representation of the control state of a computer program. A
continuation reifies the program control state, i.e. the continuation
is a data structure that represents the computational process at a
given point in the process' execution; the created data structure can
be accessed by the programming language, instead of being hidden in
the runtime environment. It contains information such as the process'
current stack (including all data whose lifetime is within the process
e.g. "local variables"), as well as the process' point in the
computation. An instance of continuation can be later used as a
control structure; upon invocation, it will resume execution from the
control point that it represents. The "current continuation" or
"continuation of the computation step" is the continuation that, from
the perspective of running code, would be derived from the current
point in a program's execution.
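Scheme exposes this idea directly through `call-with-current-continuation` (usually abbreviated `call/cc`). A minimal sketch, using the captured continuation only as an early exit (the simplest way to use it, though far from the most powerful):

```scheme
;; call/cc hands its body a procedure (`return` here) that, when called,
;; resumes the program at the point where call/cc itself was invoked.
(define (first-negative xs)
  (call/cc
    (lambda (return)
      (for-each (lambda (x)
                  (if (negative? x) (return x)))
                xs)
      #f)))                      ; fall-through value if none found

;; (first-negative '(3 1 -4 1 5)) => -4  -- `return` aborts the loop
;; (first-negative '(3 1 4))      => #f
```

Here the continuation is invoked before `call/cc` returns, which mimics an ordinary non-local exit; the distinctive power of first-class continuations comes from storing one and invoking it *later*, re-entering a computation that had already finished.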

In 1963 John McCarthy, the inventor of Lisp, published the paper A
Basis for a Mathematical Theory of Computation in which he proposed
the function (in the computer program sense of the word) amb(.,.). The
idea is that amb(x,y) is first equal to x. But if later in the
computation it is found that this leads to some sort of contradiction
the value of x is retracted and replaced with y. This is a much more
complex business than it may seem to be at first. Retracting a value
essentially means winding back the entire state of the computation to
where it was when amb returned the value x, and then slipping in the
value of y. This means somehow freezing and copying the entire state
when x was first returned. When a contradiction is found the entire
state of the program is discarded and replaced with the frozen version
which is reactivated. These frozen states are known as continuations.
In many ways it's like a GOTO statement on acid. It can cause a jump
to an arbitrary spot in your code. But continuations are nicer than
GOTOs because they are more amenable to logical reasoning.
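A common textbook way to build `amb` in Scheme is on top of `call/cc`. The sketch below is one such construction, not McCarthy's original formulation; the names `amb-list` and `fail` are introduced here for illustration:

```scheme
;; A minimal backtracking amb built on call/cc.  Calling `fail` rewinds
;; the program to the most recent choice point and tries the next value.
(define fail
  (lambda () (error "amb: no more choices")))

(define (amb-list choices)
  (call/cc
    (lambda (k)                      ; k = "the rest of the program"
      (let ((old-fail fail))
        (for-each
          (lambda (choice)
            (call/cc
              (lambda (k2)           ; k2 = resume point inside this loop
                (set! fail
                      (lambda ()
                        (set! fail old-fail)
                        (k2 #f)))    ; on failure, jump back here
                (k choice))))        ; hand this choice to the program
          choices)
        (old-fail)))))               ; all choices exhausted

;; Example: backtrack until an even value is chosen.
;; (let ((x (amb-list '(1 2 3))))
;;   (if (even? x) x (fail)))       ; => 2 (1 is retracted, then 2 accepted)
```

The continuation `k` captured at the choice point is exactly the "frozen state" described above: invoking `fail` discards the computation done since the choice and resumes from the snapshot with the next candidate value.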