On 1/31/2013 5:46 PM, John Doty wrote:
....
>> (RJF WROTE)
>> I wrote a parser for it (in Lisp) some time ago, and in my experience
>>
>> one of the best ways to really learn a language is to "implement" it.
>
>(JD WROTE) That's crazy. Implementing a language teaches you nothing
> about its true strengths and weaknesses. To learn that, you must
>*use* it for real-world problems, not the toy problems of theoretical
> computer science.
You miss the point. Let me try to clarify this. Mathematica consists
of several parts. Among these:
a specialized language and implementation of a graphics package.
an implementation of routines for Solve, Reduce, Eliminate.
an expression "simplifier" e.g. FullSimplify and friends.
an implementation of significance arithmetic (bigfloats).
....
There is also a programming language, which appears to have no separable
name and so it too seems to be called Mathematica. One can imagine
this language stripped of all the application stuff specific to
mathematics. Apparently Stephen Wolfram has thought about how this
could be separately sold, though has refrained, so far as I know,
from actually doing this.
This programming language has features, none of which would be called
theoretical computer science by a computer scientist.
Of these features there are many that the vast majority
of users are either unaware of, or misunderstand. This is
probably typical for programming languages and naive users.
In addition to the mysteries of the math application and its implementation,
some of Mathematica's LANGUAGE features are non-obvious and
the misunderstanding is promoted by overly-simplified "explanations"
in the documentation. In some cases the obvious and plausible
understanding for an expert in programming languages, based on
an understanding of what other languages do, is wrong. Now such an
expert might venture to say that such "features" were bugs and should
be corrected.
But one learns that sometimes putting a hat and a
beard on a bug makes it into a feature.
As an example, compare Module and Block. Do you think you
really understand Hold, Defer, Evaluate, UpSetDelayed? Do
you think that I do, after implementing them? (Actually, I did
not implement Defer, which was introduced in version 5.0)
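To make the Module/Block contrast concrete, here is a small sketch from memory (exact output forms may differ by version; treat it as illustrative, not authoritative):

```mathematica
x = 5;
f[] := x + 1;

Block[{x = 10}, f[]]   (* 11: Block temporarily rebinds the VALUE of the
                          global symbol x, so f[] sees x = 10 inside *)
Module[{x = 10}, f[]]  (* 6: Module makes a fresh symbol (x$nnn); the
                          global x that f[] reads is still 5 *)

Hold[1 + 1]            (* stays as Hold[1 + 1]: evaluation suppressed *)
Hold[Evaluate[1 + 1]]  (* Hold[2]: Evaluate punches through the Hold *)
```

That is, Block is dynamic scoping and Module is (approximately) lexical scoping by renaming, which is exactly the kind of distinction that does not show up until your code depends on it.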
For purposes of discussion here I would
include in the programming language the surface syntax and internal
representation of programs, the fundamental parts of naming, binding,
function evaluation, matching, and integer arithmetic.
I expect that few programmers, even if they have written pages of code,
understand all the binding rules and evaluation orders for Rules and
Functions and matching and such.
Often it does not matter if you have an unlimited store of different
names, and never use the same name twice. But sometimes it does matter,
and people write to this newsgroup with mysterious code.
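One sample of the kind of mystery I mean, again sketched from memory: ReplaceAll (/.) is purely structural and knows nothing about the scoping of Function, so a rule can "capture" a bound variable:

```mathematica
(* The rule replaces the free symbol a with x, which collides with
   the Function's bound variable x -- classic variable capture. *)
Function[x, x + a] /. a -> x
(* result: Function[x, x + x], effectively 2 x,
   not "x plus some external x" *)

(* Renaming the bound variable avoids the collision: *)
Function[x1, x1 + a] /. a -> x
(* result: Function[x1, x1 + x] -- the intended meaning *)
```

If you never reuse names, you never trip over this; reuse a name once and your program silently changes meaning.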
>
>
<snip>
[RJF reports a bug, in version 8, not version 9.]
>>
>> returns False.
>
> (JD writes)
> Discovering and reporting this kind of bug is useful, but you're still in the realm of toy problems.
> Such bugs exist in many useful codes, and are only a minor source of error.
Well, there are three ways of reading this.
1. You are unlikely to encounter this bug.
or
2. If you encounter this bug, it will bother you only in a
minor way.
or
3. Most programs have bugs and it doesn't matter.
regarding 1. Read about the Therac-25, in which a bad software
design was implemented. It hardly affected anyone who was treated
with that X-ray machine. It only killed 3 people.
regarding 2. Read about how arithmetic failure, specifically
the conversion of a 64-bit floating point quantity to a 16-bit
signed integer caused the crash of an Ariane 5 rocket, losing
$500 million. Fortunately unmanned.
For more examples, see
http://www.ima.umn.edu/~arnold/disasters/disasters.html
regarding 3. Programming languages with mysterious and undocumented
semantics (as well as poor debugging features) are likely to make
validation more difficult than otherwise.
>
> SPICE3 has been around for about a quarter of a century, and has been
> critical to billions of dollars of electronic design efforts. Nevertheless,
> I found and reported a bug in a SPICE3 variant last week. "All
> non-trivial software has bugs".
This does not seem to me to justify either a bad programming language
design or bugs in Mathematica.
SPICE was, incidentally, developed at Berkeley in my department.
>
>> (RJF) If I were using a computer to do something that required correct answers
>> for, say, life safety, like building a bridge, I would follow WRI's
>> advice and not use Mathematica.
>
> (JD) I use Mathematica in the creation of designs for space flight hardware.
> But, of course, I don't *only* use Mathematica. It's most useful for
> exploring ideas ahead of detailed analysis with more specialized software.
> But in my business counting on unverified calculation, regardless of
> the source, is asking for trouble.
If you find Mathematica useful, fine. If you were using it to (for
example) generate code for real-time embedded processors for space
flight controllers, I would hope you would be very aware of the Ariane 5
and similar disasters.
There is, however, an underlying issue here: the assumption that it is
somehow OK for Mathematica to have bugs because its results would be
independently verified, when it matters.
Imagine someone doing some speculative computation in Mathematica, and
exploring some actually non-existent physical phenomenon which was
predicted because of numerical errors.
Perhaps you are familiar with such controversies regarding computer
simulations and Reynolds numbers.
This is not my area of expertise, but is perhaps within yours.
I just googled and found
"Computing high-Reynolds-number turbulence: will simulations ever
replace experiments?"
http://torroja.dmt.upm.es/pubs/2003/jimenez_jot_2003.pdf
This is pretty far afield from the original question, which I
think was something like: is Mathematica somehow Lisp-like... should I learn Lisp...
Anyone can obviously use whatever tools float their boat/rocket ship.
To make a plausible case that you are using an especially
good "programming language" and should/should not learn another
seems to call for some comparative evaluation from people who
have appropriate expertise (in programming languages).
Unless the original poster was
also in the space-flight business, your opinion may not be
what he is looking for. I don't know for sure what his
field is, but I expect ... It's not rocket science :)
RJF