The aliens are vague about the location of the message (it might be in pi), so the Foster character runs software to search for it. Right at the end of the book her program finds a pattern: a circle drawn in 1s and 0s in an 11-by-11 matrix. This pulls together the book's running thread about science versus religion. It turns out that somebody made the universe after all, and the Christians had been (sort of) right all along, though the scientists were right to demand evidence.

I love both the book and film. That's unusual for me. The Postman was a fantastic book. Don't get me started on the movie.

I often put the DVD of Contact on just to watch the sequence where Foster's character first hears the signal and her crew reconfigure the telescope to analyse it. It's a classic tech scene.

If you want to prove that all the digits are correct, you only have to check two things:

1. There is a sound mathematical proof that the algorithm used does in fact generate the digits of pi, and
2. The algorithm was coded correctly. This should be even easier to check, though likely more tedious.
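Both checks can be exercised in miniature. Here's a sketch (not the record-setting algorithm — real record runs use faster series like Chudnovsky's) that computes pi via Machin's arctangent formula in fixed-point integer arithmetic and compares the result against independently published digits, which is check 1 (the formula has a classical proof) plus check 2 (the code agrees with a known-good reference) on a small scale:

```python
# Machin's formula: pi/4 = 4*arctan(1/5) - arctan(1/239).
# Fixed point: every value is an integer scaled by 10**(digits + guard).

def arctan_inv(x, one):
    """arctan(1/x) scaled by `one`, via the alternating Taylor series."""
    power = one // x          # (1/x) * one
    total = power
    x2 = x * x
    divisor = 1
    sign = 1
    while True:
        power //= x2          # next odd power of 1/x
        divisor += 2
        sign = -sign
        term = power // divisor
        if term == 0:
            break
        total += sign * term
    return total

def pi_digits(n, guard=10):
    """First n decimal digits of pi (including the leading 3) as a string.
    The guard digits absorb accumulated truncation error."""
    one = 10 ** (n + guard)
    pi_scaled = 4 * (4 * arctan_inv(5, one) - arctan_inv(239, one))
    return str(pi_scaled)[:n]

# Cross-check against published digits -- a miniature of check 2:
assert pi_digits(50) == "31415926535897932384626433832795028841971693993751"
```

The same principle scales up: the record holders verify their output by recomputing a slice of the digits with a second, mathematically independent algorithm.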

Now, what it's good for is a little harder. There is no physical application for such a highly accurate value of pi (39 digits are sufficient to calculate the circumference of the known universe, given its radius, to within the diameter of a hydrogen atom). However, large numbers of digits of pi are useful in number theory, statistics, and information theory. For instance, there is no proof that pi is a normal number [wikipedia.org], but as more digits of pi are found and their statistical properties are analyzed and shown to be consistent with the definition of a normal number, the conjecture that pi is actually normal gains support (see experimental mathematics [wikipedia.org]).
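The kind of statistical check this refers to can be sketched in a few lines: test whether each decimal digit appears with roughly equal frequency, as normality would require in the limit. This toy version uses only the first 100 decimal places; real analyses run over billions or trillions of digits:

```python
from collections import Counter

def digit_chi_square(digits):
    """Chi-square statistic for the hypothesis that each decimal digit
    occurs with probability 1/10 (what a normal number shows in the limit).
    Large values suggest a biased digit distribution."""
    counts = Counter(digits)
    expected = len(digits) / 10
    return sum((counts.get(d, 0) - expected) ** 2 / expected
               for d in "0123456789")

# First 100 decimal places of pi (after the "3.").
PI_100 = ("1415926535897932384626433832795028841971693993751"
          "058209749445923078164062862089986280348253421170679")

print(digit_chi_square(PI_100))
```

With 9 degrees of freedom the statistic has mean 9 under the uniform hypothesis, so the value here (4.2) is unremarkable — consistent with, though of course nowhere near proving, normality.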

But don't we have algorithms which let us calculate pi to an arbitrary number of digits?

Yes, we do. Mathematical algorithms, i.e., equations on paper.

Well-known series methods computed using algorithms which have been tuned and re-tuned to the point where it's not really possible to make further major computational optimizations?

Absolutely not. The algorithms have to run on practical, exists-on-the-Earth-today computers. Try multiplying two numbers of a million digits each on your laptop and you'll see what I mean. These achievements are all about computational optimizations. RTFA -- especially the sections entitled "Arithmetic Algorithms" and "Maximizing Scalability." Even the algorithm used for multiplication changes (dynamically!) during the program's execution, based on the size of the operands.
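A toy version of that size-dependent algorithm switching (assuming nothing about the article's actual code): Karatsuba multiplication that falls back to direct hardware multiplication below a cutoff. Real record programs take this much further, moving to FFT-based methods for huge operands:

```python
def karatsuba(x, y, cutoff_bits=64):
    """Multiply non-negative integers, choosing the algorithm by operand
    size: at or below the cutoff, use the machine multiplier directly;
    above it, recurse with Karatsuba's three-multiplication split."""
    if x.bit_length() <= cutoff_bits or y.bit_length() <= cutoff_bits:
        return x * y                        # base case: plain multiply
    m = max(x.bit_length(), y.bit_length()) // 2
    xh, xl = x >> m, x & ((1 << m) - 1)     # split x = xh*2^m + xl
    yh, yl = y >> m, y & ((1 << m) - 1)     # split y = yh*2^m + yl
    a = karatsuba(xh, yh, cutoff_bits)
    b = karatsuba(xl, yl, cutoff_bits)
    c = karatsuba(xh + xl, yh + yl, cutoff_bits) - a - b  # cross terms
    return (a << (2 * m)) + (c << m) + b
```

Karatsuba does three recursive multiplications where the schoolbook method needs four, cutting the cost from O(n^2) to about O(n^1.58) — but below the cutoff the recursion overhead outweighs the savings, hence the dynamic switch.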

Therefore this isn't so much a new accomplishment as it is "hey look, I left my pi calculating program running longer than the last guy" modified by the occasional minor optimization tweak and running on faster hardware?

Not even close. The computations are so long, and so intense, that errors caused by hardware imperfections can be expected, so error detection and correction algorithms have to be added. If you just left your pi-calculating program running longer than the last guy, it would not produce the correct result -- even if the data structures and algorithms it used were up to the task.
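One standard trick for catching such errors — a sketch of the general idea, not necessarily this project's exact method, though pi-record software does use modular checks in this spirit — is to verify a huge multiplication by comparing residues modulo a few small primes, which costs almost nothing relative to the multiplication itself:

```python
def product_checks_out(a, b, claimed, primes=(10**9 + 7, 10**9 + 9)):
    """Cheap consistency check: if claimed == a*b, then for every prime p,
    claimed mod p must equal (a mod p)*(b mod p) mod p. A stray bit flip
    in `claimed` is overwhelmingly likely to break at least one residue."""
    return all((a % p) * (b % p) % p == claimed % p for p in primes)
```

The check can't prove the product is right (a wrong answer congruent to the true one modulo every prime would slip through), but it catches nearly all hardware-induced corruption for a tiny fraction of the compute cost.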

But is it really, really something that's newsworthy?

In a word, yes. Could you do it? It's a very, very difficult technical feat, one that required hardware powers and software abilities far beyond those of mortal men. Besides, you're worried about newsworthiness when the two previous /. articles are on wall-climbing robots and the popularity of video game arcades in New York?

And if the hypothetical "needs pi to 5 trillion digits" guy needed that precision so badly, wouldn't he have let the calculation run long enough to get it already, given that this particular calculation only took 90 days?

This isn't about needing pi to 5 trillion digits. This is about learning how to do large computations faster. Like, improving the state of the art.