
Do they verify these numbers somehow?
Anyone can write down a series of numbers and claim it's a specific sequence.

Not saying these numbers aren't correct, just a thought.

Perhaps this is why you should read the article. The press release [bellard.org] answers this question directly.

The binary result was verified with a formula found by the author with the Bailey-Borwein-Plouffe algorithm which directly gives the n'th hexadecimal digits of Pi. With this algorithm, the last 50 hexadecimal digits of the binary result were checked. A checksum modulo a 64 bit prime number done in the last multiplication of the Chudnovsky formula evaluation ensured a negligible probability of error.

The conversion from binary to base 10 was verified with a checksum modulo a 64 bit prime number.
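For the curious, here is a minimal Python sketch of the BBP digit-extraction idea described above. It is not Bellard's actual verification code, and ordinary double precision limits it to a handful of hex digits at a time at modest positions; a record-scale check needs arbitrary-precision arithmetic.

    # Sketch of BBP digit extraction: hex digits of pi starting at
    # position n (1-based, after the point), without computing any
    # earlier digits. pow(b, e, m) is Python's built-in modular power.
    def bbp_pi_hex(n, digits=8):
        def series(j, m):
            # frac( sum_{k=0}^{m-1} 16^(m-k) / (8k+j) ), using modular
            # exponentiation so the intermediate numbers stay small
            s = 0.0
            for k in range(m):
                denom = 8 * k + j
                s = (s + pow(16, m - k, denom) / denom) % 1.0
            # tail k >= m: terms shrink by a factor of 16 each step
            k = m
            while True:
                term = 16.0 ** (m - k) / (8 * k + j)
                if term < 1e-17:
                    return s
                s = (s + term) % 1.0
                k += 1
        m = n - 1  # shift pi by 16^m so digit n lands just after the point
        # BBP: pi = sum_k 16^-k * (4/(8k+1) - 2/(8k+4) - 1/(8k+5) - 1/(8k+6))
        x = (4 * series(1, m) - 2 * series(4, m)
             - series(5, m) - series(6, m)) % 1.0
        out = ""
        for _ in range(digits):
            d = int(x * 16)  # next hex digit
            out += "%x" % d
            x = x * 16 - d
        return out

    print(bbp_pi_hex(1))  # "243f6a88" -- pi is 3.243f6a88... in hex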

Honestly, knowing how to calculate the nth digit of Pi is itself of pretty questionable usefulness.

The observable universe is about 93 billion light years across, a diameter of roughly 8.8 * 10^26 meters. If we take a ring of atoms spaced about 1 Angstrom (10^-10 meters) apart, with a diameter the size of the observable universe, and want to determine the circumference of the resulting circle, then knowing Pi to 40 or so places is sufficient that the error caused by the finite spacing of the atoms themselves is greater than the error introduced by using the truncated value of Pi. Knowing Pi to 40 or so places is also sufficient to resolve the difference between the circumferences of the inner and outer edges of the ring.

Knowing Pi to 40 places is basically sufficient for describing our entire universe and anything you could put into it. We've known the first 35 for four hundred years, and we've never needed that much information to describe our universe.
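A quick sanity check of that arithmetic in plain Python (the figures are the rough ones from the post above, so treat this as order-of-magnitude only):

    # Error in the circumference C = pi * d caused by truncating pi
    # after ~40 decimal places, versus the atomic spacing of the ring.
    diameter = 8.8e26        # observable universe, meters (~93 Gly)
    atom_spacing = 1e-10     # one Angstrom
    pi_error = 1e-40         # pi truncated to ~40 decimal places

    circumference_error = diameter * pi_error
    print(circumference_error)                 # ~8.8e-14 m
    print(circumference_error < atom_spacing)  # True, by over three orders of magnitude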

There is a software package called SageMath (available for Linux, among other systems) where you can get a lot of digits in your constants. For example, to accurately calculate the circumference of a circular table with diameter = 1000 mm you could type:

1000*pi.n(digits=1000000)

All you need now is a decent measuring tape...

These two also work, in case you're worried about not getting good enough accuracy when you calculate Fourier coefficients or something:

(pi.n(digits=1000000))^2
(pi.n(digits=1000000))^3

Since Sage sets up a web server on your computer, you can even do this inside a decent phone web browser, so you can get that precision out in the field, where you need it. :-)

The implementation (compiled or uncompiled) is itself an algorithm which can equally be checked: the language follows pre-defined logical rules which can act as axioms, and depending on the details of the algorithm it may be trivial to just use induction.

It's not like we're checking a full operating system or office suite here, so size isn't a restrictive problem in such a proof.

It may be that the processor itself hasn't been verified, so the results of executing that algorithm might not be correct either; but again, when it's the algorithm that matters, who cares? We know the specification of the language, which can effectively act as a set of axioms in a proof. The compiler may well not be verified either, but as long as the algorithm (both the mathematics and the implementation) is correct, that is what matters.

It is then down to anyone using the algorithm to ensure the other layers are correct enough for their purposes.
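In practice the computation guards against exactly those lower-layer errors with the checksum modulo a 64-bit prime mentioned in the press release. Here is a minimal Python sketch of the idea; the prime and the wrapper function are illustrative, not Bellard's actual code:

    # If a huge multiplication c = a * b went wrong anywhere (bad RAM,
    # a buggy FFT, a miscompiled routine), the identity
    #     c mod p == ((a mod p) * (b mod p)) mod p
    # fails for a random error with probability about 1 - 1/p.
    import random

    P = (1 << 61) - 1  # 2^61 - 1, a Mersenne prime; any prime near 64 bits works

    def checked_mul(a, b):
        c = a * b  # stand-in for the big multi-precision multiplication
        if c % P != ((a % P) * (b % P)) % P:
            raise ArithmeticError("multiplication failed modular checksum")
        return c

    a = random.getrandbits(100000)
    b = random.getrandbits(100000)
    c = checked_mul(a, b)  # a silently corrupted c would almost surely be caught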

Hmm, for such a record attempt, do you actually have to calculate all these earlier digits? They're already known. Can anyone prove the computer calculated the already known digits first (instead of getting them from a table) before finally getting to the 120 million new ones?

Around 1960 the tree-ring data, which until then had correlated quite well with the instrumental climate record, started to diverge, likely due to acid rain and other air pollution. It was therefore known that tree-ring data past that point was no good for inferring climate, so they replaced it with data known to reflect the climate accurately. That's all the "hide the decline" was about.

Or to put it another way: the pollution is already so bad that it screws up the data collection, and that is now being used as an argument that nothing is wrong. Way to go, logic...

Verification is actually quite easy, due to the (totally unexpected at the time!) discovery of the Bailey-Borwein-Plouffe [wikipedia.org] algorithm, which allows you to directly calculate the N'th hexadecimal digit of pi without having to determine any other digits.

He used this on a bunch of the last digits and they all came out correct, which makes the probability of an error extremely small.

Last year I used his algorithm to calculate 1e9 (i.e. a US billion) digits of pi and made them searchable:

If a collection of random digits is large enough, any fixed pattern you care to name (a hundred thousand zeros in a row, or the collected works of Shakespeare encoded as digits) will almost surely appear somewhere; the catch is how large "large enough" has to be. You can, however, calculate how likely a particular pattern is to turn up in a collection of a given size.
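To put rough numbers on that, assuming independent uniform digits and ignoring pattern self-overlap (which only matters for patterns like a long run of identical digits):

    # Probability that a fixed k-digit pattern appears somewhere in n
    # random decimal digits: each of the n-k+1 starting positions matches
    # with probability 10^-k, treated as approximately independent.
    import math

    def prob_pattern_found(n, k):
        return 1.0 - math.exp((n - k + 1) * math.log1p(-10.0 ** -k))

    print(prob_pattern_found(10**9, 8))   # 8-digit pattern in 1e9 digits: ~0.99995
    print(prob_pattern_found(10**9, 12))  # 12-digit pattern: ~0.001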