Google's Systems Infrastructure Team has announced that it sorted 1 terabyte of data across 1,000 computers in a mere 68 seconds. (The previous record for sorting 1 TB, set on 910 computers, stood at 209 seconds.) Sorting one petabyte took Google 6 hours and 2 minutes, with the sorted output written to 48,000 hard drives.
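As a back-of-envelope check on those figures (assuming decimal units, i.e. 1 TB = 10^12 bytes and 1 PB = 10^15 bytes, since the article does not specify the convention), the reported times imply the following aggregate throughput:

```python
# Rough throughput implied by the reported sort times.
# Assumes decimal units (1 TB = 10**12 bytes, 1 PB = 10**15 bytes).
TB = 10**12
PB = 10**15

tb_seconds = 68
machines = 1000
tb_throughput = TB / tb_seconds          # aggregate bytes/second for the 1 TB sort
per_machine = tb_throughput / machines   # bytes/second handled per computer

pb_seconds = 6 * 3600 + 2 * 60           # 6 hours 2 minutes
pb_throughput = PB / pb_seconds          # aggregate bytes/second for the 1 PB sort

print(f"1 TB sort: {tb_throughput / 1e9:.1f} GB/s aggregate, "
      f"{per_machine / 1e6:.1f} MB/s per machine")
print(f"1 PB sort: {pb_throughput / 1e9:.1f} GB/s aggregate")
```

That works out to roughly 14.7 GB/s aggregate for the terabyte run (about 14.7 MB/s per machine) and about 46 GB/s aggregate for the petabyte run.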

Google also acknowledges that the run used triple redundancy: each piece of data was backed up on three hard drives. Would reducing that redundancy make the whole thing much quicker? Not necessarily, since storing the output may not have been part of the timed sort (and the writes could still have been in progress after the sort itself completed).

According to the Google Systems Infrastructure Team, the sorting experiments followed the rules of a standard terabyte (TB) sort benchmark. Standardized experiments help engineers understand and compare the benefits of various technologies, and they also add a competitive spirit; you can think of it as an Olympic event for computation. By pushing the boundaries of these kinds of programs, one learns about the limitations of current technologies as well as lessons useful in designing next-generation computing platforms. This, in turn, should help everyone get faster access to higher-quality information.
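Large-scale sort benchmarks of this kind are typically run as a range-partitioned sort: sample the keys, split the key space among machines, route each record to the machine owning its range, and let each machine sort locally, so the concatenated outputs are globally sorted with no final merge. The sketch below is a single-process toy illustration of that idea (the function name and structure are illustrative, not Google's actual implementation):

```python
import random

def range_partition_sort(records, num_workers, sample_size=100):
    """Toy sketch of a range-partitioned distributed sort: sample keys to
    pick split points, bucket records by key range, then sort each bucket
    independently (in a real system, each bucket lives on its own machine
    and the local sorts run in parallel)."""
    sample = sorted(random.sample(records, min(sample_size, len(records))))
    # Pick num_workers - 1 split points from the sorted sample.
    splits = [sample[i * len(sample) // num_workers]
              for i in range(1, num_workers)]
    buckets = [[] for _ in range(num_workers)]
    for r in records:
        # Linear scan for the owning bucket, kept simple for clarity;
        # a real system would binary-search the split points.
        i = 0
        while i < len(splits) and r >= splits[i]:
            i += 1
        buckets[i].append(r)
    out = []
    for b in buckets:
        out.extend(sorted(b))  # each "worker" sorts only its own range
    return out
```

Because every record in bucket i is smaller than every record in bucket i+1, concatenating the locally sorted buckets yields the globally sorted result, which is what makes the approach scale across a thousand machines.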