Good news! A computer has cracked a key piece of the longstanding Erdős discrepancy problem! Trouble is, we have no idea what it's talking about — because the solution, which is as long as all of Wikipedia's pages combined, is far too voluminous for us puny humans to confirm.

A few years ago, the mathematician Steven Strogatz predicted that it wouldn't be too much longer before computer-assisted solutions to math problems would be beyond human comprehension. Well, we're pretty much there. In this case, it's an answer produced by a computer that was hammering away at the Erdős discrepancy problem.

In the early 1930s, the mathematician Paul Erdős imagined an arbitrary infinite sequence containing nothing but +1s and -1s. He wanted to know the extent to which such sequences must contain internal patterns. One approach involves cutting the infinite sequence off at a certain point, then forming finite sub-sequences inside it by taking every second number, every third, every fourth, and so on. Summing the entries of each such sub-sequence gives a total; the discrepancy is the largest absolute value those totals reach.
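To make the definition concrete, here's a minimal sketch in Python; the function name and the sample sequence are my own choices, purely for illustration:

```python
def discrepancy(seq):
    """Largest |partial sum| over sub-sequences taking every d-th entry.

    For each step size d, walk the d-th, 2d-th, 3d-th, ... entries,
    keeping a running sum, and record the biggest absolute value seen.
    """
    worst = 0
    for d in range(1, len(seq) + 1):
        partial = 0
        for i in range(d, len(seq) + 1, d):
            partial += seq[i - 1]   # seq is 0-indexed; positions are 1-indexed
            worst = max(worst, abs(partial))
    return worst

print(discrepancy([1, -1, 1, 1, -1, 1]))  # → 2
```

Here the step-size-1 sub-sequence already drifts to a running sum of 2 (after the fourth entry), so the discrepancy is 2. Erdős's question is about how large this number is forced to become as sequences grow.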


New Scientist also has an excellent overview of the problem and an account of what happened after two mathematicians set a computer to the task. And no, the computer didn't come up with the answer "42" — but rather a solution just as inscrutable, and dreadfully long.

Erdős thought that in any such infinite sequence it would always be possible to find a finite sub-sequence whose sum is larger, in absolute value, than any number you choose, but he couldn't prove it.

It is relatively easy to show by hand that any way you arrange 12 pluses and minuses always yields a sub-sequence whose sum exceeds 1. That means that anything longer – including any infinite sequence – must also have a discrepancy greater than 1. But extending this method to show that higher discrepancies must always exist is tough, as the number of possible sub-sequences to test quickly balloons.
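That length-12 claim is small enough to check exhaustively on a machine. Here's a minimal sketch in Python — the `discrepancy` helper is my own naming, not anything from the actual proof — brute-forcing all 4,096 arrangements of length 12:

```python
from itertools import product

def discrepancy(seq):
    """Largest |partial sum| over sub-sequences taking every d-th entry."""
    worst = 0
    for d in range(1, len(seq) + 1):
        partial = 0
        for i in range(d, len(seq) + 1, d):
            partial += seq[i - 1]
            worst = max(worst, abs(partial))
    return worst

# Every one of the 2^12 = 4096 arrangements of 12 pluses and minuses
# has some sub-sequence whose running sum goes beyond ±1 ...
assert all(discrepancy(s) >= 2 for s in product((1, -1), repeat=12))

# ... while at length 11, some arrangements still hold the line at 1.
assert any(discrepancy(s) == 1 for s in product((1, -1), repeat=11))
```

The second assertion shows why 12 is the magic cut-off for discrepancy 1: eleven is the longest a ±1 sequence can stay that well-balanced. The ballooning is visible too — for the discrepancy-2 cut-off of 1161, the same brute force would face 2^1161 sequences, which is why Konev and Lisitsa needed cleverer machinery.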

Now [Boris] Konev and [Alexei] Lisitsa have used a computer to move things on. They have shown that an infinite sequence will always have a discrepancy larger than 2. In this case the cut-off was a sequence of length 1161, rather than 12. Establishing this took a computer nearly 6 hours and generated a 13-gigabyte file detailing its working.

The pair compare this to the size of Wikipedia, the text of which is a 10-gigabyte download. It is probably the longest proof ever: it dwarfs another famously huge proof, the classification of finite simple groups, which involves some 15,000 pages of calculations.

It would take years to check the computer's working – and extending the method to check for yet higher discrepancies might easily produce proofs that are simply too long to be checked by humans. But that raises an interesting philosophical question, says Lisitsa: can a proof really be accepted if no human reads it?

Interestingly, it may not be necessary for humans to check it. As Gil Kalai of the Hebrew University of Jerusalem, Israel, has noted, if another computer program using a different method comes up with the same result, then the proof is probably right.