rand() - rand() Likely to be Zero?

This is a weird bit of programming logic that I randomly found in my Artificial Intelligence for Games book:

Apparently, the following function:


float randomBinomial() {
    return random() - random();
}

... where random() returns a value between 0 and 1, is more likely to return values near 0 than values near -1 or 1. (That is, randomBinomial's results cluster around 0.) No comment is made about the implementation of random()'s generator; it's implied that this holds for any uniform generator. Why would it return values closer to 0?