Variance: why squared?

Using the example of a perfect die (from Wikipedia:Variance), the "expected" value is 3.5 (makes sense), the expected (average) absolute deviation is 1.5 (okay), and then to get the variance, you square each deviation from the mean and average those squares.

My question is, what does this "variance" represent, intuitively, and what useful information does the squaring operation generate?
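For concreteness, the three quantities from the die example can be computed directly (a quick sketch, using only the six faces of a fair die):

```python
# Expected value, mean absolute deviation, and variance
# for a fair six-sided die.
faces = [1, 2, 3, 4, 5, 6]

mean = sum(faces) / len(faces)                          # 3.5
mad = sum(abs(x - mean) for x in faces) / len(faces)    # 1.5
var = sum((x - mean) ** 2 for x in faces) / len(faces)  # 35/12 ≈ 2.917

print(mean, mad, var)
```

Note that the variance (≈ 2.92) is not the square of the absolute deviation (1.5² = 2.25): squaring before averaging weights the faces far from 3.5 more heavily.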

The variance is a measure of spread just like the absolute deviation - typically when using the variance in this way, you take the square root so that it's on the same scale as the absolute deviation. The fact that it is the dominant measure of spread comes from the fact that it is mathematically "nice". In particular, the function $\displaystyle f(x) = x^2$ is differentiable, whereas the function $\displaystyle g(x)=|x|$ is not. That's just one example; there are many others.
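One concrete payoff of that differentiability: the point $c$ minimizing $\sum (x_i - c)^2$ can be found by setting the derivative to zero, and it is the mean; the minimizer of $\sum |x_i - c|$ is the median, which has no such closed-form calculus route. A brute-force sketch (with a small invented dataset) shows the two minimizers:

```python
# Brute-force comparison: minimizing squared vs absolute deviations.
# The dataset is invented for illustration.
data = [1, 2, 2, 3, 10]

def sq_loss(c):
    return sum((x - c) ** 2 for x in data)

def abs_loss(c):
    return sum(abs(x - c) for x in data)

# Search over a fine grid of candidate centers.
grid = [i / 100 for i in range(0, 1101)]
best_sq = min(grid, key=sq_loss)
best_abs = min(grid, key=abs_loss)

print(best_sq)   # 3.6 — the mean of data
print(best_abs)  # 2.0 — the median of data
```

Notice how the far-away value 10 drags the squared-loss minimizer (the mean) toward it, while the absolute-loss minimizer (the median) stays put.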

As far as what useful information the squaring operation generates: essentially, your variance is inflated if you have observations that are particularly far away from the mean, even if other observations are very close. Squaring deviations causes outliers to carry more weight in the resulting measure of spread. This is often considered desirable: from the perspective of a statistician, we would prefer estimators that are close to the thing we are trying to estimate, and squaring the error is a way of penalizing estimators that are frequently far away.
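You can see this extra weight directly by asking what share of the total spread a single outlier contributes under each measure (a sketch with invented data):

```python
# One far-away point's share of the total spread, measured two ways.
# The dataset is invented for illustration; 50 is the outlier.
data = [1, 2, 3, 4, 50]
m = sum(data) / len(data)                 # mean = 12

abs_devs = [abs(x - m) for x in data]
sq_devs = [(x - m) ** 2 for x in data]

share_abs = abs_devs[-1] / sum(abs_devs)  # outlier's share of sum |dev|
share_sq = sq_devs[-1] / sum(sq_devs)     # outlier's share of sum dev^2

print(share_abs)  # 0.5   (half of the absolute deviations)
print(share_sq)   # ≈ 0.8 (about 80% of the squared deviations)
```

Under absolute deviation the outlier accounts for half the total; under squared deviation it dominates, which is exactly the extra penalty described above.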

Other justifications for being interested in the variance abound. It has a way of showing up in formulas and theorems for things we care about (in particular, the Central Limit Theorem requires finite variance). From the perspective of mathematical statistics and probability it pops up frequently. It would be wrong, though, to say that it's "better" than absolute deviation as a measure of spread. What is appropriate to care about depends on the situation, but keep in mind that the absolute deviation is used very rarely for its own sake and has little practical value, while the opposite is true of the variance and the standard deviation (the square root of the variance).
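To make the Central Limit Theorem connection concrete, here is a small simulation sketch (parameters chosen arbitrarily): averages of many die rolls cluster around 3.5, and the spread of those averages shrinks like $\sqrt{\sigma^2/n}$, where $\sigma^2 = 35/12$ is the variance of a single roll.

```python
# Simulation sketch: sample means of fair-die rolls concentrate
# around 3.5 with standard deviation ~ sqrt(var / n).
import random

random.seed(0)
n, trials = 1000, 2000
die_var = 35 / 12  # variance of one fair die roll

means = [sum(random.randint(1, 6) for _ in range(n)) / n
         for _ in range(trials)]

grand_mean = sum(means) / trials
spread = (sum((m - grand_mean) ** 2 for m in means) / trials) ** 0.5

print(grand_mean)  # close to 3.5
print(spread)      # close to (die_var / n) ** 0.5 ≈ 0.054
```

The variance is exactly the quantity that controls how fast this concentration happens, which is one reason it keeps appearing in theorems.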