I was working on a project not long ago where the original developer used Timers to drive his animations and to schedule events. The result was pretty horrific when the app ran on a slower computer: all the timings were off, and events fired late, not at all, or, in some cases, even early. After a little research I found this article: http://www.bit-101.com/blog/?p=910.

In this article, Keith Peters demonstrates that the common assumption behind Timers — that “a millisecond is a millisecond” — is a misconception. A lot of people rely on that assumption to justify building their applications on top of Timers.
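You can see the problem for yourself with a few lines of ActionScript 3. The sketch below (assuming it runs in the context of a document class or frame script) requests a 100 ms tick and logs how long each tick actually took; on a busy or underpowered machine the measured interval routinely drifts away from the requested one.

```actionscript
import flash.utils.Timer;
import flash.utils.getTimer;
import flash.events.TimerEvent;

// Ask for a tick every 100 ms and measure what we really get.
var timer:Timer = new Timer(100);
var last:int = getTimer(); // getTimer() returns ms since the SWF started

timer.addEventListener(TimerEvent.TIMER, function(e:TimerEvent):void {
    var now:int = getTimer();
    // On a slow or loaded machine this will often print values
    // well above 100, which is exactly the drift described above.
    trace("requested: 100 ms, actual: " + (now - last) + " ms");
    last = now;
});
timer.start();
```

If you drive an animation by assuming each tick equals exactly 100 ms instead of measuring the elapsed time, every one of those late ticks accumulates as visible error.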