I posted not too long ago with code for a timer (counting down from 25 seconds) which was broken. I fixed the issues, but now the JVM reports the run taking one second longer than it should, even though the code prints out the time taken and that printout looks fine. Here's the code:
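(The original listing didn't survive in this thread, so here is a minimal sketch of the kind of countdown loop being described; the class name, method, and variable names are my guesses, not the poster's actual code. It reproduces the reported symptom: counting 25 down to 0 inclusive is 26 ticks, hence roughly 26 seconds.)

```java
public class CountdownTimer {
    // Prints gameLength down to 0 inclusive, sleeping one tick between prints.
    // Returns how many values were printed.
    static int countdown(int gameLength, long tickMillis) {
        int ticks = 0;
        while (gameLength >= 0) {            // 25 down to 0 inclusive: 26 iterations
            System.out.println(gameLength);
            try {
                Thread.sleep(tickMillis);    // ~one second per tick in the real run
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
                break;
            }
            gameLength--;
            ticks++;
        }
        return ticks;
    }
}
```

Calling `countdown(25, 1000)` prints 26 values, so the whole run takes about 26 seconds even though each individual tick is fine.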

I have read your code and, if I'm reading it correctly, every second gameLength is decremented by one and then passed to System.out.println(). gameLength has been output 26 times, which means it has taken 26 seconds to complete. Why isn't gameLength starting at 25 like it should?

I ran your code with 25 and it takes 26. I ran your code with 24 and it takes 25.
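That N → N+1 pattern is a classic fencepost error: a countdown that prints N down to 0 inclusive prints N + 1 values. A minimal sketch of a corrected loop (names are hypothetical, not the original code) uses a strict bound so the loop body runs exactly N times:

```java
public class FixedCountdown {
    // Prints gameLength - 1 down to 0: exactly gameLength ticks.
    static int run(int gameLength, long tickMillis) {
        int ticks = 0;
        while (gameLength > 0) {             // strict bound: 25 ticks for gameLength = 25
            gameLength--;
            System.out.println(gameLength);
            try {
                Thread.sleep(tickMillis);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
                break;
            }
            ticks++;
        }
        return ticks;
    }
}
```

With `run(25, 1000)` the loop sleeps 25 times, matching the intended 25-second game length.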

Copying the code into Eclipse was a little messy, as it did not cut and paste into my system cleanly. I moved a few things around, but I believe I did so accurately and didn't change the way the code functions. You can check my listing and output.

I just wanted to add (the previous post was getting rather long) that if there is some sort of timing creep, I think it must be happening when the variable oneClock is set.

Potentially, each time you set it, it might have slipped by as much as 10 ms (the timer increment) plus whatever residual time the timer itself consumes. I've seen cases where a timer is set to X but the actual value comes out to X + 4 ms, give or take a little. The time taken to execute the loop body also adds to the slip.

So let's see: 25 x 15 ms gives 375 ms, yes? That's not quite enough for a full second's slippage, but maybe it's a good start in trying to find it. (Other comments about rounding and such may also be relevant.)

In any case, I think you should tie your timer tests to the first "beforeTime" rather than "oneClock". Thus, instead of "diff = twoClock - oneClock;" do this sort of thing:
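Something along these lines (a sketch; the method and names are mine, built around the beforeTime/oneClock/twoClock names from the posts above): each tick's deadline is computed from the single starting timestamp, so per-tick sleep overshoot cannot accumulate, and the final diff is measured against beforeTime rather than against the previous oneClock.

```java
public class AnchoredTimer {
    // Runs `ticks` ticks of `tickMillis` each, scheduling every tick against
    // the one starting timestamp. Per-tick sleep overshoot is absorbed by the
    // next tick instead of accumulating. Returns total elapsed milliseconds.
    static long run(int ticks, long tickMillis) {
        long beforeTime = System.currentTimeMillis();
        for (int tick = 1; tick <= ticks; tick++) {
            long deadline = beforeTime + tick * tickMillis;   // absolute target time
            long sleepFor = deadline - System.currentTimeMillis();
            if (sleepFor > 0) {
                try {
                    Thread.sleep(sleepFor);
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                    break;
                }
            }
        }
        // diff measured against beforeTime, not the previous tick's oneClock
        return System.currentTimeMillis() - beforeTime;
    }
}
```

The key difference from the fixed-delay version is that a slow tick makes the next sleep shorter, so the total stays pinned to beforeTime + ticks × tickMillis.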

I frequently make posts when I'm in a rush (heavy round-off error), and I hope that someone will read my mind and translate what I'm saying... sometimes it works! For those who wish to test their mind-reading skills, what I (and Riven) meant was:

Digital clocks are (effectively) square waves. Their associated counters track how many transitions (a stair-step function with respect to time) have occurred since the counter was last reset, taking any overflow into account. So when you pretend that a given counter measures real time, you have to remember that your result has an error bound of +/- 1 of the counter's resolution. By dividing first, you effectively divide the clock frequency, so if you divide down to one-second resolution, then the error with respect to real time is +/- 1 second. That's what I meant by "tossing away most of the significant information." Using two of these as timestamps compounds the errors of the two samples.
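A quick illustration of why the order of operations matters (the timestamp values are made up for the example): two samples only 2 ms apart can appear a full second apart once each is divided down to one-second resolution before subtracting.

```java
public class ClockResolution {
    // Dividing each timestamp down to seconds BEFORE subtracting throws away
    // the sub-second information, so the result carries a +/- 1 second error.
    static long divideFirst(long oneClock, long twoClock) {
        return (twoClock / 1000) - (oneClock / 1000);
    }

    // Subtracting first keeps full millisecond resolution in the difference.
    static long subtractFirst(long oneClock, long twoClock) {
        return (twoClock - oneClock) / 1000;
    }

    public static void main(String[] args) {
        long oneClock = 1999, twoClock = 2001;               // samples 2 ms apart
        System.out.println(divideFirst(oneClock, twoClock)); // reports a bogus 1 s
        System.out.println(subtractFirst(oneClock, twoClock)); // correctly reports 0 s
    }
}
```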
