I looked at this thread and found a lot of timer routines using QPC, but there is no explanation of how to use these routines in the main loop. My game must run at 60 FPS, with options for reducing CPU usage. Can someone post a small example program using high-resolution timer routines, and show how to use them in the main loop? Thanks in advance.

____

"The unlimited potential has been replaced by the concrete reality of what I programmed today." - Jordan Mechner.

You should not use high resolution timers in main loops. They should be used for high precision timing - in games, mainly profiling. For game logic, use lower precision time functions, like timeGetTime() (returns milliseconds) on win32.

The reason is that most high-resolution timing functions have... issues... (edit: updated the RDTSC problems description)

RDTSC: reports incorrect results on many computers (mostly dual-core systems), crashes on 486s and earlier, and its units change at run time due to power-management features. (Note: the dual-core issue can be fixed for dual-core AMDs with a utility downloadable from their website.)

QPC: slow; yields incorrect results on many computers; win32 only.

gettimeofday: changes when the system time is changed; not sure what underlying method it uses; may yield errors on some computers.

And anyway, sub-millisecond precision for regular user interactions is pointless.

Gimme a few minutes to come up with references. Only some of the problems are documented on the web though. I have personally seen both QPC and RDTSC fail. I've never seen gettimeofday() fail, but that's because I primarily use windows.

Quote:

But it leads to more accurate simulation of physics and other game functions.

Timers don't make physics calculations more precise; they just make physics more closely synchronized with real time. However, there's no point in synchronizing them more accurately with real time than the output devices (or the human perceiving them, assuming you're outputting to a human) can track. For instance, CRTs tend to do a retrace only every 8-20 milliseconds, rendering sub-millisecond synchronization pointless for animations displayed on CRTs. LCDs have different issues, whose precise numbers I'm less familiar with, but they have multiple sources of timing error: data rates for the digital cables and update rates for the LCs themselves. Either issue alone is enough to render sub-millisecond timing irrelevant for synchronizing LCD output. The human eye also introduces timing limitations, but these are harder to quantify due to variation depending on which portion of the eye you are using (peripheral vision etc.), individual variation between eyes and brains, and the nature of the visual stimuli. Sound is a different issue of course, but one I know less about; however, it's usually buffered several milliseconds in advance even in games, and much of the timing there is handled by the sound hardware and drivers.

Basically, I have tested high resolution timers using QPC on Windows on many different PCs for my games, as has Mike Welch, who I have worked with on Neon Wars. The results are consistently better than lower-precision timing in terms of smoothness and accuracy.

So, I am speaking from a practical and pretty experienced point of view here.

Timers don't make physics calculations more precise; they just make physics more closely synchronized with real time.

If you're using a delta-time method, you are basically using a Riemann sum to compute the position from the velocity and acceleration. By using smaller deltas (smaller rectangles), your result is more accurate. Imagine: how accurate (*not* how smooth it would appear on screen) would the movements be if you used a 1-second timer?

Oh, you're right. I sorta forgot about the question when orz posted his example of how to do it. Did you just add the example of how to use KittyCat's code, or was that there? Either way: Paul: read this, especially where he says "How to use KC's routines:"

(also, of course, the other reason I forgot to list originally: RDTSC is based upon processor frequency, which changes at run time due to power management features)

personal experience: my dual-core AMD had neither QPC nor RDTSC work correctly originally. Using KB896256 and another Microsoft patch I can't find at the moment fixed QPC but did not fix RDTSC (contrary to the claims of the docs for the hotfixes, IIRC). Using AMD's utility may have fixed RDTSC, but I haven't tested it enough to tell yet (I only installed it yesterday).

2nd-hand experience: Shizmoo Games (http://www.shizmoo.com/) told me in approximately 2001 (my memory is a little fuzzy on the exact date) that they had switched from using QPC to using lower resolution timing functions because a significant number of their users had some flaw in their motherboard's chipset that caused QPC to return bizarre results but left other timing methods functional.

documentation for dual-core and/or multi-CPU issues on the web from major companies:

1. Hardware bug: race condition when updating parts of the register? I have seen speculation to this effect more than once. ([qpc_race])
2. It jumps forward several hundred ms sometimes under heavy PCI bus load. ([qpc_jump])
3. The PIT is slow to read - it requires sending a latch command and two 8-bit reads, for a total of ~3 µs.
4. When using the TSC, it is documented to return different values depending on which processor is running the code, on some (buggy) systems.

I'll update this to address the other posts made while I was googling for that stuff.

edit:

Quote:

If you're using a delta-time method, you are basically using a Riemann sum to compute the position based off the velocity and acceleration. By using smaller deltas (smaller rectangles), your result is more accurate. Imagine: how accurate (*not* how smooth would it appear on screen) would the movements be if you used a 1-second timer?

You're equating physics synchronization with physics accuracy. If you want more accurate physics on the same time measurement method while using a delta-based time, it's as simple as using two deltas of half the size. Time gets clumped, but physics accuracy is not lost.

I do. I don't know if I quite trust the stuff about QPF, but the RDTSC references make sense. Actually, I'm going to do some experimentation when I get home: the original UT used RDTSC in its code, and it runs too fast on many systems I've played it on. I will disable CPU clock scaling and see what I get.

[append] That's the first comment in the VirtualDub blog post.

Okay, so the best timers are timeGetTime and gettimeofday, it seems. Yes, orz?

I use timeGetTime() for game logic, and RDTSC for performance monitoring.

timeGetTime(): On all systems I've tested, it has a resolution of at least 10 milliseconds; on many (all 2k & XP? not sure) it has a resolution of 1 millisecond. Its accuracy seems generally quite good for its resolution. 10 milliseconds is less than I'd like when using delta-timing, but good enough; 1 millisecond is plenty for delta-timing games. This function was reasonably fast on the systems where I tested it.

RDTSC: Fast. Typically 10-100 cycles, depending upon CPU type. The problems it has render it useless on some platforms, but performance monitoring is non-critical on users' systems, so you only have to worry about your own testbeds.

other timing methods:

allegro timer interrupts - I don't like the interface, but they seem to yield enough accuracy for timing game logic. I never set them to run over 200 hertz.

gettimeofday() - unixers seem to like it, but docs claim that it's not monotonically increasing, so I would avoid it for game logic. I'd like to do more testing on this - I don't know what the real-world speed, resolution, or accuracy is like.

SDL_GetTicks() - haven't checked the implementation, but I would guess it to be a wrapper for timeGetTime() on windows. If you use SDL this looks like a decent choice.

libc time(NULL) - very low resolution, but this is what I use when I can't find anything non-libc

libc clock() - documentation on this varies wildly and is contradictory.

QPC() - too buggy for game logic, too slow for profiling. However, unlike RDTSC its units are easily convertible to seconds, so I still occasionally use it.

/*
the current time is measured relative to the call
to init_time() for get_time_ms() and get_time_s(), but NOT for get_time_ticks()
*/

volatile int get_time(); //to get the current time in milliseconds
#define get_time_ms get_time

volatile double get_time_s(); //to get the current time in seconds at high precision
#define get_time_seconds get_time_s

volatile Sint64 get_time_ticks(); //to get the current time in unspecified units
volatile double get_time_tick_period();
/*note: The period of the ticks returned by get_time_ticks() may change over time
and the value reported is only an approximation.
*/

//set this to determine how you want to react to errors
extern void (*get_time_error_function) (char *);

int idle ( int time ); //to yield a number of milliseconds to the OS
extern int _no_idle; //to disable the above function

gettimeofday:
requires: UNIX (including linux, and many other unix-like systems)
resolution: varies, at best 1 microsecond, at worst ?100 microseconds? (guess)
efficiency: no idea (untested)
other notes: may run backwards if the system time is adjusted

LIBC:
requires: an implementation of the C standard library
resolution: shitty: 1 second
efficiency: varies
other notes: the method of last resort

Allegro:
requires: Allegro
resolution: typically between 1 and 10 milliseconds
efficiency: extremely fast on a per-call basis, but high overhead to install at all
other notes: none

I tried to use these routines. Richard helped me a lot and answered all my questions, but I could not make them work properly. I've been using Dev-C++ for a few days now, and perhaps there is something wrong with my configuration, so can someone else try to compile these routines?

timer.hpp

#ifndef __Timer_hpp__
#define __Timer_hpp__

#include <allegro.h>
#include <winalleg.h>

struct timer
{
    LARGE_INTEGER tstart, tticks, tnow, tlast;
    int high_freq, logic_frames, logic_fps, gfx_frames, gfx_fps;
};

extern struct timer timer;

void start_timer(void);
void reset_timer(void);
int check_timer(int frac_sec);
void reset_timer_alt(void);

#endif

timer.cpp

#include <Windows.h> // sometimes this has to be removed
#include "timer.hpp"

// High resolution timer code for Windows. Call start_timer first and then call

...Well, this loop works OK at 60 FPS, but when some sprites disappear the FPS grows (the game runs faster); with few sprites my game runs faster and with more sprites it runs slower. What can be wrong?...

Quote:

Paul: read this, especially where he says "How to use KC's routines:"

thread's title said:

I need a high resolution timer with QPC

Quote:

You should not use high resolution timers in main loops

I disagree.

Quote:

You are totally hijacking the thread.

I agree.

...[EDIT] GullRaDriel: I'll try that now, thanks!

____

"The unlimited potential has been replaced by the concrete reality of what I programmed today." - Jordan Mechner.