timer problem


I have a hardware board with a motion detector. I can read the detector with the function int pirGet(), which returns 0 or 1.

I want to make a function that returns the status of the last 10 seconds.

If motion is detected the output of the function must remain 1 for 10 seconds.

I want to be able to read the function at any time. Anyone got an idea how to do this?

timeGetTime() would be your best choice for this. It returns the time in milliseconds since the system started. The conversion from milliseconds to seconds is t = t / 1000, where t = time. You can think of delta time, or 'dt', as the time passed - the 'difference' - between the last call to timeGetTime() and the current call. So, as your instincts tell you, dt is the difference of the two calls: dt = currTime - lastTime;. For example, if it was 1:00pm when you last called timeGetTime() and it's 1:01pm now, 1:01 - 1:00 is 1 minute - easy, right? To put it into code: