Reading camera clock / more accurate timer?

I'm wondering if there is a way to read the camera's internal clock from within a script. I'm trying to make a timer that is adjustable in increments smaller than 1 ms. As a feasibility test on my S5 IS, I wrote a small script that simply ran an empty loop from 1 to 1,000,000, and then I timed it with a stopwatch... not very accurate, I know, but it came out right around 4 seconds every time, so I'm thinking I could use this for a delay adjustable down to about 4 µs.

Of course this would only be good on my camera, so a better way would be: at the beginning of my script, read the camera's clock, count to some big number in my loop, then read the clock again and calculate exactly how long one pass through the loop takes.. then ask the user how many µs they want and calculate how many loops to run.

As a feasibility test on my S5 IS, I wrote a small script that simply ran an empty loop from 1 to 1,000,000, and then I timed it with a stopwatch... not very accurate, I know, but it came out right around 4 seconds every time, so I'm thinking I could use this for a delay adjustable down to about 4 µs.

Can you post the script? One million iterations - even in Lua - in 4 seconds seems unlikely.

Quote

Of course this would only be good on my camera, so a better way would be: at the beginning of my script, read the camera's clock, count to some big number in my loop, then read the clock again and calculate exactly how long one pass through the loop takes.. then ask the user how many µs they want and calculate how many loops to run.

Has this already been done somewhere??

The internal timer in the camera runs on a 10 ms interval. You might be able to find some hardware registers that run a lot faster (video sync, etc.) and read those to try for more precision.

Can you post the script? One million iterations - even in Lua - in 4 seconds seems unlikely.

here is what I used for my test:

--[[
@title test
--]]

print("start")
for y = 1, 1000000 do end
print("stop")

Is there a way to access the 10 ms timer?? That would be fine.. I can just have my script calibrate itself when it first runs by running a long loop, seeing how many 10 ms ticks it takes, then dividing.. the longer my calibration loop, the more accurate the result will be, even if the time base is only 10 ms.

Is there a way to access the 10 ms timer?? That would be fine.. I can just have my script calibrate itself when it first runs by running a long loop, seeing how many 10 ms ticks it takes, then dividing.. the longer my calibration loop, the more accurate the result will be, even if the time base is only 10 ms.
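A self-calibrating script along those lines might look something like this. It is only a sketch: it assumes CHDK Lua's get_tick_count(), which returns the camera's millisecond tick counter (updated on the 10 ms tick), and the variable names are made up for illustration.

```lua
-- Calibration sketch (assumes CHDK's get_tick_count(); names are illustrative)
local n = 1000000                    -- iterations to time
local t0 = get_tick_count()          -- camera tick counter, in ms (10 ms resolution)
for i = 1, n do end                  -- the empty delay loop being calibrated
local elapsed = get_tick_count() - t0
print("ms for " .. n .. " loops: " .. elapsed)
local loops_per_ms = n / elapsed     -- how many loop passes per millisecond of delay
```

The longer the calibration loop runs, the smaller the relative error contributed by the 10 ms tick granularity.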

The first issue is.. when I run this over and over, it keeps coming up with a different number of ticks for my delay loop.. sometimes it's 4300, sometimes 4010, sometimes 4600.. it's all over the place. There must be some kind of interrupt happening, or something else using the processor during my delay loop, and whatever it is, it's different each time it runs. This alone might make my entire project useless, because I wanted to use CHDK to do water-drop timing... and if this sort of thing is happening, I'll never get timings accurate enough to make it work repeatably.

The second issue is.. the math is not being evaluated the way I would like... this is the line I have an issue with:

v = w / u * 1000

If w=1000000 and u=4170, then v should come out to 239808.. but it doesn't. Obviously it's truncating at each step of the way, and coming up with 239000: the 1000000/4170 gives 239.808, which gets truncated to 239, and then the *1000 is applied.

I thought I could write it as v = w * 1000 / u so the truncation doesn't happen until after the multiply, which would help, but I'm worried about overflowing.. I'm not sure how large a number I can use here.
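On an integer-only Lua build like CHDK's, each division truncates, so the order of operations matters. A small sketch of both orderings, with the overflow worry addressed in a comment (this assumes 32-bit signed integer arithmetic, which is how CHDK's Lua is usually described; stock desktop Lua would keep the fractional part instead):

```lua
local w, u = 1000000, 4170
-- dividing first truncates early:
local v1 = w / u * 1000   -- 1000000/4170 -> 239 (truncated), *1000 -> 239000
-- multiplying first truncates only once, at the end:
local v2 = w * 1000 / u   -- 1000000000/4170 -> 239808 (truncated once)
-- overflow check: w * 1000 = 1,000,000,000, which still fits in a 32-bit
-- signed integer (max 2,147,483,647), so this ordering is safe for these values
```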

The first issue is.. when I run this over and over, it keeps coming up with a different number of ticks for my delay loop.. sometimes it's 4300, sometimes 4010, sometimes 4600.. it's all over the place.

There are a few issues here. The Lua VM in CHDK runs inside an OS task called PhySw. In addition to scripting, this task replaces the original task that handles things like key input in the Canon firmware. In the original Canon firmware, this task sits in a loop reading the keys etc. and sleeping for 10 ms.

Because this task expects to run every 10ms, the scripting engine in CHDK is set up to yield periodically. By default, in Lua, this is done every 10ms or 2500 VM instructions, whichever comes first. These parameters can be adjusted with the set_yield function: http://chdk.wikia.com/wiki/Script_commands#set_yield
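A hedged sketch of how set_yield might be used around a timing-critical section. The threshold values here are placeholders, and it is assumed (per the CHDK docs linked above, and the restore call that appears later in this thread) that set_yield returns the previous settings:

```lua
-- raise both yield thresholds so the loop below is less likely to be
-- interrupted by a yield, then restore the old settings afterwards
local old_max_count, old_max_ms = set_yield(400000, 200)  -- placeholder values
for i = 1, 1000000 do end       -- timing-critical delay loop
set_yield(old_max_count, old_max_ms)  -- put yield back the way it was
```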

Even accounting for that, there is no reason to expect a delay loop to give you 100% consistent results. DryOS is a fully preemptive multi-tasking OS. VxWorks as configured on the old VxWorks cams isn't exactly preemptive, but it's still multi-tasking. The CPUs have caches, so depending on what else is going on, the same machine code can take varying amounts of time to run.

Quote

obviously it's truncating at each step of the way, and coming up with 239000: the 1000000/4170 gives 239.808, which gets truncated to 239, and then the *1000 is applied.

Note that if you just want to delay a certain number of milliseconds, you will be far better off using the sleep() function. While this is not guaranteed to sleep for exactly the requested time, it's likely to work a lot better than a delay loop implemented in Lua. It should have a precision of approximately 10 ms, although the system could take longer to return control.
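For example, a quick way to see how close sleep() lands to the request, sketched with the get_tick_count() and sleep() functions from the CHDK script commands (the exact overshoot will vary by camera and system load):

```lua
local t0 = get_tick_count()
sleep(100)                            -- request a 100 ms delay
local actual = get_tick_count() - t0
print("requested 100 ms, got " .. actual .. " ms")
```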

Thank you for this valuable information! It's making more sense now. I can live with the integers. I was hoping for a resolution of 0.25 ms; that's why I was trying to write something instead of just using sleep..

I did a quick test: I put the lens cap on my camera and re-ran my program as-is, and as I half expected, with no change in the image to process, the results were remarkably more consistent.. presumably when it yields, with no buttons pushed and no new sensor data, there's nothing to do and it comes straight back.. so maybe it will be close enough after all.

I don't really need it to be accurate to real time, I just want to be able to adjust it in fine increments.. so if I set a delay of, say, 100 ms and it really comes out to 104.3 ms, that's OK, as long as when I ask for 101.25 ms I get 105.55 ms. So I can probably just use the number I got from my lens-cap test and plug that in permanently instead of calibrating every time.

None of my delays will be very long... I was only testing with long delays to see if the math was working right. I will probably never need more than a 200 ms delay.. in fact most of them would be much less, around 60 ms... so I'm wondering.. could I perhaps use set_yield as suggested? I can't really tell what the min/max values are for it.. but could I set it to, say, 400000 instructions, or push the time out to 200 ms?

Also, is there any way to know how long until the next yield, or whether we just came back from one? That way I could start my entire procedure at just the right time and be done before any yielding happens..

These questions could be irrelevant, however... because my intention is to use the motion detector, then light some LEDs in sequence with specific delays after motion is detected, and I suspect the motion detector depends on things that happen during the yield, such as reading the sensor.

I can also minimize inconsistencies by putting the camera in manual mode, including manual focus, which would work just fine for my project anyway... on the theory that if there is nothing to do when it yields, it will come right back in a consistent manner.

These questions could be irrelevant, however... because my intention is to use the motion detector, then light some LEDs in sequence with specific delays after motion is detected, and I suspect the motion detector depends on things that happen during the yield, such as reading the sensor.

Reading this thread, I'm not really sure what your end goal is. What does this project do when it's done?

That said, just as an FYI, there can be a lot of latency in the motion detector part of CHDK. It's a neat hack that basically looks for changes in brightness in the LCD display buffer. Some cameras seem to work better than others, but you could have as much as 100 ms of variable delay before any particular motion event is detected. You might want to experiment with that before you go much farther.

My original idea was to have my S5 use the motion detector to detect lightning strikes, but instead of taking the photo itself, have it trigger my DSLR to take the photo. Then I got the idea that if I could do that, perhaps I could use the S5 to detect a water drop falling past, then do a delay, then trigger my DSLR to capture the splash with a macro lens. I figured the motion detector must be pretty fast to capture lightning, so maybe it would be fast enough to trigger my water-drop rig. I did a test: I zoomed all the way out and focused behind the water drop, then turned on the motion detector. In this arrangement the water drop pretty much blurred the entire view area as soon as it passed the camera, and I had my script light an LED when motion was detected. This seemed to work, but I didn't really have any way to measure a 100 ms delay... I was just looking to see if the light lit up when the drop fell.

I'm still waiting on a cheap shutter switch I bought just so I could cut off the cord and get the odd plug needed to connect to my DSLR, to set up the trigger part of it.. so I was just tinkering with some scripts while waiting on that.

If the motion detector has unpredictable delays associated with it, then maybe that idea won't work and I'll have to get some kind of separate timer and a photo detector to detect the falling drop and do the delay. I was just hoping I could do it with my S5 running a CHDK script, since I already have the S5 and it normally just sits in its box. I was thinking that if it was fast enough to capture a lightning strike, it would work.. but now that I think of it, lightning strikes aren't really all that fast.. some of them light up the sky for several seconds, way more than enough time for a few 100 ms delays. So while my original idea of capturing lightning strikes with my DSLR might still work, my premise that being fast enough for that means being fast enough for other high-speed events was probably a bad assumption.

I suppose if I got the timer working I could still use the S5 as just a timer, by rigging up a light-beam drop detector to trigger the USB port. Kind of silly to use a camera in a way that makes no use of the CCD sensor, with the lens cap still on... but it would save me $200 on a timer I'd only need once in a while, when I get in the mood to shoot water-drop splashes... and why not use the camera's computer, LCD screen, buttons, and LEDs in a way that was never intended? It's a hack!!

end
set_yield(old_max_count, old_max_ms) -- put yield back the way it was

I put some arbitrarily larger values into set_yield, ran a garbage-collection cycle, and then stopped garbage collection. The results are quite amazing! With my previous script, it took about 4 seconds to do my 1 million loops; this script does the same 1 million loops in 1.2 seconds!!! And not only that, if I run it in playback mode on the camera, it takes 1200 ms every single time, over 30 runs in a row! The previous program never got exactly the same number even three times in a row.
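The garbage-collection tweak described above can be sketched with standard Lua's collectgarbage function (the set_yield values are placeholders for the "larger values" mentioned, and get_tick_count() is the CHDK millisecond tick counter):

```lua
local old_max_count, old_max_ms = set_yield(400000, 200)  -- placeholder thresholds
collectgarbage("collect")    -- run a full GC cycle now, before timing starts
collectgarbage("stop")       -- suspend automatic collection during the loop
local t0 = get_tick_count()
for i = 1, 1000000 do end    -- the timed delay loop
print("elapsed: " .. (get_tick_count() - t0) .. " ms")
collectgarbage("restart")    -- re-enable automatic collection when done
set_yield(old_max_count, old_max_ms)  -- restore the yield settings
```

Running the collector to completion and then stopping it removes one source of unpredictable pauses inside the timed loop, which fits the much more repeatable results reported above.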

In record mode it is slightly slower and it does vary slightly... but nowhere near like before, when it seemed all over the place. Now it comes out between 1390 ms and 1400 ms for the 1 million test loops.

So maybe the motion-detection part of it still won't work, but it looks like the delay is stable enough in playback mode.. so I could perhaps use the external light-beam sensor triggering the USB port... we'll have to see. I need that cable to come in so I can trigger my DSLR; then I'll see if I can get repeatable results.