Any thoughts on how many of these are available? Presumably the Canon f/w uses them. For the regular timers, a single 1-tick timer servicing a linked list of callbacks seems probable. But does the HPTimer handler use a single shared high-frequency timer interrupt and another linked list of callbacks? Or do calls to that function get a dedicated h/w timer?

I have no idea. I suspect that using a few should not cause problems...

Quote

Presumably the Canon f/w uses them. For the regular timers, a single 1-tick timer servicing a linked list of callbacks seems probable. But does the HPTimer handler use a single shared high-frequency timer interrupt and another linked list of callbacks? Or do calls to that function get a dedicated h/w timer?

I'd expect the former (linked list or some other method). I think the ARM CPU has only one interrupt source (that's actually in use). I also think that the fw has a single 'main' interrupt routine, and it determines the actual interrupts to service.
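To make the speculation concrete, here is a minimal sketch of that scheme, purely hypothetical (none of these names come from the Canon firmware): a single timer interrupt walks a linked list of pending callbacks and fires whatever is due.

```c
#include <stddef.h>

/* Hypothetical dispatch scheme: one hardware timer interrupt services
 * a linked list of due callbacks. All names here are invented for
 * illustration, not taken from the Canon firmware. */
typedef struct timer_cb {
    unsigned expiry;            /* tick at which to fire */
    void (*fn)(void *arg);      /* callback to invoke */
    void *arg;
    struct timer_cb *next;
} timer_cb;

static timer_cb *timer_list = NULL;

static int blink_count = 0;
static void count_cb(void *arg) { (void)arg; blink_count++; }

/* Called from the (single) timer interrupt once per tick. */
void timer_isr(unsigned now)
{
    timer_cb **pp = &timer_list;
    while (*pp) {
        timer_cb *t = *pp;
        if (t->expiry <= now) {
            *pp = t->next;      /* unlink the entry before firing it */
            t->fn(t->arg);
        } else {
            pp = &t->next;
        }
    }
}
```

With a scheme like this, the "main" interrupt routine only has to decide that the timer fired; everything else is list traversal.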

BTW the types used in my function declarations are not necessarily correct, will need a review before actual inclusion into CHDK.

int init_led_blinking_function() {
    _SetHPTimerAfterNow(262144, led_blinking_function, do_nothing, 0);
    // _SetTimerAfter(2000, led_blinking_function, do_nothing, 0);
    return 0;
}

SetTimerAfter seems quite slow. I think it may have been only about 20Hz (EDIT: exactly 25Hz) or something, maximum; I'll have to measure it tomorrow with my o-scope.

SetHPTimerAfterNow works much better: 512Hz was no problem, and it even worked up to 2kHz (with 256 as the delay). It also briefly worked with smaller delays, but then somehow it just stops (leaving the LED in whatever state it was last in). I haven't measured its actual speed yet; I just shook the camera around to confirm it was strobing.

If there's anything I can do to optimise the code to make it go even faster, we can find out the maximum rate at which CHDK can flash a camera LED.

EDIT: Yes, I did of course forget to divide the frequency by two (two cycles per period); it was quite late (in Aus) when I posted this. I'll have results from the tests coming soon.

This is exciting. The current USB remote script functions return USB state mark / space values with 10 mSec resolution, accurate to about 20 mSec (approx.) at best. Possibly worse with newer cameras that have longer interrupt periods.

The attached patch changes that to 1 mSec resolution. Needs some more testing and is not backwards compatible with current scripts as those scripts expect the USB functions to return values in increments of 10 mSec. As currently implemented, it requires 20 mSec between pulses but that can be fixed.

Not sure how much overhead this adds. Changing to 100 uSec would be trivial, but the processor load goes up again.

Might want to change the HPtimer interval to 500 uSec to get better precision at the 1 mSec specification.

Just for fun (and in case of an emergency, say if you're stranded on a desert island with nothing but a Canon camera) here's some code to flash out an SOS over the AF LED. I'd be willing to write a patch file if there's any chance of it being included in the main trunk, somewhere in the miscellaneous menu? (Although I would need to use some sort of universal LED function instead of just the addresses, of course.) Currently there's no way to turn it off. (How is int CancelHPTimer(hHPTimer); used? Do I just have to give it a pointer to the function I want to cancel?) But anyway, in such a situation that would be the least of your worries.

I'll have to redo those tests (of how accurate the pulses were) I was going to upload today, as I forgot which photos corresponded to which camera and settings. But one thing to note: they are quite messy and inaccurate (especially when faster than 1kHz), so some sort of bit-slip prevention may be necessary.

I'll have to try that patch tomorrow as well, but by microsecond are we talking about 1/1048576 or 1/1000000 of a second? Not that it's likely to behave that accurately anyway (according to the tests I did today).

I'd be willing to write a patch file if there's any chance of it being included in the main trunk, somewhere in the miscellaneous menu?

I would suggest not spending a lot of time on this in the hope it will get accepted.

Quote

How is int CancelHPTimer(hHPTimer); used? Do I just have to give it a pointer to the function I want to cancel?

When you call SetHPTimerAfterNow() or SetHPTimerAfterTimeout() they return a "handle". Save this value and use it as the passed parameter to CancelHPTimer(). Although if you simply don't re-enable the timer during the callback, it will have the same effect.
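A sketch of that handle pattern, with stubs standing in for the firmware calls so it compiles on its own (the real signatures are assumptions; the handle is treated here as a plain int):

```c
/* Stubs standing in for the firmware calls; on the camera these would
 * be the real _SetHPTimerAfterNow / _CancelHPTimer. */
static int last_cancelled = -1;
static int _SetHPTimerAfterNow(int delay_us, int (*good_cb)(int, int),
                               int (*bad_cb)(int, int), int param)
{
    (void)delay_us; (void)good_cb; (void)bad_cb; (void)param;
    return 7;                     /* stub returns a dummy handle */
}
static void _CancelHPTimer(int handle) { last_cancelled = handle; }

static int led_cb(int a, int b) { (void)a; (void)b; return 0; }

static int led_timer_handle = -1;

void start_blinking(void)
{
    /* Save the returned handle... */
    led_timer_handle = _SetHPTimerAfterNow(512, led_cb, led_cb, 0);
}

void stop_blinking(void)
{
    /* ...and pass it back to cancel the pending timer. */
    if (led_timer_handle >= 0) {
        _CancelHPTimer(led_timer_handle);
        led_timer_handle = -1;
    }
}
```

Resetting the saved handle after cancelling guards against cancelling the same (possibly recycled) handle twice.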

Quote

I'll have to redo those tests (of how accurate the pulses were) I was going to upload today, as I forgot which photos corresponded to which camera and settings. But one thing to note: they are quite messy and inaccurate (especially when faster than 1kHz), so some sort of bit-slip prevention may be necessary.

Don't forget that there are other timer based interrupt routines running. You are working with a sub $100 camera here - not an expensive real time control computer after all.

SetHPTimerAfterNow works much better: 512Hz was no problem, and it even worked up to 2kHz (with 256 as the delay). It also briefly worked with smaller delays, but then somehow it just stops (leaving the LED in whatever state it was last in).

Note that the firmware is almost exclusively using the same callback functions as 'good_cb' and 'bad_cb', you may get better results if you follow that practice. Playback mode is recommended for this kind of abuse of the timer system (due to less timers running).

Note that the firmware is almost exclusively using the same callback functions as 'good_cb' and 'bad_cb', you may get better results if you follow that practice. Playback mode is recommended for this kind of abuse of the timer system (due to less timers running).

Yeah that works better, thanks. No more crashes. Here's a 'screenshot' of the output from the LED when the delay was set to 512, in shooting mode (playback mode does indeed work much better though). I think that this would be a reasonable delay time, as the output isn't as messy, and it's fast enough to reliably transfer data at reasonably fast speeds.

OK, I'll admit it's actually a photograph; my scope is missing its floppy drive, unfortunately. An analogue scope with a long exposure may have worked better, I guess. A few more here anyway.

For some reason, the photodiode seems way less responsive to the SX40's green AF lamp than on the other cams, perhaps because of the narrow beam focus.

If anyone else has already started some sort of data transfer method using these new firmware delay functions while I was distracted this week playing with my newly acquired S110, the last thing I'd want to do is create a competing standard that does the same thing. Also, where would be a good place to continue this? Otherwise I'll probably just start a new thread for it.

Nice shots. But looking at those, I would expect you to have trouble decoding at any speed faster than the one on the right (which seems to be about 500 Hz) if you are going to use time-based encoding. The others vary so much that, within a single bit time, the duration of the first bit sometimes overlaps the expected start of the next one.

Quote

For some reason, the photodiode seems way less responsive to the SX40's green AF lamp than on the other cams, perhaps because of the narrow beam focus.

Might also be different sensitivity at different frequencies.

Quote

As for the bits, well would something like this be a good idea?

We are on very well traveled ground here. It strikes me that rather than re-inventing the wheel it might be worth a little googling for "serial protocol" or "serial data encoding", in particular the stuff used for cassette tape interfaces on early 8 bit PC computers. Error detection / correction is going to be more important the faster you try to go.

Edit 1 : a better search term might be "bit banging"
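One of the classic self-clocking schemes those searches turn up is Manchester encoding, which directly addresses the bit-slip worry: every bit carries its own mid-bit transition, so the receiver recovers timing from edges rather than counting absolute time. A minimal encoder sketch (using the IEEE convention; the opposite polarity convention also exists):

```c
#include <stdint.h>

/* Minimal Manchester encoder: each data bit becomes two half-bit LED
 * states (IEEE convention: 0 -> high-then-low, 1 -> low-then-high).
 * The guaranteed mid-bit transition makes the stream self-clocking. */
void manchester_encode(uint8_t byte, uint8_t out[16])
{
    for (int i = 0; i < 8; i++) {
        int bit = (byte >> (7 - i)) & 1;   /* MSB first */
        out[2 * i]     = bit ? 0 : 1;      /* first half-bit  */
        out[2 * i + 1] = bit ? 1 : 0;      /* second half-bit */
    }
}
```

The cost is halving the effective data rate (two LED states per bit), but at the pulse jitter shown in the scope shots, edge-based clock recovery is probably worth it.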

Edit 2 : I think it would be worthwhile to tackle this project "top down" prior to worrying too much about encoding. What are you trying to achieve? For example, how much information (i.e. exact # of bits needed) do you want to transmit from the master camera to the slave? How often do you anticipate needing to send it? How much latency can you tolerate at the slave? Do you anticipate one packet with everything packed into it (plus maybe a short SHOOT command)? Or is it worth the overhead of a protocol capable of sending any arbitrary data packet? What happens when there is a communications error? As this is not going to be a generic data communications protocol but just a method of passing data from one camera to another, it's probably safe to assume both cams will always have the exact same code, so "room for expansion" in the protocol is a moot point.

Figured that may be the case as well, but when testing with a stock LED it let through just as much current as an orange one. Also, grinding the lens bump off the front didn't help. Looks like I will need an op-amp, but at least I don't need to worry about the op-amp's frequency response.

Anyway, here's that photo (1s shutter speed) of the analogue scope, with the delay set to 256 microseconds. Not as bad as it looked on the digital scope, but still rather messy. Two more at the previous link.

Quote

What are you trying to achieve?

I'm trying to devise an alternate method of synchronising two cameras using nothing but a photo-diode/transistor over the AF LED (or another LED), connected to the battery terminal input, plus perhaps a resistor or two, and maybe some sort of amplifier if necessary. It will allow synchronisation of as much as possible between the cameras, without the use of any external microchips. Ideally it would be easy for a beginner to set up, but it would probably require at least applying a patch file and changing values (such as LED addresses and adding native functions). I plan to write the code with the SX40 in mind, then modify it for my S110 and D20 (with either hall-sensors or hardware modifications) rigs, but hopefully others will be able to benefit from it as well without too much effort.

Quote

For example, how much information (i.e. exact # of bits needed) do you want to transmit from the master camera to the slave?

Everything related to the half-press data will fit into a 32-bit word: 16 bits for the focus, and the rest for exposure, aperture, ND filter, etc. Maybe a bit or two at the front to indicate that the data is half-press information.
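One way the 32-bit word described above could be packed; apart from the 16 focus bits and a 2-bit packet-type tag at the front, the field widths here are illustrative guesses, not a settled layout.

```c
#include <stdint.h>

#define PKT_TAG_HALFPRESS 0x1u   /* 2-bit packet-type tag (hypothetical value) */

/* Pack the half-press data into one 32-bit word.
 * Bit budget: 2 (tag) + 1 (ND) + 6 (aperture) + 7 (exposure) + 16 (focus) = 32. */
uint32_t pack_halfpress(uint32_t focus, uint32_t tv, uint32_t av, uint32_t nd)
{
    return ((PKT_TAG_HALFPRESS & 0x3u) << 30)   /* bits 31..30: packet tag */
         | ((nd    & 0x1u)  << 29)              /* bit  29:     ND filter  */
         | ((av    & 0x3Fu) << 23)              /* bits 28..23: aperture   */
         | ((tv    & 0x7Fu) << 16)              /* bits 22..16: exposure   */
         |  (focus & 0xFFFFu);                  /* bits 15..0:  focus      */
}
```

The slave unpacks with the mirror-image shifts and masks, and the tag bits let it distinguish this packet from the zoom packets mentioned later.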

Quote

How often to you anticipate needing to send it?

Well, the above will be sent as soon as the master camera has locked its focus and exposure. But as the focal length is changed (and maybe other settings as well), a different 32-bit packet (or maybe several packets; speed isn't important for this one) will be sent, to get a sort of synchronised zoom. One or two bits will be set aside to indicate that it's merely zoom data and not half-press information. As for the actual synchronised shutter part, a logic high will be sent for a while (at full-press or after the data has been sent, whichever comes last), and as soon as it goes low again, the photo will be taken.

Quote

How much latency can you tolerate at the slave?

Well, the faster the better. Synchronised zoom and settings data packets don't need to get there instantly, but I would hope that the half-press data can get there within the time it takes for me to realise that the camera has locked focus and decide to full-press, leaving enough time for the slave to actually set its focus and begin the shooting process before I release the button to take the photo. So maybe 100ms? I could tolerate more if necessary.

Quote

Do you anticipate one packet with everything packed into it (plus maybe a short SHOOT command) ? Or is it worth the overhead of a protocol capable of sending any arbitrary data packet?

(The above answers were written before I read this.) Not sure really; perhaps just the standard 32-bit half-shoot packet, plus another, longer standard packet to be sent whenever something is changed (probably containing all the data that is changeable, not just what has changed).

Quote

What happens when there is a communications error?

My inbuilt speech synthesizer will output one of several preselected 32-bit (four-letter) words as the slave camera randomly zooms in and crashes.