There have been various requests over the years for external triggers for frame synchronisation. Generally it's been stated that doing such things isn't trivial, and it wasn't. However, as of the 22nd July firmware, there is support for repurposing the camera LED GPIO to change state on frame start and frame end interrupts...

I don't quite see how this externally triggers cameras; it appears it just shows you when a picture is being taken. Has anyone got any details on how it can be used to trigger cameras?

It can't be, and I don't think I've ever said it could be.
You can use it alongside the frame rate control to produce, effectively, a phase-locked loop that will pull things into sync over a few seconds and keep them there.
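A minimal sketch of that loop (my illustration, not firmware code — the names, the proportional gain, and the simulated offsets are all invented): measure the offset between your local frame-start pulse and the reference pulse, and nudge the requested frame rate until the offset converges.

```python
# Toy phase-lock sketch: treat the requested frame rate as the control
# variable and the measured phase offset (local frame start minus
# reference frame start, in seconds) as the error term.
TARGET_FPS = 30.0
GAIN = 0.05            # proportional gain: how hard to pull toward lock

def next_fps(offset_s, period_s=1.0 / TARGET_FPS):
    """Return an adjusted frame rate that shrinks a phase offset."""
    # A positive offset means we are running late, so go slightly faster.
    return TARGET_FPS * (1.0 + GAIN * offset_s / period_s)

# Simulate convergence over ~10 seconds of frames.
offset = 0.010          # start 10 ms out of phase
for _ in range(300):
    fps = next_fps(offset)
    # Each frame, the phase error shrinks by the fractional speed difference.
    offset -= (fps - TARGET_FPS) / TARGET_FPS * (1.0 / TARGET_FPS)
print(offset < 0.001)   # locked to well within 1 ms
```

In a real setup the offset would come from timestamping the LED GPIO pulses on both cameras, and the adjusted rate would be fed back through the camera's frame rate control.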

Thanks. It looks like I misunderstood "There have been various requests over the years for external triggers for frame synchronisation."

That really is a pity. Do you think there could be a way to get as close as possible to the moment when a frame is taken in raw camera mode?

I can do the same thing as for the main stack (there's less plumbing there too), although it still makes little real-world sense with a rolling-shutter camera, where there isn't a single instant of capture.

Sync still makes a lot of sense even with a rolling shutter, especially when combined with a special camera mode (raw in this case) that further shortens the image readout (at low resolution, but that is fine for many computer vision applications). There are also cases where the shutter speed is very short and the rolling-shutter effect is negligible (a sunny outdoor scene, for instance).

Hi There,
I'm trying to use the LED flash sync feature in continuous grab mode on a Pi 3B with a v2 camera. I followed the instructions above and was able to get the flash pulse(s) for single grabs, but in continuous grab mode (with exposure_mode off) I get no pulses. Should this be possible?
---
EDIT
Subsequently, I re-tried disable_camera_led=2 in config.txt and re-compiled an edited dt-blob file to map GPIO17 to camera-0-LED, and it works perfectly.
---
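For anyone following along, the dt-blob change amounts to something like the fragment below, under the pins_* section for your board (reconstructed from memory, so check the syntax against your own dt-blob.dts; the pin number is whatever free GPIO you chose):

```
pin_config {
    pin@p17 { function = "output"; termination = "no_pulling"; };
};
pin_defines {
    pin_define@CAMERA_0_LED { type = "internal"; number = <17>; };
};
```

Then recompile with something like `dtc -q -I dts -O dtb -o /boot/dt-blob.bin dt-blob.dts` and reboot.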
Thanks again.
Steve.

I'm a little confused here, maybe someone can clear up for me. I also have a V2 camera but there's no LED on V2. How is the signal for the LED captured if it's not on the board?

ds72 wrote:
I'm a little confused here, maybe someone can clear up for me. I also have a V2 camera but there's no LED on V2. How is the signal for the LED captured if it's not on the board?

The LED is not captured; the camera module and the Raspberry Pi's Image Signal Processor (ISP) are communicating. The ISP uses this information to toggle the previously defined GPIO for the camera LED. Basically, when it starts receiving data it says "Yes, I'm getting some data, let's turn on the GPIO", and when the last piece of data is received it says "This is finished now, I will turn the GPIO off again".

It was just convenient that one GPIO was already defined for the LED on the camera module, and that pin is simply re-purposed for the trigger pulses. Just lighting up an LED is not very useful, so you assign the "camera LED" pin to another GPIO; this is done using the dt-blob. An external device can then be connected to this newly configured GPIO and is informed when the image is captured. This could be, for example, a motor that needs to advance to the next frame of a film roll to digitise the film frame by frame.

(This is simplified, as we can configure whether the ISP turns the LED on or off during capture...)

I might have read this thread 100 times and I am still not able to get pulses from the camera. I have an RPi 3B+ and an RPi 3 v1.2. I have added the following under pins_3bplus and pins_3b2, respectively, under pin_config:

Should type be "external"? Also, should disable_camera_led in config.txt be 2 or 3? I would like to get a trigger in both still and video port modes.

Currently I am planning to connect GPIO21 to GPIO20 and trigger an event once the state changes. Is there a way to read the state of GPIO21 internally and attach a callback without connecting it to another pin?
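For context, what I have in mind is a sketch along these lines. On the Pi itself, RPi.GPIO's event detection should attach a callback directly to GPIO21 with no loop-back wire (shown in the comments, since that part only runs on a Pi); below it is the same edge-handling logic with synthetic timestamps so it runs anywhere. Pin numbers and names are my own assumptions.

```python
# On the Pi, the wiring would be roughly:
#
#   import RPi.GPIO as GPIO
#   GPIO.setmode(GPIO.BCM)
#   GPIO.setup(21, GPIO.IN)
#   GPIO.add_event_detect(21, GPIO.BOTH,
#                         callback=lambda ch: timer.on_edge(GPIO.input(ch)))
#
# Pure-Python simulation of the edge handling:
from time import monotonic

class EdgeTimer:
    """Tracks frame-start/frame-end pulses and records pulse widths."""
    def __init__(self):
        self.rise = None
        self.widths = []

    def on_edge(self, level, t=None):
        t = monotonic() if t is None else t
        if level:                        # rising edge: frame start
            self.rise = t
        elif self.rise is not None:      # falling edge: frame end
            self.widths.append(t - self.rise)

timer = EdgeTimer()
timer.on_edge(1, t=0.000)
timer.on_edge(0, t=0.004699)             # ~4.7 ms of active frame time
print(round(timer.widths[0] * 1000, 3))  # pulse width in ms
```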

The instructions were written against dtc version 1.4.0. Later versions add more error checking which is redundant in this case as we're not fully complying with DT as used by Linux.
Add -q to the dtc command line to shut up warnings.

a) I'm not a mind reader and you hadn't provided that previously.
b) You want "vcgencmd version" to report the firmware version as it is a firmware change.
c) Regardless, if you haven't added disable_camera_led then it is correctly defined anyway and should turn your GPIO on/off on camera access.

The first 7 frames get captured at 40fps (preview?), then the framerate is 1000/4.984 = 200.64fps.

I don't know whether it is a bug or a feature, but without the "-o" option, capturing is done at 40fps the whole time.

One question to @6by9:
Length from frame start to frame end is 4.699ms, nearly the whole 4.984ms for 200fps.
Could it be that the GPU's 202fps maximum is because the 4.699ms no longer fits into the frame period?
If I provided a register set for the [email protected] mode for the v2 camera, could you test whether the GPU would work at 360fps as well?

P.S:
Maybe it is easier to use a mode similar to raspiraw 640x240 or 640x480_s tools for GPU test?
Capturing only 240 lines should take half the 4.699ms time and easily allow for 360fps.

should fix it. I'll check it and create a PR at some point.
40fps is the minimum frame rate that mode 7 is configured to run at.

HermannSW wrote:One question to @6by9:
Length from frame start to frame end is 4.699ms, nearly the whole 4.984ms for 200fps.
Could it be that the GPU's 202fps maximum is because the 4.699ms no longer fits into the frame period?

I haven't measured with a scope as to exactly why we can't exceed 202fps in the firmware. The sensor will obviously have constraints of its own, and IIRC it just stopped sending data when I tried it.

HermannSW wrote:If I provided a register set for the [email protected] mode for the v2 camera, could you test whether the GPU would work at 360fps as well?

P.S:
Maybe it is easier to use a mode similar to raspiraw 640x240 or 640x480_s tools for GPU test?
Capturing only 240 lines should take half the 4.699ms time and easily allow for 360fps.

I could try a reduced resolution, higher frame rate mode in the firmware. It's not a priority, though interesting things often get slotted into the time spent rebuilding kernels etc...

Hi there, first of all thank you very much for implementing this feature! This was something I wished for some months ago, and now I need it again, so it comes in very handy.

One question though. I'm working with a Raspberry Pi Zero W at the moment. The dt-blob source tells me the camera LED is GPIO40, and internal, so no need to reconfigure it, if I understood correctly.
However, I can't find GPIO40 anywhere in the schematics, neither on the camera connector nor on the Pi board itself. Can you tell me where to tap into it?