I need to make a screen overlay on the composite video signal coming from a camera.

All graphics would be generated by the Pi.

But how do I make the overlay?

I can see two possibilities:

1/- Use a USB AV converter, add the overlay in software, and output composite video from the Pi board: this certainly consumes a lot of processing time and introduces delay (on the order of milliseconds, at least). The original signal must not be delayed (or as little as possible).

2/- Create the graphics on the Pi and synchronize (mix) its S-Video with the external video source (composite video): the original signal is not delayed at all.

The best solution is the second one. BUT how do I synchronize the Pi's S-Video with the camera's?
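To put a rough number on the delay in option 1: a software overlay path has to buffer at least one full frame in the capture device and hold at least one more while the overlay is drawn, so at PAL rates the added latency is at least two frame times. A minimal sketch of that budget (the buffer counts are assumptions, typical for a USB capture pipeline, not measured values):

```python
# Rough latency budget for the "USB capture + software overlay" path.
# Assumptions (typical, not measured): PAL 25 fps, one frame buffered by
# the USB capture device and one frame held while the overlay is drawn.
PAL_FPS = 25
FRAME_TIME_MS = 1000 / PAL_FPS          # 40 ms per PAL frame

usb_capture_frames = 1                   # frames buffered in the capture device
processing_frames = 1                    # frames held during overlay drawing
total_ms = (usb_capture_frames + processing_frames) * FRAME_TIME_MS

print(f"Frame time: {FRAME_TIME_MS:.0f} ms")
print(f"Minimum added delay: {total_ms:.0f} ms")
```

So even in the best case the delay is tens of milliseconds, not "about a millisecond" - which is why option 2 is attractive if the Pi's output can be synchronized.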

So, is there a way (software or hardware) to start a composite video frame whenever I want?

In fact, I only need to superimpose pixels (monochrome is okay; colour would be better) on the composite video signal coming from the camera.

680 pounds is impossible.

I know a few components would be enough (like a dsPIC), but that takes time to develop, although a trick may exist to make it as simple as setting a bit in the Broadcom processor to tell it when to start generating a video frame.

You won't be able to achieve this without purchasing some expensive video mixing hardware, unless you can do something inventive like projecting or reflecting an image onto a shiny CRT screen. I'm thinking of the head-up display that BMW uses.

All graphics (text, lines, circles, animation, and so on) would be made by the Raspberry Pi.

I need to mix those graphics with live video coming from the camera.

A composite video signal is made of frames (see Wikipedia):

- those made by the Raspberry Pi's video controller, which are output on the RCA video connector;

- those output by the video camera.

The problem is that the two systems (Raspberry Pi and camera) do not start generating their video signals at the same time. They are not synchronized, so mixing the two video voltages won't produce a correct image.

The two video signals need to be synchronized, that is, their video frames must start at the same time. Once they are, all graphics generated by the Raspberry Pi will be displayed over the live camera video (superimposed).

Or jam the Pi so that it starts each video frame in sync with the external video.

The first option requires (a) a method of detecting H & V syncs and (b) jamming the external source. (a) is relatively simple – a couple of level detectors on the composite output – though if the HDMI output is to be used, is it in sync with the composite? (b) may or may not be possible – probably not if it's a domestic camera!

The second merely requires an input to the GPU via a GPIO to tell it when to start a frame. I very much doubt that would be available without a lot of help from Broadcom, so kneeling in a corner facing towards Cambridge is the best bet.
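As a sketch of what part (a) could look like on the software side: assume a sync separator chip (an LM1881-type part is one hypothetical choice, not something the thread has settled on) feeds vertical sync pulses to a GPIO, and each edge is timestamped in an interrupt handler. From those timestamps you can measure the external source's field rate and its drift against the nominal 50 Hz. The computation itself needs no GPIO library:

```python
# Estimate the field period and drift of an external video source from
# vertical-sync edge timestamps. Assumption: the timestamps (in seconds)
# come from GPIO edge interrupts driven by an external sync separator --
# hypothetical hardware, not something built into the Pi.
NOMINAL_FIELD_HZ = 50.0  # PAL: 50 fields per second

def field_stats(timestamps):
    """Return (mean field period in s, drift in ppm vs nominal 50 Hz)."""
    periods = [b - a for a, b in zip(timestamps, timestamps[1:])]
    mean_period = sum(periods) / len(periods)
    nominal_period = 1.0 / NOMINAL_FIELD_HZ
    drift_ppm = (mean_period - nominal_period) / nominal_period * 1e6
    return mean_period, drift_ppm

# Example: a source running 100 ppm fast (its field period is slightly
# shorter than the nominal 20 ms).
ts = [i * 0.020 * (1 - 100e-6) for i in range(11)]
period, ppm = field_stats(ts)
```

Knowing the drift tells you how often the Pi's frame start would have to be nudged to stay locked - which is exactly the control input the GPU does not currently expose.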

It's not a question of possible or impossible. With an external genlock or video mixer, what you want to do is 100% possible using the Pi to generate video. Unfortunately, external professional quality genlock / video mixer systems cost significant amounts of money.

Another option has been pointed out to you: a video mixer shield for the Arduino. Whilst the Arduino is less powerful than the Pi, it has the benefit of being available now, is certainly capable of producing the sort of graphics you want, and is significantly easier to interface to the external world (which I assume, given the video you've posted, you also want to do). Not to mention that it is a plug-'n'-play, ready-to-go solution for doing exactly what you want, at a price that is more than reasonable.

'Domestic' vision mixers (with 2-4 inputs) have been available for several years, as well as more 'pro' versions which included framestore synchronisers and therefore coped with non-genlocked sources, allowing them to be mixed or wiped between, or keyed over, each other.

Separate framestore synchronisers can be used to add captions over video, and these often include the option to key the computer output over the live video: I still have a Vine Micros unit I used with BBC B / Archimedes machines, and there are alternative VGA-to-video genlocked converters.

[They were probably 'replaced', for domestic users, by the feature appearing in cameras]

(Synchronous sources are a requirement for most purposes, as non-synchronous signals always require matching audio and video delay to be added to bring them into sync... you might see the ball cross the goal line twice!)

In the BBC Computer Literacy series, the BBC Micros were genlocked to allow them to be included on screen, and of course their display was video-compatible 50 Hz (and the Teletext was interlaced!).

On BBC regional news you can see the use of frame-synchronising mixers when they show the traffic cameras: the London region has now learnt to use a 'cut-away' shot of a transport graphic in between live cameras to avoid the synchronising bump we used to get.

That gives you three choices, pretty much:

1 - get yourself an external video mixer

2 - go with the Arduino + video shield solution

3 - design your own video mixer similar to the video shield.

I don't want to buy an expensive mixer, otherwise I wouldn't have asked you.

I asked on the forum because I wanted to know if the Raspberry Pi could be synchronized with an external live video signal. I mean: able to start its odd/even fields whenever I want.

I can make an Arduino-like board; I already did, and it works fine. BUT the processing power I need is not available on that type of processor/µC! That's WHY I'm interested in the Raspberry Pi.

One solution is to use the Raspberry Pi to make the graphics and write "screen objects" to a buffer shared with a low-cost mixer (Arduino, Microchip, MAX7456, and so on) that mixes them with the live video.

THAT SOLUTION (which can be done) simply synchronizes those two video signals.

But this "hardware mixing" solution is more complicated than simply synchronizing the Raspberry Pi's video signal...
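For the MAX7456 route specifically, the Pi side reduces to writing character codes into the chip's display memory over SPI. A minimal sketch of the addressing (register addresses DMAH/DMAL/DMDI and the 30-character row width are taken from the MAX7456 datasheet; the SPI transfer itself, e.g. via spidev, is deliberately left out):

```python
# Compute the MAX7456 register writes that place one character on screen.
# Register addresses per the MAX7456 datasheet; the actual SPI transport
# (e.g. Python spidev on the Pi) is omitted from this sketch.
DMAH, DMAL, DMDI = 0x05, 0x06, 0x07   # address high, address low, data in
COLS = 30                              # characters per row on the MAX7456

def osd_char_writes(row, col, char_index):
    """Return (register, value) pairs that put char_index at (row, col)."""
    addr = row * COLS + col
    return [(DMAH, (addr >> 8) & 0x01),   # bit 8 of the display-memory address
            (DMAL, addr & 0xFF),          # low 8 bits of the address
            (DMDI, char_index)]           # character to display

# Example: put character 0x41 at row 2, column 5 (display address 65).
writes = osd_char_writes(2, 5, 0x41)
```

The MAX7456 does its own genlock to the incoming video, which is exactly why this route sidesteps the Pi synchronization problem: the Pi only ever talks to a frame buffer, never to the video timing.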

1/ Can it be genlocked sufficiently stably by receiving an external GPI (or two GPIs)?

(If not, then the frame-store method is required)

2/ Can the subcarrier burst (and therefore chroma modulation) EITHER be turned off (allowing grey-scale keying with monochrome timing requirements) OR can the subcarrier be phase-locked to within the required few degrees of subcarrier phase (at 4.43 MHz) in order to mix with the colour contained in the signal?

You say that you do not want to buy 'an expensive mixer', but that is simply the cost required to do the job properly. Whilst you may perceive the 'colour' of video to be low resolution, mixing it in the encoded domain requires timing accuracy of a few nanoseconds...

Answer to (1)? Assuming you have created the circuitry to detect vertical and horizontal sync and generate the interrupts:

The GPIs might be able to lock vertical sync, and then horizontal, to sufficient accuracy for monochrome timing, to (mono-)genlock the Pi GPU output... but we are still awaiting public information on the video capabilities in this area – only crude 'TV standard' and active-area ('overscan') blanking options are known in the config file. [I don't think this is an area where 'computer' people have the same interest/priority as 'video' people – they were not planning on designing a TV station.]

Answer to (2)? – Either Monochrome or Colour-locked Output Capability?

However, I foresee NO PROSPECT of being able to control the phase of the generated subcarrier to within 2-3 ns**: this is not being sold as a 'video' product. Also, as a 'caption generator' or slide/movie replay source, the timing delay caused by going through an external frame delay for synchronisation is not a problem – and there are plenty of these devices available – e.g. see the Datavideo range for a complete range of solutions! [**Hence also the desire to be able to switch off colour encoding – for improved mono displays, especially on small portable screens.]

PAL colour coding takes the baseband chroma information and inserts it onto a carrier at 4.43 MHz (PAL50), and since the PHASE conveys the colour information, the two signals must be mixed synchronously to avoid loss of colour... and that means within 2-3 ns (the whole 360 degrees of colour phase fit in one cycle of about 225 ns). This requires dedicated timing circuitry which is unlikely to be 'available' in the Pi – and we do not know the accuracy of the subcarrier being generated either! (Some Japanese computers and even cameras omitted the 25 Hz offset many years ago.)
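To make those numbers concrete, here is the arithmetic behind the 2-3 ns figure, using the nominal PAL subcarrier frequency of 4.43361875 MHz:

```python
# Phase error of the PAL colour subcarrier as a function of timing error.
F_SC = 4.43361875e6                    # nominal PAL subcarrier frequency, Hz
cycle_ns = 1e9 / F_SC                  # one subcarrier cycle in ns (~225.5 ns)
deg_per_ns = 360.0 / cycle_ns          # degrees of chroma phase per ns (~1.6)

# A 2-3 ns timing error therefore shifts the hue by roughly 3-5 degrees,
# which is about the limit before colour errors become visible:
err_deg_2ns = 2 * deg_per_ns
err_deg_3ns = 3 * deg_per_ns
```

Compare that with monochrome mixing, where timing errors of a whole microsecond merely shift the picture sideways a little - this is why dropping the colour requirement makes the problem so much easier.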

THEREFORE the only option open to you is going to be the 'external framestore synchroniser' approach, as others before you have taken. Hopefully you can now understand why this is the case.

I don't want to buy an expensive mixer, otherwise I wouldn't have asked you.

But unfortunately for you, the technical requirements are clear; the framestore solution will be your most cost-effective option – whether it is incorporated within the mixer or externally genlocking its inputs.

'Expensive' is simply a subjective valuation – some may consider all video to be expensive.

And referring back to your first post: there is no S-Video output. It is HDMI OR composite.

I'm not sure of its exact capabilities, and you would likely have to hack the Chumby netTV further to take in the data needed for the overlay and put it where it is required.

Even in its basic form of 'a scrolling overlay at the bottom of the screen', it could be of interest to people who aren't designing media streamers but would like to put alerts on the TV over what they are watching.