Maybe a better question would be -- WHY do you want to do that?
And where do you plan to get a digitizer that CAN do that?

If you actually have such a thing, then black-and-white rendering would be just a few lines of code. NTSC color would require an FFT, and I don't think you could get real time out of that.
Finding color-capable source code would be most unlikely.

Normal video capture cards use a PLL to extract the color, then digitize the result at somewhere between 2 and 4 MHz on three channels.

14.31818 MHz is very slow by today's ADC standards. Many ADCs are running at or beyond a gigahertz. 14.31818 MHz is, in fact, a standard sampling rate for NTSC, giving about 4 points per cycle on the color burst, enough to extract reasonably accurate chroma.

An FFT is no problem. I would venture that I could easily get real-time performance on a GHz+ PC.

I was only wondering if source already existed -- if it does not, I have no problem writing my own.

I ended up writing the code myself in C, based largely on information from the book Video Demystified. It ran very competently on a GHz+ PC, but I ended up reimplementing it on an FPGA, because it became very difficult to get >10 MB/s of data into a PC in the first place. Don't believe for a second that 480 Mbit/s USB 2.0 can really achieve 480 Mbit/s into RAM.

I regret that I cannot give you the source code, as my company owns it, but most of what you'd need is in the Video Demystified book. Note that chroma decoding requires only a digital PLL and a mixer -- no FFT is required.
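To make the "mixer, no FFT" point concrete, here is a minimal sketch of the mixing step. It assumes the composite signal is sampled at 4x the subcarrier with the sample clock already phase-locked to the burst (that is the PLL's job), and that luma has already been separated out of the chroma band. At 4x fsc the reference cosine and sine take only the values 1, 0, -1, 0, so the mixer degenerates to sign flips and sample selection. The function name and interface are mine for illustration, not from the book.

```c
#include <stddef.h>

/* Demodulate chroma from bandpassed composite samples taken at 4x the
 * NTSC subcarrier (14.31818 MHz), assuming the sample clock is locked
 * to the color burst.  At 4x fsc, cos(2*pi*fsc*t) at the sample points
 * is the sequence 1, 0, -1, 0 and sin is 0, 1, 0, -1, so "mixing" is
 * just sign flips.  A real decoder would low-pass the mixer products
 * over many cycles; here we simply average within one cycle, which
 * recovers the subcarrier amplitude on each axis per cycle. */
static void demod_chroma_4x(const double *samples, size_t n,
                            double *i_out, double *q_out)
{
    /* n should be a multiple of 4; one I/Q pair per subcarrier cycle */
    for (size_t c = 0; c + 4 <= n; c += 4) {
        /* multiply by cos: +s0, 0, -s2, 0 ; by sin: 0, +s1, 0, -s3 */
        i_out[c / 4] = (samples[c]     - samples[c + 2]) / 2.0;
        q_out[c / 4] = (samples[c + 1] - samples[c + 3]) / 2.0;
    }
}
```

In a full decoder you would comb- or band-filter to split luma from chroma first, and low-pass the I/Q outputs to the ~1.3 MHz chroma bandwidth before converting to R-Y/B-Y.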

Thanks for replying to an old thread. What I was originally looking for was a single-frame video decoder -- no continuous high-speed on-the-fly execution required. I have software in place now that reconstructs an interlaced video frame from a simple digitizer and creates a black-and-white bitmap image (luma only). I know only the basic concepts involved in NTSC and have found numerous places on the net discussing quadrature and chroma. I have actually seen the Video Demystified book, but most of it was over my head -- being mostly a software guy.

I respect your wish not to release the source code (DARN!), but if you could hint at some kind of functional procedure, it would be greatly appreciated. If not, that is respected also.

If you have the Video Demystified book, everything you could possibly need to understand the NTSC signal is in there. The appropriate digital decoding circuitry (which you could emulate in software) is also right there, laid out in block diagrams.

What are the requirements of your application? Do you need to decode color (chroma)? A black-and-white decoder is probably just a couple days' worth of work, if you understand the signal thoroughly. A color decoder is, in my estimation, about five times as difficult.

As I said, I was successful in deriving a black-and-white image from the video signal and have a good understanding of all the H/V line components and their timing and function. My next task is to add color to the image. I am familiar with implementing digital filters in the frequency domain via FFTs and iFFTs -- I was just looking for some insight from someone who has been down this road before. For example, how should the color burst be processed, and how stringent is the need to accurately locate the first sample of the burst and/or the active region of the line? Mainly just practical problems that I should be aware of. I guess I will try to hunt down a copy of the book.

All you really need to do, in the software domain, is fit a sine wave to the color burst. You can use a least-mean-square technique for this. The first sample of the burst is not significant -- you just want to minimize the phase difference between your synthesized sine wave and the color burst. Most implementations discard a portion of the beginning and end of the color burst anyway.