I've played around with grabbing a motion-JPEG stream from a Canon IP camera (with a small C++ program), running it through ffmpeg to convert it to FLV, and sending it off to an ffserver. The webcam feed is then viewed through an embedded Flash player.
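The pipeline might look roughly like this (a sketch only; the `capture` program name, camera URL, feed name, and ffserver port are all made up for illustration):

```shell
# Sketch of the setup described above; names and URLs are illustrative.
# The capture program writes the raw motion-JPEG stream to stdout;
# ffmpeg parses it from stdin and forwards the transcoded video to an
# ffserver feed, which then serves it to the embedded Flash player.
./capture http://camera.example/mjpeg \
  | ffmpeg -f mjpeg -i - http://localhost:8090/feed1.ffm
```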

It tends to hang up after a while. I suspect ffserver is the culprit.

An MPEG-4 source might fare a bit better: the MPEG-4 stream needs much less bandwidth than motion JPEG. Assuming you're presenting it at 320x240 and the source is 640x480, the conversion should cause little if any degradation.
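The downscale can happen in the same ffmpeg invocation, e.g. with the `-s` output option (a sketch; the feed URL is illustrative):

```shell
# Illustrative: transcode the 640x480 motion-JPEG input and scale it
# down to 320x240 on the way to the ffserver feed.
ffmpeg -f mjpeg -i - -s 320x240 http://localhost:8090/feed1.ffm
```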

The upcoming MPEG-4 support in FLV (not there yet, but in beta, I think?) will make this a natural combination. You'll still have to do some remuxing to get the MPEG-4 stream into the FLV container, but that carries little CPU overhead compared to transcoding.

Was ffmpeg transcoding "live" to "live", or using some intermediate file?

"live" to "live". Piped from my Ruby (sorry, I said it was C++ earlier) capture program to ffmpeg. ffmpeg has built-in support to send output to ffserver.

Note, however, that ffserver itself does use a temporary file for buffering. You have control over where the file lives through ffserver.conf, though, so you could arrange to store it in a tmpfs (RAM) filesystem.
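For example, the feed's buffer file can be pointed at /dev/shm (commonly a tmpfs mount on Linux) in ffserver.conf; the feed name and size limit here are illustrative:

```
# Illustrative ffserver.conf fragment. The File directive sets where the
# feed's buffer file lives; /dev/shm keeps it in RAM.
<Feed feed1.ffm>
  File /dev/shm/feed1.ffm
  FileMaxSize 20M
</Feed>
```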

Of course, this was on Linux; it won't work on Windows due to the lack of real piping. The "-i -" argument tells ffmpeg to take its input stream from STDIN.
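For instance (a sketch, with made-up filenames), the same "-i -" works with anything that writes the stream to stdout; "-f mjpeg" tells ffmpeg how to parse STDIN, since there's no filename extension to go by:

```shell
# Illustrative: read a captured motion-JPEG stream from stdin and write
# an FLV file. "-f mjpeg" names the input format; "-i -" means stdin.
cat capture.mjpeg | ffmpeg -f mjpeg -i - -f flv out.flv
```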