I'm using FMS 4.5 to record a live stream with H264 encoding. The broadcaster is Flash Player 11. The resulting MP4 file has video but no audio. I have spent a couple of days trying to figure this out, but it simply doesn't work properly. Could it be that the audio codecs available to the Microphone object in Flash Player 11 (Nellymoser and Speex) are not compatible with the MP4 container?

Here are some of my findings:

1) If I broadcast without H264 encoding, the resulting FLV file is correct and has both audio and video.

2) If I broadcast using H264 encoding, the resulting F4V/MP4 file has video but no audio.

3) If I record an FLV file with H264 encoding, the resulting file has audio but no video.

I understand that the FLV container doesn't allow H264 encoding. But why does the MP4 container end up with no audio after F4VPP post-processing?

Is there any alternative solution to this problem? I'd really appreciate some help with this one.

Could you please post (or send via private message) the code you're using for recording? Maybe we could help debug the issue. I checked on my end, and recording and playback of H264 video with Speex/Nellymoser audio in an MP4 container works just fine for me.

For the server side, I'm using the default main.asc that comes with livepkgr (except that I have changed f4f: to mp4:), and I'm publishing the stream to EVENTID?adbe-live-event=liveevent&adbe-record-mode=record.
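For reference, here's roughly the kind of client-side publishing code I'd expect on the broadcaster side (server URL, stream name, and profile/level are placeholders; adjust to your setup):

```actionscript
// Minimal Flash Player 11 publishing sketch: H264 video + Speex audio.
// All names/URLs here are placeholders, not a prescribed configuration.
import flash.net.NetConnection;
import flash.net.NetStream;
import flash.media.Camera;
import flash.media.Microphone;
import flash.media.SoundCodec;
import flash.media.H264VideoStreamSettings;
import flash.media.H264Profile;
import flash.media.H264Level;
import flash.events.NetStatusEvent;

var nc:NetConnection = new NetConnection();
nc.addEventListener(NetStatusEvent.NET_STATUS, onStatus);
nc.connect("rtmp://localhost/livepkgr");

function onStatus(e:NetStatusEvent):void {
    if (e.info.code == "NetConnection.Connect.Success") {
        var ns:NetStream = new NetStream(nc);

        // H264 video from the camera
        var h264:H264VideoStreamSettings = new H264VideoStreamSettings();
        h264.setProfileLevel(H264Profile.BASELINE, H264Level.LEVEL_3_1);
        ns.videoStreamSettings = h264;

        // Speex audio from the microphone
        var mic:Microphone = Microphone.getMicrophone();
        mic.codec = SoundCodec.SPEEX;

        ns.attachCamera(Camera.getCamera());
        ns.attachAudio(mic);
        ns.publish("EVENTID?adbe-live-event=liveevent&adbe-record-mode=record");
    }
}
```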

Can you tell me if you used the same settings to broadcast from Flash Player in your test? Did you play your MP4 back in a Flash player with or without f4vpp post-processing?

Am I playing it the wrong way? If there is a way to directly play those mp4/f4v files, it'll save me (and so many other people) a lot of confusion. Can you tell me how you were playing the generated file?

I'm sorry for the delay in replying. It took me some time to investigate the possibilities for you.

1. If you want to play back the live stream via HTTP, then you need to record the stream in F4F format.

2. If recording to an MP4 container is your priority, then I suggest RTMP playback instead of HTTP.

3. If you need both, i.e., storing your live content and playing back live via HTTP, then I see two possibilities:

a. Use MPP (multi-point publishing). Basically, publish the live stream in F4F to one application, and have that application publish the stream into an MP4 container in a second application. Although this leads to content duplication, you can play back the stream via HTTP from the first application and also have your MP4 stream in the second application.

b. Alternatively, you can use the DVR functionality. By default the live stream is stored on disk by the livepkgr; with DVR, the client can seek back and forth in the live stream and view it from the start. However, the F4F fragments will be saved, not an MP4 file.
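For option (a), a rough Server-Side ActionScript sketch of what the multi-point publish in main.asc could look like (the second application name "mp4recorder" is hypothetical; check the FMS Server-Side ActionScript reference against your version):

```actionscript
// Server-Side ActionScript (main.asc) sketch of multi-point publishing:
// republish the incoming live stream to a second application that
// records it into an MP4 container. Names here are placeholders.
application.onPublish = function (client, stream) {
    var nc = new NetConnection();

    nc.onStatus = function (info) {
        if (info.code == "NetConnection.Connect.Success") {
            var ns = new NetStream(nc);
            ns.attach(Stream.get(stream.name));          // pull the local live stream
            ns.publish("mp4:" + stream.name + ".mp4", "record");
        }
    };

    nc.connect("rtmp://localhost/mp4recorder");
};
```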

If you let me know what is the exact use-case I might be able to help you out better.

Thanks for the detailed response and investigation, Apurva. My use case is that I want to broadcast a live stream that is played live via HTTP streaming and then later played back from the generated MP4 file. So MPP is a great idea, and I'll definitely use it to generate an MP4 alongside the F4F live stream.

But my main question still holds.

1) Whether I use MP4 or F4F, can either of these be played in a Flash player directly, without post-processing?

2) I tried post-processing a .f4f segment using f4vpp. The video plays back correctly, but there is no audio. I read somewhere on the internet that this is perhaps a bug in the f4vpp tool?

Apurva, Thanks again for all the details in your response. It's very helpful.

It works fine for me too if I use FMLE, but not when I broadcast with Flash Player 11. FMLE supports the AAC and MP3 audio formats while broadcasting, but Flash Player 11, I believe, is limited to Nellymoser or Speex?
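For what it's worth, this is how the codec choice looks on the Flash Player side; as far as I know, only these two codecs are exposed through the Microphone API:

```actionscript
// Flash Player 11 Microphone codec selection -- only Nellymoser and Speex
// are available here (AAC/MP3 capture is an FMLE feature, not a Player one).
import flash.media.Microphone;
import flash.media.SoundCodec;

var mic:Microphone = Microphone.getMicrophone();

mic.codec = SoundCodec.SPEEX;   // or SoundCodec.NELLYMOSER
mic.encodeQuality = 6;          // Speex-only: encode quality 0-10
mic.rate = 44;                  // Nellymoser sample rate in kHz; Speex is fixed at 16 kHz
```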