overriding ModuleBase streaming

Hello, I've been having a hard time solving this, so I decided to post here.

What I'm trying to do is stream H.264+AAC content with the addVideoData/addAudioData functions.

So far I have implemented an IServerNotify2 class that creates a publisher and streams the data correctly, adding the video and audio frames through the publisher.
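For context, the IServerNotify2 approach I have working looks roughly like the sketch below. Treat it as pseudocode: the class and method names (Publisher.createInstance, publish, addVideoData/addAudioData, unpublish) are the ones from the Publisher examples I worked from, and the exact signatures may differ between Wowza versions.

```
// inside IServerNotify2.onServerInit(IServer server) -- pseudocode sketch
IVHost vhost = VHostSingleton.getInstance(VHost.VHOST_DEFAULT);
Publisher publisher = Publisher.createInstance(vhost, "live", "_definst_");
publisher.publish("mystream", "live");

// then, from my own feeder thread, for every encoded frame:
// (each buffer must start with the FLV-style tag header bytes the Publisher API expects)
publisher.addVideoData(videoFrame, videoFrame.length, timecode);
publisher.addAudioData(audioFrame, audioFrame.length, timecode);

// on shutdown:
publisher.unpublish();
```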

This works perfectly, but now I'm trying to do the same on a per-client basis, in a ModuleBase class, as I assume I should.

In the play function I manage to take the current stream and add the video/audio data to it, apparently correctly (meaning that if I grab the current stream from another part of the code and use the 'createSnapshot' sample function, I get back perfectly healthy frames).

Now, before anyone asks: the problem isn't related to the actual video content/headers I pass in, as the same functions are used in the publisher, and that works fine. The only difference I see is that in the publisher case I have a clientless stream that I create at startup and feed frames through the publisher, whereas here I'm trying to add frames directly into a client stream I get on 'play'.

From what I understand, the cuepoint example inserts data on the client's request. I'm trying to override the normal streaming i.e. send my own frames in the stream instead of the requested content.

What I wrote above adds the frames to the stream, but it blocks the rest of the circuit until it's done, so the player (LivePlayer, PlaylistPlayer, etc.) associated with the stream never sends the actual frames to the client. In comparison, in a normal module 'play' gets called, does some work, and finishes; afterwards a player in the background handles sending the content to the client. I think I need either to implement a callback that gets called just before the frames are sent to the client and override it there, or to tell the stream to use my own receiver instead of, for example, the LiveReceiver class.

I tried the onPlay handler in IMediaStreamActionNotify2; same thing, it blocks the stream until I finish.
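One way around the blocking, independent of Wowza's classes: return from play/onPlay immediately and push the frames from a separate timer thread. Here's a minimal pure-Java sketch of that pattern; the FrameSink interface and nextFrame() are hypothetical placeholders standing in for the stream's addVideoData and for whatever produces my encoded frames.

```java
import java.util.concurrent.*;

public class FramePusher {
    // placeholder for the stream object's addVideoData(byte[], int, long)
    public interface FrameSink { void addVideoData(byte[] data, int size, long timecode); }

    private final ScheduledExecutorService timer = Executors.newSingleThreadScheduledExecutor();

    // called from play/onPlay: schedules the pushing and returns immediately,
    // so the handler itself never blocks the stream
    public ScheduledFuture<?> start(FrameSink sink, long frameIntervalMs) {
        final long[] timecode = {0};
        return timer.scheduleAtFixedRate(() -> {
            byte[] frame = nextFrame();  // hypothetical: fetch the next encoded AVC frame
            sink.addVideoData(frame, frame.length, timecode[0]);
            timecode[0] += frameIntervalMs;
        }, 0, frameIntervalMs, TimeUnit.MILLISECONDS);
    }

    // call from onStop/onDisconnect to stop feeding this client's stream
    public void stop() { timer.shutdownNow(); }

    private byte[] nextFrame() { return new byte[0]; }  // placeholder frame source
}
```

The same pattern works for audio; the point is only that the handler returns at once and the timer thread owns the timecode pacing.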

About the cuepoint example: how can I do it without having the player ask me for every packet?

Reiterating what I've been trying to say: how can I simply add my own video/audio frames to the stream from a ModuleBase class, instead of letting it play whatever content the client asked for?

Simplest example: a client (let's assume, for simplicity's sake, that the live.html example page is used) connects to my server asking for the 'test.mp4' stream. Normally the Wowza server starts sending the requested content to the client and, in the meantime, calls my module's functions on big events (onAppStart, onConnect, play, etc.) so I can gather statistics and other things.

What I want to do is have the client connect exactly as he usually does, but instead of letting the Wowza server give him the requested content, I want to take over and send the video/audio content myself once my module gets called on play (using something simple like addVideoData/addAudioData on the client's stream).

This works in my tests, but what worries me is whether it's an accepted usage, whether it has side effects I don't see at the moment, and whether the API's behavior will change with Wowza upgrades and patches, since this isn't a documented, publicly accepted usage of the Publisher API.

#2. The other solution I could find is to implement my own MediaReader class, add it to the 'MediaReaders.xml' file, and inside my reader class take care of generating my own frames instead of reading them from a file, like MediaReaderH264 does. The problem here is that I don't have any examples for the MediaReader like I had for the Publisher API.

What I need to know is whether I can rely on the Publisher API to work as I described above (a publisher created for each connected client, and stopped at disconnect), or whether I should write my own MediaReader (for which I would really need an example from you guys).


Hi Excelle,

I am having a similar problem to the one you had several months ago: adding a few video frames before a live stream. I see that you got it working using the Publisher API. I tried doing it the way you did, but I am not sure what you used for the Publisher API. Is it the Publisher class (Wowza server-side API doc, page #1289) you are referring to? Can you give any hints as to what you used within FramePublisher above?

I saw the midroll example, and it's not helping. I don't want to send metadata; I want to send the actual video and audio frames. For example, the Publisher API specifies exactly what header bytes it wants for video/audio frames.
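For anyone else hitting this: the header bytes in question follow the FLV tag format, so they can be computed independently of Wowza. Below is a small self-contained Java helper building them per the FLV spec (frame type and codec id packed into the first video byte, AVCPacketType and 24-bit composition time after it; for audio, the AAC format/rate/size/channel byte followed by AACPacketType). The exact bytes the Publisher API expects should still be checked against the Wowza docs for your version; this just shows where the values come from.

```java
public class FlvTagHeaders {
    // FLV VIDEODATA first byte: frame type (1=keyframe, 2=interframe) in the high
    // nibble, codec id (7=AVC/H.264) in the low nibble
    public static byte videoTagByte(boolean keyframe) {
        return (byte) (((keyframe ? 1 : 2) << 4) | 7);
    }

    // FLV AUDIODATA first byte: format 10=AAC, rate 3=44kHz, size 1=16-bit,
    // type 1=stereo -> 0xAF (for AAC these fields are largely informational)
    public static byte audioTagByte() {
        return (byte) ((10 << 4) | (3 << 2) | (1 << 1) | 1);
    }

    // 5-byte prefix for an AVC payload: tag byte, AVCPacketType
    // (0=sequence header, 1=NALU), then 24-bit composition time offset
    public static byte[] videoHeader(boolean keyframe, boolean seqHeader, int cts) {
        return new byte[] {
            videoTagByte(keyframe),
            (byte) (seqHeader ? 0 : 1),
            (byte) ((cts >> 16) & 0xFF), (byte) ((cts >> 8) & 0xFF), (byte) (cts & 0xFF)
        };
    }

    public static void main(String[] args) {
        System.out.printf("AVC keyframe byte: 0x%02X%n", videoTagByte(true) & 0xFF);  // 0x17
        System.out.printf("AAC byte:          0x%02X%n", audioTagByte() & 0xFF);      // 0xAF
    }
}
```

So a keyframe buffer handed to addVideoData would start with 0x17 0x01 plus the composition time bytes, then the AVC data, and an AAC raw frame handed to addAudioData would start with 0xAF 0x01.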

And you completely overlooked the other part of my post above, regarding the usage of the Publisher API: the examples I've seen create a publisher at server start-up in an IServerNotify class, not at client play time in a ModuleBase class. I was asking whether it's intended behavior to allow me to create a publisher for each and every client that comes along.