Animating textures in WebGL

In this demonstration, we build upon the previous example by replacing our static textures with the frames of an MP4 video file as it plays. This is actually pretty easy to do and fun to watch, so let's get started. Similar code can be used with any other source of frame data, such as a <canvas> element, as the source for your textures.

Getting access to the video

The first step is to create the <video> element that we'll use to retrieve the video frames:
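A sketch of that setup, assuming the setupVideo() function and copyVideo flag named later in this article (the exact attributes and listener wiring here are an illustration of the description that follows, not necessarily the original code):

```javascript
// Will be flipped to true once it's safe to copy the video into a texture.
let copyVideo = false;

function setupVideo(url) {
  const video = document.createElement("video");

  let playing = false;
  let timeupdate = false;

  video.playsInline = true;
  video.muted = true; // muting is generally required for autoplay
  video.loop = true;

  // Waiting for both events guarantees there is frame data to upload.
  video.addEventListener(
    "playing",
    () => {
      playing = true;
      checkReady();
    },
    true,
  );

  video.addEventListener(
    "timeupdate",
    () => {
      timeupdate = true;
      checkReady();
    },
    true,
  );

  // Setting src and calling play() starts loading and playback.
  video.src = url;
  video.play();

  function checkReady() {
    if (playing && timeupdate) {
      copyVideo = true;
    }
  }

  return video;
}
```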

First we create a video element. We set it to autoplay, mute the sound, and loop the video. We then listen for two events, one confirming that the video is playing and one confirming that its time has been updated. We need both of these checks because uploading a video to WebGL before it has any data available produces an error; seeing both events guarantees that data is available and that it's safe to start uploading video to a WebGL texture. Once both events have fired, we set a global variable, copyVideo, to true to indicate that it's safe to start copying the video to a texture.

And finally, we set the src attribute and call play() to start loading and playing the video.

Using the video frames as a texture

The next change is to initTexture(), which becomes much simpler, since it no longer needs to load an image file. Instead, all it does is create an empty texture object, put a single pixel in it, and set its filtering for later use:
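A minimal version of initTexture() might look like the following sketch. The single opaque blue placeholder pixel and the CLAMP_TO_EDGE/LINEAR parameters follow the pattern of the previous example's texture setup; treat the specific values as assumptions:

```javascript
function initTexture(gl) {
  const texture = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, texture);

  // Put a single opaque blue pixel in the texture so it can be used
  // immediately, before the first video frame has been copied in.
  const level = 0;
  const internalFormat = gl.RGBA;
  const width = 1;
  const height = 1;
  const border = 0;
  const srcFormat = gl.RGBA;
  const srcType = gl.UNSIGNED_BYTE;
  const pixel = new Uint8Array([0, 0, 255, 255]); // opaque blue
  gl.texImage2D(
    gl.TEXTURE_2D, level, internalFormat,
    width, height, border, srcFormat, srcType, pixel,
  );

  // Turn off mips and clamp wrapping so the texture works regardless
  // of the video's dimensions, which may not be powers of two.
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);

  return texture;
}
```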

The per-frame copy is handled by updateTexture(), which is nearly identical to the image onload handler in the previous example, except that when we call texImage2D() we pass in the <video> element instead of an Image object. WebGL knows how to pull the current frame out of it and use it as a texture.
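That per-frame copy can be sketched as follows (the parameter names are assumptions):

```javascript
function updateTexture(gl, texture, video) {
  const level = 0;
  const internalFormat = gl.RGBA;
  const srcFormat = gl.RGBA;
  const srcType = gl.UNSIGNED_BYTE;
  gl.bindTexture(gl.TEXTURE_2D, texture);
  // Passing the <video> element tells WebGL to copy its current frame.
  gl.texImage2D(gl.TEXTURE_2D, level, internalFormat, srcFormat, srcType, video);
}
```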

Then in main(), in place of the call to loadTexture() in the previous example, we call initTexture() followed by setupVideo().
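That change in main() might be sketched like this; the video file name is a placeholder, and returning the two objects is only for illustration:

```javascript
// Sketch of the changed part of main(); the rest of the setup
// (shaders, buffers) is unchanged from the previous example.
function main(gl) {
  // Create the empty placeholder texture and start the video loading.
  const texture = initTexture(gl);
  const video = setupVideo("example.mp4"); // placeholder file name

  return { texture, video };
}
```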

In the definition of render(), if copyVideo is true, we call updateTexture() each time, just before calling drawScene().
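Put together, the render loop might look like this sketch; drawScene()'s exact parameters depend on the previous example, so the names here are assumptions:

```javascript
// Sketch of the per-frame loop: refresh the texture from the video
// (once copyVideo is true) just before drawing the scene.
function render() {
  if (copyVideo) {
    updateTexture(gl, texture, video);
  }
  drawScene(gl, programInfo, buffers, texture);
  requestAnimationFrame(render); // keep the loop going
}
```

As in the previous example, the loop would be kicked off once with requestAnimationFrame(render).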