3D Programming for the Web

WebGL Lesson 16 – rendering to textures

Welcome to number sixteen in my series of WebGL tutorials! In it, we’ll get started with an extremely useful technique: rendering a 3D scene to a texture, which we can later use as an input when rendering a different scene. This is a neat trick not just because it makes it possible to have scenes within scenes, as in the demo page for this tutorial, but also because it is the foundation required for adding picking (selection of 3D objects with the mouse), shadows, reflections, and many other 3D effects.

Here’s what the lesson looks like when run on a browser that supports WebGL:

Click here and you’ll see the live WebGL version, if you’ve got a browser that supports it; here’s how to get one if you don’t. You’ll see a model of a white laptop, with all of the various lighting effects you’ll have seen in the previous lessons (including a specular gleam on its screen). But, more interestingly, on the screen of the laptop you’ll see another 3D scene being displayed — the orbiting moon and crate that made up the demo for lesson 13. I’m sure it’s clear that what’s happening in this page is that we’re rendering the scene from lesson 13 to a texture, and then using that texture on the screen of the laptop.

So, how does it work? Read on to find out.

Before we wade into the code, the usual warning: these lessons are targeted at people with a reasonable amount of programming knowledge, but no real experience in 3D graphics; the aim is to get you up and running, with a good understanding of what’s going on, so that you can start producing your own 3D Web pages as quickly as possible. If you haven’t read the previous tutorials already, you should probably do so before reading this one — here I will only explain the new stuff. The lesson is based on lessons 13 and 14, so you should make sure that you understand those.

There may be bugs and misconceptions in this tutorial. However, thanks to the kind help of Marco Di Benedetto, the creator of SpiderGL, and Paul Brunt of GLGE fame, and a legion of testers, particularly Stephen White, this tutorial’s more correct than it would otherwise have been. Of course, any remaining errors are entirely my own fault, so please don’t hesitate to let me know what I got wrong.

There are two ways you can get the code for this example: just “View Source” while you’re looking at the live version, or, if you use GitHub, you can clone it (and the other lessons) from the repository there.

Once you have a copy of the code, load up index.html in a text editor and have a look. This tutorial’s file has quite a few changes from previous lessons, so let’s start at the bottom and work our way up. Firstly, webGLStart; as usual, I’ve flagged the new stuff:
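Reconstructed from the description that follows, webGLStart looks something like this (a sketch: the canvas id and helper names follow the pattern of the earlier lessons, and the genuinely new line is flagged with a comment):

```javascript
function webGLStart() {
    var canvas = document.getElementById("lesson16-canvas");
    initGL(canvas);
    initTextureFramebuffer();   // NEW: create the framebuffer we'll render the texture into
    initShaders();
    initBuffers();
    initTextures();             // the moon and crate textures
    loadLaptop();               // kick off the JSON load, as with lesson 14's teapot

    gl.clearColor(0.0, 0.0, 0.0, 1.0);
    gl.enable(gl.DEPTH_TEST);

    tick();
}
```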

So, we’re doing our usual setup, initialising WebGL, loading our shaders, creating buffers of vertices to draw, loading the textures we’ll use (the moon and the crate), and kicking off a request to load the JSON model of the laptop, just like we did to load the teapot model in lesson 14. The exciting new bit is that we’re creating a framebuffer for the texture. Before I show you the code that does this, let’s look at what a framebuffer is.

When you render something with WebGL, you obviously need some kind of memory on the graphics card to receive the results of the rendering. You have really fine-grained control over what kind of memory is allocated for this. You need, at the very least, space to store the colours of the various pixels that make up the results of your rendering; it’s also pretty important (though occasionally not essential) to have a depth buffer, so that your rendering can take account of the way that nearby objects in the scene hide more distant ones (as discussed in lesson 8), so that needs a bit of memory too. And there are other kinds of buffers that can also be useful, like a stencil buffer — which is something we’ll take a look at in a future lesson.

A framebuffer is a thing to which you can render a scene, and it’s made up of these various bits of memory. There’s a “default” framebuffer, which is the one we’ve always been rendering to in the past, and is displayed in the web page — but you can create your own framebuffers and render to them instead. In this tutorial, we’ll create a framebuffer and we’ll tell it to use a texture as the bit of memory where it should store the colours when it’s rendering; we’ll also have to allocate it a bit of memory to use for its depth calculations.

So, having explained all that, let’s take a look at some code that does it all. The function is initTextureFramebuffer, and it’s about a third of the way from the top of the file.

Before the function starts, we define some global variables: one to store the framebuffer to which we’re going to render the stuff that is to go on the laptop’s screen, and one to store the texture that will hold the result of rendering to that framebuffer (which we’ll need to access when we’re drawing the laptop itself). On to the function:
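Here’s the whole function in one piece, sketched from the walkthrough that follows (rttFramebuffer, rttTexture, the 512×512 size and the 16-bit depth format come from the text; the remaining details are my reconstruction):

```javascript
var rttFramebuffer;   // the render-to-texture framebuffer
var rttTexture;       // the texture that receives the rendered scene

function initTextureFramebuffer() {
    rttFramebuffer = gl.createFramebuffer();
    gl.bindFramebuffer(gl.FRAMEBUFFER, rttFramebuffer);
    // Not WebGL state -- just JavaScript properties we attach so the
    // size is handy later on.
    rttFramebuffer.width = 512;
    rttFramebuffer.height = 512;

    rttTexture = gl.createTexture();
    gl.bindTexture(gl.TEXTURE_2D, rttTexture);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR_MIPMAP_NEAREST);

    // No image to copy in: passing null just allocates empty 512x512 RGBA space.
    gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, rttFramebuffer.width, rttFramebuffer.height,
                  0, gl.RGBA, gl.UNSIGNED_BYTE, null);
    gl.generateMipmap(gl.TEXTURE_2D);

    // A renderbuffer to hold 16-bit depth values for the same area.
    var renderbuffer = gl.createRenderbuffer();
    gl.bindRenderbuffer(gl.RENDERBUFFER, renderbuffer);
    gl.renderbufferStorage(gl.RENDERBUFFER, gl.DEPTH_COMPONENT16,
                           rttFramebuffer.width, rttFramebuffer.height);

    // Attach the texture as the colour buffer and the renderbuffer as
    // the depth buffer of our framebuffer.
    gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, rttTexture, 0);
    gl.framebufferRenderbuffer(gl.FRAMEBUFFER, gl.DEPTH_ATTACHMENT, gl.RENDERBUFFER, renderbuffer);

    // Tidy up: back to the defaults.
    gl.bindTexture(gl.TEXTURE_2D, null);
    gl.bindRenderbuffer(gl.RENDERBUFFER, null);
    gl.bindFramebuffer(gl.FRAMEBUFFER, null);
}
```

The paragraphs that follow walk through it a few lines at a time.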

Our first step is to create the framebuffer itself, and, following the normal pattern (as with textures, vertex attribute buffers, and so on) we make it our current one — that is, the one that subsequent function calls will operate on. We also store away the width and height of the scene we’re going to be rendering to it; these attributes aren’t normally part of a framebuffer — I’ve just used the normal JavaScript trick of attaching them as new properties, because they’ll be needed later on when we’re doing stuff with the framebuffer. I’ve picked 512×512 pixels as the size; remember, textures that are mipmapped (as ours will be) must have power-of-two widths and heights, and I found that 256×256 was too blocky, while 1024×1024 didn’t make things noticeably better.

Next, we create a texture object and set up the same parameters as usual.

Normally when we’re creating textures to show images that we’ve loaded into JavaScript, we call gl.texImage2D to bind the two together. Now, of course, there’s no loaded image; what we need to do is call a different version of gl.texImage2D, telling it that we don’t have any image data and we’d just like it to allocate a particular amount of empty space on the graphics card for our texture. Strictly speaking, the last parameter to the function is an array which is to be copied into the freshly-allocated memory as a starting point, and by specifying null we’re telling it that we don’t have anything to copy. (Early versions of Minefield required you to pass an appropriately-sized empty array in for this, but that seems to have been fixed now.)

OK, so we now have an empty texture which can store the colour values for our rendered scene. Next, we create a depth buffer to store the depth information.

What we’ve done here is create a renderbuffer object; this is a generic kind of object that stores some lump of memory that we’re intending to associate with a framebuffer. We bind it — just as with textures, framebuffers, and everything else, WebGL has a current renderbuffer — and then call gl.renderbufferStorage to tell WebGL that the currently-bound renderbuffer needs enough storage for 16-bit depth values across a buffer with the given width and height.

We attach everything to the current framebuffer (remember, we bound our new one to be the current one just after creating it at the top of the function). We tell it that the framebuffer’s space for rendering colours (gl.COLOR_ATTACHMENT0) is our texture, and that the memory it should use for depth information (gl.DEPTH_ATTACHMENT) is the depth buffer we just created.

Now we have all of the memory set up for our framebuffer; WebGL knows what to render to when we’re using it. So now, we tidy up, setting the current texture, renderbuffer, and framebuffer back to their defaults.

And with that, we’re done; our framebuffer is properly set up. So now that we’ve got it, how do we use it? The place to start looking is drawScene, near the bottom of the file. Right at the start of the function, before the normal code to set the viewport and clear the canvas, you’ll see something new:
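Here’s a sketch of those opening lines:

```javascript
function drawScene() {
    // Switch to the render-to-texture framebuffer, draw the lesson 13
    // scene into it, then switch back to the default (canvas) framebuffer.
    gl.bindFramebuffer(gl.FRAMEBUFFER, rttFramebuffer);
    drawSceneOnLaptopScreen();
    gl.bindFramebuffer(gl.FRAMEBUFFER, null);

    // ...the usual viewport, clear and perspective code, and then the
    // laptop drawing, follow here...
}
```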

In the light of the description above, it should be pretty obvious what’s happening here: we switch away from the default framebuffer, which renders to the canvas in the HTML page, to the render-to-texture framebuffer that we created in initTextureFramebuffer; we call a function called drawSceneOnLaptopScreen to render the scene that we want displayed on the laptop’s screen (implicitly rendering it to the RTT framebuffer); and when that’s done, we switch back to the default framebuffer. Before moving on with drawScene, it’s worth taking a look at the drawSceneOnLaptopScreen function. I won’t copy it in here, because it’s really very simple: it’s just a stripped-down version of the drawScene function from lesson 13! This works because our rendering code has never made any assumptions about where it’s rendering to; it has always just rendered to the current framebuffer. The only changes made for this lesson were the simplifications made possible by removing the movable light source and the other things lesson 13 had that aren’t necessary for this tutorial.

So, once those first three lines of drawScene have been executed, we have a frame from lesson 13 rendered to a texture. The remainder of drawScene simply draws the laptop, and uses this texture for its screen. We start off with some normal code to set up the model-view matrix and to rotate the laptop by an amount determined by laptopAngle (which, as in the other tutorials, is updated in an animate function that’s called every time we draw the scene, to keep the laptop rotating):
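That code might look something like this (a sketch: the exact translation and rotation values are illustrative; mat4 is the glMatrix API and degToRad the helper used throughout these lessons):

```javascript
// Continuing drawScene -- the usual model-view setup.
mat4.identity(mvMatrix);
mat4.translate(mvMatrix, [0, -0.4, -2.2]);                 // illustrative position
mat4.rotate(mvMatrix, degToRad(laptopAngle), [0, 1, 0]);   // spin around the Y axis
```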

Next, we pass the graphics card information about the lighting-related parameters of the laptop’s body, which is the first thing we’re going to draw. There’s something new here that’s not directly related to rendering to textures. You may remember that way back in lesson 7, when I described the Phong lighting model, I mentioned that materials had different colours for each kind of light — an ambient colour, a diffuse colour, and a specular colour. At that time, and in all of the lessons since, we’ve been making the simplifying assumption that these colours were always white, or the colour of the texture, depending on whether textures were switched off or on. For reasons we’ll look at in a moment, that’s not quite enough for this tutorial — we’ll need to specify colours in a bit more detail for the laptop screen, and we’ll have to use a new kind of colour, the emissive colour. However, for the laptop’s body, we don’t need to worry too much about this: the material colour parameters are simple, the laptop is just white.
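In code, that might look something like this (a sketch: the uniform names and exact values are my reconstruction rather than gospel; the point is that everything is white and the emissive colour is switched off):

```javascript
// Material colours for the laptop body: plain white, no emission.
gl.uniform3f(shaderProgram.materialAmbientColorUniform, 1.0, 1.0, 1.0);
gl.uniform3f(shaderProgram.materialDiffuseColorUniform, 1.0, 1.0, 1.0);
gl.uniform3f(shaderProgram.materialSpecularColorUniform, 1.0, 1.0, 1.0);
gl.uniform1f(shaderProgram.materialShininessUniform, 5.0);
gl.uniform3f(shaderProgram.materialEmissiveColorUniform, 0.0, 0.0, 0.0);
```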

So, what’s the emissive colour? Well, screens on things like laptops don’t just reflect light — they emit it. We want the colour of the screen to be determined by the colour of the texture much more than by the lighting effects. We could do that by changing the uniforms that govern the lighting, to switch off point lighting and bump ambient lighting up to 100% before drawing the screen, and then restoring the old values afterwards, but that would be a bit of a hack — after all, this emissivity is a property of the screen, not of the light. In this particular example, we could also do it just by using the ambient lighting, because the ambient light is white-coloured, so setting the screen’s ambient colour to 1.5, 1.5, 1.5 would have the right effect. But if someone then changed the ambient lighting, the screen’s colour would change, which would be odd. After all, if you put your laptop in a red-lit room, the screen doesn’t go red. So we use a new emissive colour uniform, which is handled by the shader using some simple code we’ll come to later.

(A side note: it’s worth remembering that an object’s emissive colour in this sense doesn’t affect any other objects around it — that is, it doesn’t make the object turn into a lighting source and light other things up. It’s just a way of making an object have a colour that is independent of the scene’s lighting.)

The requirement for the emissive colour also explains why we needed to separate the other material colour parameters out for this tutorial; our laptop screen has an emissive colour determined by its texture, but its specular colour should be fixed and unaffected by this — after all, whatever is showing on your laptop’s screen doesn’t change the colour of the window behind you that’s reflected in it. So that colour is still white.
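Putting that together, the screen’s material setup might look like this (a sketch; the uniform and variable names, and the 1.5 emissive level, are assumptions based on the discussion above):

```javascript
// The screen: emissive colour on, so the texture "glows" regardless of
// the scene's lighting; specular stays plain white, as explained above.
gl.uniform3f(shaderProgram.materialAmbientColorUniform, 0.0, 0.0, 0.0);
gl.uniform3f(shaderProgram.materialDiffuseColorUniform, 0.0, 0.0, 0.0);
gl.uniform3f(shaderProgram.materialSpecularColorUniform, 1.0, 1.0, 1.0);
gl.uniform1f(shaderProgram.materialShininessUniform, 20.0);
gl.uniform3f(shaderProgram.materialEmissiveColorUniform, 1.5, 1.5, 1.5);

// Bind the texture that our framebuffer rendered into, just like any other.
gl.activeTexture(gl.TEXTURE0);
gl.bindTexture(gl.TEXTURE_2D, rttTexture);
gl.uniform1i(shaderProgram.samplerUniform, 0);
```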

Almost an anti-climax, isn’t it? That was all of the code required to render a scene to a texture, and then to use that texture in another scene.

That’s pretty much it for this tutorial, but let’s just quickly run through the other changes from the previous lessons; there’s a pair of functions called loadLaptop and handleLoadedLaptop to load up the JSON data that makes the laptop; they’re basically the same as the code to load the teapot in lesson 14. There’s also a bit of code at the end of initBuffers to initialise the vertex buffers for the laptop screen; this is a bit ugly and will be improved in a later version of this tutorial (the values should be loaded up from JSON like the laptop data but are currently sitting there in the code).

Finally, there’s the new fragment shader, which needs to handle per-lighting-type material colours as an alternative to the texture colour. All of it should be pretty easy to understand in the light of the earlier shaders; the only thing that’s really new is the emissive lighting, and all that happens with that is that it’s added to the final fragment colour right at the end. Here’s the code:
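Here’s a reconstruction of that shader (sketched from the description and the earlier lessons’ shaders; the uniform names are assumptions, and I’ve left out the lighting on/off toggles for brevity):

```glsl
precision mediump float;

varying vec2 vTextureCoord;
varying vec3 vTransformedNormal;
varying vec4 vPosition;

uniform bool uUseTextures;

uniform vec3 uAmbientColor;
uniform vec3 uPointLightingLocation;
uniform vec3 uPointLightingSpecularColor;
uniform vec3 uPointLightingDiffuseColor;

uniform vec3 uMaterialAmbientColor;
uniform vec3 uMaterialDiffuseColor;
uniform vec3 uMaterialSpecularColor;
uniform float uMaterialShininess;
uniform vec3 uMaterialEmissiveColor;

uniform sampler2D uSampler;

void main(void) {
    vec3 lightDirection = normalize(uPointLightingLocation - vPosition.xyz);
    vec3 normal = normalize(vTransformedNormal);

    // Specular and diffuse brightness: unchanged from lesson 14.
    vec3 eyeDirection = normalize(-vPosition.xyz);
    vec3 reflectionDirection = reflect(-lightDirection, normal);
    float specularBrightness = pow(max(dot(reflectionDirection, eyeDirection), 0.0), uMaterialShininess);
    float diffuseBrightness = max(dot(normal, lightDirection), 0.0);

    // Start from the per-material colours...
    vec3 ambientColor = uMaterialAmbientColor;
    vec3 diffuseColor = uMaterialDiffuseColor;
    vec3 emissiveColor = uMaterialEmissiveColor;
    float alpha = 1.0;
    // ...and modulate them by the texture when textures are switched on.
    // Note that the specular colour is deliberately left alone.
    if (uUseTextures) {
        vec4 textureColor = texture2D(uSampler, vec2(vTextureCoord.s, vTextureCoord.t));
        ambientColor *= textureColor.rgb;
        diffuseColor *= textureColor.rgb;
        emissiveColor *= textureColor.rgb;
        alpha = textureColor.a;
    }

    // The emissive colour is simply added on at the end, independent of the lighting.
    gl_FragColor = vec4(
        ambientColor * uAmbientColor
        + diffuseColor * uPointLightingDiffuseColor * diffuseBrightness
        + uMaterialSpecularColor * uPointLightingSpecularColor * specularBrightness
        + emissiveColor,
        alpha);
}
```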

And that truly is it! In this tutorial, we’ve gone over how to render a scene to a texture and use it in another scene, and on the way touched on material colours and how they work. In the next tutorial, I’ll show how to do something really useful with this: GPU picking, so that you can write 3D scenes that people can interact with by clicking on objects.

Acknowledgments: I needed a lot of help to get this one running, in particular because the first version had bugs that didn’t show up when I ran it on my own laptop. I’d particularly like to thank Marco Di Benedetto, the creator of SpiderGL, and Paul Brunt, of GLGE fame, for telling me what I’d got wrong and how to fix it. But I owe a lot of gratitude to the people who tested version after version of the demo until we finally got one that should work pretty much anywhere — Stephen White (who also made it clear to me that RTT was a necessity for sensible picking, which was what made it the topic for this lesson), Denny (creator of EnergizeGL), blinblin, nameless, Jacob Seidelin, Pyro Technick, ewgl, Peter, Springer, Christofer, Thormme, and titan.

@Alvaro; I think the antialiasing is coming from GL_LINEAR, which is set once. The rebinding to another framebuffer may be wiping out this setting. You could test this by putting the GL_LINEAR call into the render loop?

Re: the STREAM_DRAW vs STATIC_DRAW — you know, I’d never realised that I was using a different constant! The last parameter to the gl.bufferData call is actually a hint to the runtime saying how you expect the values in the buffer to change over time; more details here and here. Given the nature of my demos, I should really be using STATIC_DRAW in every case. I’ll update the tutorials to remove any chance of confusion, thanks for pointing it out!

Great tutorial! I’ve done quite some 3D programming but the problem has always been how to deliver the content to the client-end in a hassle-free way. Now, with WebGL, no more need to require the user to install this and that dll or ActiveX. The only sad thing is that the stubborn Microsoft still refuses to embrace open standards such as WebGL — so IE users would still have to install something… Very sad situation for a developer….

Hi giles,
That’s a very cool demo.
I’ve been searching the internet for several days, trying to convert models (.3ds, .blend, .max …) to the JSON model format. The only way I can find is the Blender converter, using the file WebGLExport.py.
When I view it in the browser, it’s not the model I converted. It loses some planes, and it doesn’t have colour …
So, please help me with working with models in WebGL. My final exercise is building a web site with WebGL, and I have just 4 months to do it.
My mail is [email protected]. Please contact with me.

Hi giles,
I have converted an animated knight model with your help. The knight walks through a maze; it’s just a tiny program, but it really makes me happy.

I continued with a skybox, a simple effect. It worked well, but I had trouble with the function mat4.lookAt(). This function works strangely, so I can’t understand how to set it up.
this is the way I set up the function :

In that function, the eye position is [0,0,0], looking at the point [0,0,-100] (deep inside the monitor), and the UP vector is [0,1,0], but it didn’t work as I expected. If you see any mistake, please let me know.
Sorry for the bother. I’m just a beginner ^^.

Hi all,
When I try to do picking in WebGL, I use the framebuffer and read the pixel colour to do it. But it throws SECURITY_ERR: DOM Exception 18 when gl.readPixels runs: http://imageshack.us/f/94/readpxielerror.jpg/
I don’t know how to solve this problem.
So help me please !!!

So once you have rendered to the texture, is there a way to save it so you don’t have to keep rendering to it on every draw call? So basically you would render the scene on the laptop screen once, then just keep reusing that texture; you would basically only be rendering a screen quad with the saved RTT.

Currently if I don’t keep rendering the scene every update, the texture is blanked out.

Looks like the issue may have been that I was drawing the RTT first, before the model had fully loaded, so it was not shown on the texture. Once I put the RTT on a keypress, it works just fine. Now I only render the scene once and then just render the texture to a quad for all other frame updates.

Another question I had: looking at the output of the WebGL Inspector, since the demo calls gl.generateMipmap(gl.TEXTURE_2D); on every draw call, it is being called several times… Is there a reason the mipmaps have to be generated every frame?

With all of the other objects I draw, I only had to create the mipmaps when I loaded them; I did not call it on every draw.

[...] there's a way to render to a texture as well in webgl, please check the super tutorial here, however i'm still struggling with it, so the result of javascript version is less interesting then [...]

[...] WebGL Lesson 16: render-to-texture shows how to render a WebGL scene into a texture that can then be used in another scene — a neat trick in itself, and a useful foundation for other techniques. « Presentation 7 [...]

I want to create a post-processing glow effect.
So I need the object rendered normally -> first texture.
Then the same object rendered only with glow maps -> second texture.
And then I combine the first texture with a blurred copy of the second, blending them and displaying the result on a quad as the finished post-processing effect.

[...] does. I’ll assume that you’re familiar with the basics of render-to-texture. If not, read up here. Before this extension was available you could render to a color texture, but the depth component [...]