3D Programming for the Web

WebGL Lesson 5 – introducing textures

Welcome to number five in my series of WebGL tutorials, based on number 6 in the NeHe OpenGL tutorials. This time we’re going to add a texture to a 3D object — that is, we will cover it with an image that we load from a separate file. This is a really useful way to add detail to your 3D scene without having to make the objects you’re drawing incredibly complex. Imagine a stone wall in a maze-type game; you probably don’t want to model each block in the wall as a separate object, so instead you create an image of masonry and cover the wall with it; a whole wall can now be just one object.

Here’s what the lesson looks like when run in a browser that supports WebGL:

The usual warning: these lessons are targeted at people with a reasonable amount of programming knowledge, but no real experience in 3D graphics; the aim is to get you up and running, with a good understanding of what’s going on in the code, so that you can start producing your own 3D Web pages as quickly as possible. If you haven’t read the previous tutorials already, you should probably do so before reading this one — here I will only explain the differences between the code for lesson 4 and the new code.

There may be bugs and misconceptions in this tutorial. If you spot anything wrong, let me know in the comments and I’ll correct it ASAP.

There are two ways you can get the code for this example: just “View Source” while you’re looking at the live version, or, if you use GitHub, you can clone it (and the other lessons) from the repository there. Either way, once you have the code, load it up in your favourite text editor and take a look.

The trick to understanding how textures work is that they are a special way of setting the colour of a point on a 3D object. As you will remember from lesson 2, colours are specified by fragment shaders, so what we need to do is load the image and send it over to the fragment shader. The fragment shader also needs to know which bit of the image to use for the fragment it’s working on, so we need to send that information over to it too.

Let’s start off by looking at the code that loads the texture. We call it right at the start of the execution of our page’s JavaScript, in webGLStart at the bottom of the page:
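The snippet itself didn’t survive in this copy; reconstructed from the description that follows, initTexture would look something like this (a sketch, not necessarily the exact original; the names neheTexture and nehe.gif are taken from the comments further down the page):

```javascript
var neheTexture;

function initTexture() {
  // Create the texture reference, then hang a JavaScript Image off it
  // as a new attribute -- texture objects don't have one by default
  neheTexture = gl.createTexture();
  neheTexture.image = new Image();

  // Set the callback before setting src, so we can't miss the load event
  neheTexture.image.onload = function () {
    handleLoadedTexture(neheTexture);
  };

  neheTexture.image.src = "nehe.gif";
}
```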

So, we’re creating a global variable to hold the texture; obviously in a real-world example you’d have multiple textures and wouldn’t use globals, but we’re keeping things simple for now. We use gl.createTexture to create a texture reference to put into the global, then we create a JavaScript Image object and put it into a new attribute that we attach to the texture, yet again taking advantage of JavaScript’s willingness to set any field on any object; texture objects don’t have an image field by default, but it’s convenient for us to have one, so we create one. The obvious next step is to get the Image object to load up the actual image it will contain, but before we do that we attach a callback function to it; this will be called when the image has been fully loaded, and so it’s safest to set it first. Once that’s set up, we set the image’s src property, and we’re done. The image will load asynchronously — that is, the code that sets the src of the image will return immediately, and a background thread will load the image from the web server. Once it’s done, our callback gets called, and it calls handleLoadedTexture:
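The body of handleLoadedTexture isn’t reproduced here; reconstructed step by step from the walkthrough below, it would be along these lines:

```javascript
function handleLoadedTexture(texture) {
  // Make this texture the "current" one
  gl.bindTexture(gl.TEXTURE_2D, texture);

  // "Unflip" the image's vertical axis to match WebGL's coordinates
  gl.pixelStorei(gl.UNPACK_FLIP_Y_WEBGL, true);

  // Upload the image to the texture's space on the graphics card
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE,
      texture.image);

  // Scaling hints for magnification and minification
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);

  // Tidy up: no current texture
  gl.bindTexture(gl.TEXTURE_2D, null);
}
```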

The first thing we do is tell WebGL that our texture is the “current” texture. WebGL texture functions all operate on this “current” texture instead of taking a texture as a parameter, and bindTexture is how we set the current one; it’s similar to the gl.bindBuffer pattern that we’ve looked at before.

Next, we tell WebGL that all images we load into textures need to be flipped vertically. We do this because of a difference in coordinates; for our texture coordinates, we use coordinates that, like the ones you would normally use in mathematics, increase as you move upwards along the vertical axis; this is consistent with the X, Y, Z coordinates we’re using to specify our vertex positions. By contrast, most other computer graphics systems — for example, the GIF format we use for the texture image — use coordinates that increase as you move downwards on the vertical axis. The horizontal axis is the same in both coordinate systems. This difference on the vertical axis means that from the WebGL perspective, the GIF image we’re using for our texture is already flipped vertically, and we need to “unflip” it. (Thanks to Ilmari Heikkinen for clarifying that in the comments.)

The next step is to upload our freshly-loaded image to the texture’s space in the graphics card using texImage2D. The parameters are, in order, what kind of image we’re using, the level of detail (which is something we’ll look at in a later lesson), the format in which we want it to be stored on the graphics card (repeated twice for reasons we’ll also look at later), the size of each “channel” of the image (that is, the datatype used to store red, green, or blue), and finally the image itself.

On to the next two lines: these specify special scaling parameters for the texture. The first tells WebGL what to do when the texture is filling up a large amount of the screen relative to the image size; in other words, it gives it hints on how to scale it up. The second is the equivalent hint for how to scale it down. There are various kinds of scaling hints you can specify; NEAREST is the least attractive of these, as it just says you should use the original image as-is, which means that it will look very blocky when close-up. It has the advantage, however, of being really fast, even on slow machines. In the next lesson we’ll look at using different scaling hints, so you can compare the performance and appearance of each.

Once this is done, we set the current texture to null; this is not strictly necessary, but is good practice; a kind of tidying up after yourself.

So, that’s all the code required to load the texture. Next, let’s move on to initBuffers. This has, of course, lost all of the code relating to the pyramid that we had in lesson 4, but a more interesting change is the replacement of the cube’s vertex colour buffer with a new one — the texture coordinate buffer. It looks like this:
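The listing is missing from this copy; reconstructed from the description, and abbreviated to two faces, it would look roughly like this (the buffer name follows the cubeVertex… naming of the earlier lessons, and is wrapped in a small helper here so the fragment stands alone):

```javascript
var cubeVertexTextureCoordBuffer;

function initTextureCoordBuffer() {
  cubeVertexTextureCoordBuffer = gl.createBuffer();
  gl.bindBuffer(gl.ARRAY_BUFFER, cubeVertexTextureCoordBuffer);
  var textureCoords = [
    // Front face
    0.0, 0.0,
    1.0, 0.0,
    1.0, 1.0,
    0.0, 1.0,

    // Back face
    1.0, 0.0,
    1.0, 1.0,
    0.0, 1.0,
    0.0, 0.0
    // ...plus the top, bottom, right and left faces in the full version
  ];
  gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(textureCoords),
      gl.STATIC_DRAW);
  // Two values (x, y in the texture) per vertex
  cubeVertexTextureCoordBuffer.itemSize = 2;
  cubeVertexTextureCoordBuffer.numItems = textureCoords.length / 2;
}
```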

You should be pretty comfortable with this kind of code now, and see that all we’re doing is specifying a new per-vertex attribute in an array buffer, and that this attribute has two values per vertex. What these texture coordinates specify is where, in Cartesian x, y coordinates, the vertex lies in the texture. For the purposes of these coordinates, we treat the texture as being 1.0 wide by 1.0 high, so (0, 0) is at the bottom left, (1, 1) the top right. The conversion from this to the real resolution of the texture image is handled for us by WebGL.

That’s the only change in initBuffers, so let’s move on to drawScene. The most interesting changes in this function are, of course, the ones that make it use the texture. However, before we go through these, there are a number of changes related to really simple stuff like the removal of the pyramid and the fact that the cube is now spinning around in a different way. I won’t describe these in detail, as they should be pretty easy to work out; they appear in this snippet from the top of the drawScene function:
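The snippet itself is absent here; a sketch of the new positioning code, assuming the mat4 API from the glMatrix library used in earlier lessons, and wrapped in a helper so it can stand alone (the -5.0 z-translation is an assumption):

```javascript
function degToRad(degrees) {
  return degrees * Math.PI / 180;
}

// The relevant fragment from the top of drawScene: the cube now
// rotates around all three axes, driven by the three angle globals
function positionCube() {
  mat4.identity(mvMatrix);
  mat4.translate(mvMatrix, [0.0, 0.0, -5.0]);
  mat4.rotate(mvMatrix, degToRad(xRot), [1, 0, 0]);
  mat4.rotate(mvMatrix, degToRad(yRot), [0, 1, 0]);
  mat4.rotate(mvMatrix, degToRad(zRot), [0, 0, 1]);
}
```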

There are also matching changes in the animate function to update xRot, yRot and zRot, which I won’t go over.

So, with those out of the way, let’s look at the texture code. In initBuffers we set up a buffer containing the texture coordinates, so here we need to bind it to the appropriate attribute so that the shaders can see it:
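The code isn’t reproduced in this copy; reconstructed from the explanation that follows, and wrapped in a helper so the fragment stands alone, it would look something like this (the attribute and uniform names on shaderProgram follow the pattern of the earlier lessons and are assumptions):

```javascript
function setupTextureForCube() {
  // Feed the texture coordinate buffer to the shader's per-vertex attribute
  gl.bindBuffer(gl.ARRAY_BUFFER, cubeVertexTextureCoordBuffer);
  gl.vertexAttribPointer(shaderProgram.textureCoordAttribute,
      cubeVertexTextureCoordBuffer.itemSize, gl.FLOAT, false, 0, 0);

  // Make texture zero current, bind our texture to it, and tell the
  // shader's sampler uniform that we're using texture zero
  gl.activeTexture(gl.TEXTURE0);
  gl.bindTexture(gl.TEXTURE_2D, neheTexture);
  gl.uniform1i(shaderProgram.samplerUniform, 0);
}
```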

What’s happening here is somewhat complex. WebGL can deal with up to 32 textures during any given call to functions like gl.drawElements, and they’re numbered from TEXTURE0 to TEXTURE31. The activeTexture and bindTexture calls say that texture zero is the one we loaded earlier, and then uniform1i passes the value zero up to a shader uniform (which, like the other uniforms that we use for the matrices, we extract from the shader program in initShaders); this tells the shader that we’re using texture zero. We’ll see how that’s used later.

Anyway, once those three lines are executed, we’re ready to go, so we just use the same code as before to draw the triangles that make up the cube.

The only remaining new code to explain is the changes to the shaders. Let’s look at the vertex shader first:
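The shader source isn’t included in this copy. Reconstructed from the description, the vertex shader would be roughly as follows (vTextureCoord is the varying named further down; the attribute name aTextureCoord is an assumption following the lessons’ naming pattern):

```glsl
attribute vec3 aVertexPosition;
attribute vec2 aTextureCoord;

uniform mat4 uMVMatrix;
uniform mat4 uPMatrix;

varying vec2 vTextureCoord;

void main(void) {
  gl_Position = uPMatrix * uMVMatrix * vec4(aVertexPosition, 1.0);
  // Pass the per-vertex texture coordinates straight out;
  // WebGL will interpolate them across each triangle
  vTextureCoord = aTextureCoord;
}
```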

This is very similar to the colour-related stuff we put into our vertex shader in lesson 2; all we’re doing is accepting the texture coordinates (again, instead of the colour) as a per-vertex attribute, and passing them straight out in a varying variable.

Once this has been called for each vertex, WebGL will work out values for the fragments (which, remember, are basically just pixels) between vertices by using linear interpolation between the vertices — just as it did with the colours in lesson 2. So, a fragment half-way between vertices with texture coordinates (1, 0) and (0, 0) will get the texture coordinates (0.5, 0), and one halfway between (0, 0) and (1, 1) will get (0.5, 0.5). Next stop, the fragment shader:
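Again, this is a reconstruction from the description (uSampler is the uniform named in the text below):

```glsl
precision mediump float;

varying vec2 vTextureCoord;

uniform sampler2D uSampler;

void main(void) {
  // Look up the colour for this fragment from the texture, using the
  // interpolated coordinates; s and t are aliases for x and y
  gl_FragColor = texture2D(uSampler, vec2(vTextureCoord.s, vTextureCoord.t));
}
```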

So, we pick up the interpolated texture coordinates, and we have a variable of type sampler, which is the shader’s way of representing the texture. In drawScene, our texture was bound to gl.TEXTURE0, and the uniform uSampler was set to the value zero, so this sampler represents our texture. All the shader does is use the function texture2D to get the appropriate colour from the texture using the coordinates. Textures traditionally use s and t for their coordinates rather than x and y, and the shader language supports these as aliases; we could just as easily have used vTextureCoord.x and vTextureCoord.y.

Once we have the colour for the fragment, we’re done! We have a textured object on the screen.

So, that’s it for this time. Now you know all there is to learn from this lesson: how to add textures to 3D objects in WebGL by loading an image, telling WebGL to use it for a texture, giving your object texture coordinates, and using the coordinates and the texture in the shaders.

If you have any questions, comments, or corrections, please do leave a comment below!

Otherwise, check out the next lesson, in which I show how you can get basic key-based input into the JavaScript that animates your 3D scene, so that we can start making it interact with the person viewing the web page. We’ll use that to allow the viewer to change the spin of the cube, to zoom in and out, and to adjust the hints given to WebGL to control the scaling of textures.

Acknowledgments: Chris Marrin’s spinning box was a great help when writing this, as was an extension of that demo by Jacob Seidelin. As always, I’m deeply in debt to NeHe for his OpenGL tutorial for the script for this lesson.

142 Responses to “WebGL Lesson 5 – introducing textures”

I have created a WebGL-based framework which allows users to easily create object parts that are joined to each other and can be moved and rotated with respect to each other.

The framework is designed to be easy to use by hiding all lower level WebGL implementation. Don’t know what a WebGL buffer is or what shaders or binding means? No problem. When you use the JOA Framework you don’t need to know about any of that because the JOA Framework will handle all that for you.

All you need is basic knowledge of Vertices, Texture Coordinates and Indices. Since you are on Lesson 4 of the WebGL Tutorial, I’m guessing you do. If not go through the first 5 Tutorial lessons first.

The JOA Framework implements an object structure that allows objects to be related to each other (such as the upper arm is connected to the lower arm), allows the user to define the 3D appearance of the objects, allows the objects to be textured, allows the objects to be moved and rotated while keeping their connections to their parents and children, and even allows objects to be cloned so that you can create multiple objects from one object definition.

A sample project of a bendable human body is provided.

Instructions are also provided on how to extract data from a freely available program called Anim8tor, so that the 3D objects can be created using a GUI as opposed to hand-coding.

My JOA Framework, available for free, is downloadable from Media Fire at the following link:

FYI:
If you wish to save this code and run it locally (to play around with it), the texture image will not load because it is considered a cross-domain image in this context. To fix this you need to add the line:
neheTexture.image.crossOrigin = "anonymous";
just below:
neheTexture.image = new Image();

To I5:
I was having the same problem. Kept getting:
Uncaught Error: SECURITY_ERR: DOM Exception 18
Turns out it’s somehow a security risk to load files locally, even though you’re running your JavaScript in the same directory.
I found two ways to test locally, but here’s the one I went with. You run Chrome from the command line with the flag "--allow-file-access-from-files". You have to make sure you close all Chrome windows before you do that or else it won’t work.

I want to play with different images, simply by loading different image files, but most of them don’t display (see only black background). What have I done wrong? any restrictions on the image size, style, format, etc.?

I see this has already been mentioned, but I think it’s worth emphasizing the fact that the texture size MUST BE A POWER OF 2! This was the case in OpenGL 2.0 and earlier, and it is still the case in WebGL.

Hello all. Upon completing this tutorial, my object is being rendered with colour, except the colour is uniform. Basically, the whole thing is exactly the same shade of green as opposed to having varying colours based on my texture.

I thought that this may be a lighting issue (since I have not added any) in that everything defaults to the same shade; but the more I think about that, the less likely it seems.

The other option is that my textureCoords (i.e. 0.0 to 1.0) aren’t being set correctly, but I’ve done some digging there and it does seem to be setting correctly. By that I mean: my bottom-left vertex has UV (0, 0), my top-right vertex has UV (1, 1), and the inbetweeners have some range between those based on their position.

First I want to say that your tutorials are awesome. I already knew a lot of Open GL when I started them, but I feel like you have really deepened my understanding.

However, when downloading the source code and running this lesson from my computer I am only seeing a black box where the canvas is. All previous 4 tutorials worked so I believe it is something to do with the loading of the texture. I downloaded your nehe.gif to my local directory and it seems as if it’s being used but somehow not loading correctly. I’m not sure if this is the right approach or what I should try but any help would be really great.

I am trying to make a texture scroll down, the Y-axis, on a 2D square via the shaders. My plan is to simply update the uv coordinates to make the texture appear to move instead of translating the piece of geometry with the texture on it. I can get the texture to move, but it never updates. Can anyone please give me an example of how this can be accomplished?

I spent the longest time comparing code trying to figure out why the texture wasn’t rendering. Turns out my camera was pulled back farther than in the example and as a result caused problems with the scaling used in the example. After digging around I ended up generating a mipmap and all worked.

I would like to texture the pyramid from the previous lessons. Does texturing pyramids much differ from texturing cubes? I’d like to know the size of the texture and the position of vertices. Appreciate examples.

Using the switch –allow-file-access-from-files I can get the texture to load locally but it is scaled incorrectly and not rendered onto a cube face (just flat across the canvas). I have the RENDER WARNING: texture bound to texture unit 0 is not renderable. It maybe non-power-of-2 and have incompatible texture filtering or is not ‘texture complete’

At the moment I’m running the code from my glassfish server on localhost which seems to get things working, although the render warning is still displayed so maybe something else is happening?

Hello,
Big thanks. I’ve been using NeHe’s tutorials for years, and it makes it so much easier to learn a new binding when someone builds tutorials with good explanations like yours.

My problem is somewhat strange: this is all working on my box, using both localhost/ and /, but the textures aren’t loading on other machines. I can point to the images and see them, and I added an <img> that showed up fine, but the textures don’t load in the WebGL. Any ideas?

Thanks,
Nick

Christine:
You basically have 3 options, which I’ve arranged here from simplest but slowest to most complicated but fastest:

1) Render the different faces entirely separately: make one set of buffers for each group of faces that uses the same image, and render the different sets in order.

2) Use one set of buffers, but break it up into different calls to gl.drawElements with different offsets:
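The code for option 2 didn’t survive in this copy; a hypothetical sketch, assuming 16-bit indices in the element array buffer and made-up texture names textureA and textureB:

```javascript
function drawFaces() {
  gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, cubeVertexIndexBuffer);

  // First group: 12 indices (two faces) starting at the beginning
  gl.bindTexture(gl.TEXTURE_2D, textureA);
  gl.drawElements(gl.TRIANGLES, 12, gl.UNSIGNED_SHORT, 0);

  // Second group: the next 12 indices; the offset is in bytes,
  // and 16-bit (UNSIGNED_SHORT) indices are 2 bytes each
  gl.bindTexture(gl.TEXTURE_2D, textureB);
  gl.drawElements(gl.TRIANGLES, 12, gl.UNSIGNED_SHORT, 12 * 2);
}
```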

I need one more bit of help from you. I searched on Google but could not find any good examples or references.

I want to use a live video stream coming from an IP camera as the texture for each face of the cube, or use motion JPEG to grab images from the IP camera and refresh the image at least 10 frames per second. Can you guide me in this task?

Just expanding on what hugues said above concerning the texture RENDER WARNING:

The warning is caused by activating and binding the texture in drawScene() before the corresponding image is loaded. In other words, this warning is triggered every time you render until the image has completed loading.

While you could theoretically do the quick and dirty trick of calling
window.setTimeout(tick, 1000);
at the end of webGLStart(), you’ll still get the warning for slow connections and/or more textures.

My solution works as follows: I removed the call to tick() in webGLStart() and changed initTexture() to this:
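The commenter’s code is missing from this copy; a sketch along the lines described, counting loaded images in globals and starting the render loop from the last onload (names other than numTextures and textureImagesLoaded follow the lesson’s code):

```javascript
var neheTexture;
var numTextures = 1;          // total number of textures we expect to load
var textureImagesLoaded = 0;  // how many have finished loading so far

function initTexture() {
  neheTexture = gl.createTexture();
  neheTexture.image = new Image();
  neheTexture.image.onload = function () {
    handleLoadedTexture(neheTexture);
    textureImagesLoaded++;
    if (textureImagesLoaded == numTextures) {
      // Only start the render loop once every texture is ready
      tick();
    }
  };
  neheTexture.image.src = "nehe.gif";
}
```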

This calls tick() for the first time AFTER all the textures (in this case just one) have loaded. Voila!

numTextures and textureImagesLoaded are global variables, but of course they could be stored differently. You could also use some sort of signalling instead of calling tick() directly, but the concept remains the same.

I have been trying to follow your tutorials, but when I download the source code it doesn’t work and I have no idea why. All the lessons’ source shows the same thing: a blank canvas with the 2 “back to lesson” links. Anyone else have this problem, and does anyone have a fix?

I tried the neheTexture.image.crossOrigin = "anonymous"; fix but that didn’t help.

good stuff.
I made a cube orbiting in a circle, with a different texture rendered on each pair of faces.
Basically the procedure is to add 3 different arrays of indices for the 3 different pairs of faces, and bind each array to a different texture.
See the examples in the link below:

Thank you for your lessons! I learned a lot from them.
I tested the lesson05 source code at f://html/webgl/lesson05.html, but it doesn’t work.
The info:
Uncaught SecurityError: Failed to execute 'texImage2D' on 'WebGLRenderingContext': the cross-origin image at file:///F:/html/webGL/nehe.gif may not be loaded.
Then I opened an HTTP server on localhost, and then it worked!
The URL is like this: http://localhost:8083/shopping/webgl/lesson5.html
There may be something wrong with the http protocol or the file protocol!
WebGL fails to load the GIF when working with the file protocol.
Can you find a way to solve it? Many thanks!