9.1 Loading and Working with Textures, and Preparing Textures for Use in Your Scenes

9.2 Adding Realism with Environment Maps

9.3 MeshStandardMaterial and the Metal/Rough Workflow

9.4 MeshPhongMaterial and the Specular Workflow

9.5 Bump, Normal, and Displacement Maps

9.6 Two Ways of Approaching Transparency

9.7 Emissive, Light, and Ambient Occlusion Maps

10 Understanding Geometry

10.1 Basic Geometry Concepts: Vertices, Normals, and UVs

10.2 Creating A Custom Geometry

11 Points, Particle Systems, and Sprites

11.1 Particle Systems

11.2 Introducing Sprites

12 Lines, Shapes, and Text

12.1 Throwing Shapes: Recreating the 2D Canvas API in 3D

12.2 Text in 3D: The FontLoader and TextBufferGeometry

13 Rendering Your Scenes with WebGL

13.1 The WebGLRenderer in Depth

13.2 Rendering Offscreen to a WebGLRenderTarget

14 Animating Your Scenes

14.1 Unraveling the Animation System

14.2 Introducing Morph Targets

14.3 Bones, Skinning, and Skeletal Animation

15 Post-Processing, Shaders, and Effects

15.1 Adding Post-Processing To A Scene

15.2 Anti-Aliasing A Post-Processed Scene

15.3 A Big List of all the Post Effects (currently) Available in three.js

16 Sound in a 3-Dimensional World

16.1 The WebAudio API

16.2 Positional Sound

References and Resources

A BRIEF INTRODUCTION TO TEXTURE MAPPING

So far we have just used a simple colored material for our mesh. If we want to create something more realistic we’ll have to start using a technique called texture mapping.

Put very simply, this means taking a square image and stretching it over the surface of a 3D object.

Of course, this will be very easy to do if the surface of the 3D object is square, and less easy if the surface is curved. Fortunately, the cube that we’ve been using so far will be very easy to apply textures to since every face is flat.

Start by loading up the code from the previous chapter, and we’ll continue from there.

Mapping a 2D Texture onto a 3D Object Using a UV Map

How do we go about stretching a 2D texture over the surface of a 3D shape?

The answer to that is a technique called UV Mapping.

Let’s take a quick look at how that works now, and then try it out on our spinning cube.

UV Mapping Explained

Our cube geometry looks something like this, where each of the red dots is a
vertex and has a position in 3D space defined by $(x, y, z)$ coordinates. Take a look back at Ch 1.1: Hello Cube! if you need a reminder of what the three.js coordinate system looks like.

We want to take a 2D square texture and map it onto our 3D geometry.

To do so we’ll imagine a 2D coordinate system on top of the texture, with $(0,0)$ in the bottom left and $(1,1)$ in the top right. Since we’ve already used the letters $x$, $y$ and $z$ for our 3D $(x, y, z)$ coordinates, we’ll refer to these 2D texture coordinates using the letters $(u, v)$, which is where the name UV mapping comes from.
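To make the $(u, v)$ convention concrete, here is a minimal sketch of how a UV pair picks out a single texel. The function name `sampleTexel` and the flat-array storage are assumptions for illustration only - real GPU sampling also involves filtering and wrapping modes - but the mapping from the $[0, 1]$ range to texel indices works as shown:

```javascript
// A minimal nearest-neighbour texel lookup, assuming the texture is stored
// as a flat array of texels in row-major order with row 0 at the bottom,
// matching the (0,0)-bottom-left UV convention described above.
function sampleTexel(texels, width, height, u, v) {
  // clamp u and v into the [0, 1] range
  const cu = Math.min(Math.max(u, 0), 1);
  const cv = Math.min(Math.max(v, 0), 1);

  // map [0, 1] onto texel indices [0, width - 1] and [0, height - 1]
  const x = Math.min(Math.floor(cu * width), width - 1);
  const y = Math.min(Math.floor(cv * height), height - 1);

  return texels[y * width + x];
}

// a 2x2 checker: bottom row is [black, white], top row is [white, black]
const checker = ['black', 'white', 'white', 'black'];

console.log(sampleTexel(checker, 2, 2, 0, 0)); // bottom-left  → 'black'
console.log(sampleTexel(checker, 2, 2, 1, 1)); // top-right    → 'black'
console.log(sampleTexel(checker, 2, 2, 0.9, 0.1)); // bottom-right → 'white'
```

Note the clamping on the upper edge: $u = 1$ must still land on the last texel column, not one past it.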

Let’s create a texture now to help us visualize this. We’ll use a simple black and white checker pattern and label a few of the UV coordinates. Once we are done, we’ll have something that looks like this:

It’s a very simple mapping for now since we are just mapping a square texture onto the square face of our cube, and we can use similar mappings for the other five faces.

Once we’ve set them all up, our cube mesh will look like this:

Actually, the BoxBufferGeometry that we are using has set up the mappings automatically for us, so we just need to load the texture and apply it to our material.

Take a few moments to examine the cube and the way that the texture has been mapped onto it now. You can use your mouse or touch screen to move the camera around and zoom in since we’ve added camera controls to the scene - we’ll see how to do this for ourselves in the next chapter.

We’ll come back to UV mapping in much more detail in Section 6: Understanding Geometry, but for now, let’s move on and take a look at how to load a texture.

Add a Texture to Our Material

Load a texture with the TextureLoader and apply it to our material.map slot

const geometry = new THREE.BoxBufferGeometry( 2, 2, 2 );

// create a texture loader.
const textureLoader = new THREE.TextureLoader();

// Load a texture. See the note in chapter 4 on working locally, or the page
// https://threejs.org/docs/#manual/introduction/How-to-run-things-locally
// if you run into problems here
const texture = textureLoader.load( 'textures/uv_test_bw.png' );

// set the "color space" of the texture
texture.encoding = THREE.sRGBEncoding;

// reduce blurring at glancing angles
texture.anisotropy = 16;

// create a Standard material using the texture we just loaded as a color map
const material = new THREE.MeshStandardMaterial( {
  map: texture,
} );

// create a Mesh containing the geometry and material
mesh = new THREE.Mesh( geometry, material );

Loading a texture and applying it to a map slot in a material is very easy in three.js, as long as you are serving your page from a web server. If you’re using CodeSandBox or another online editor to follow along then everything is taken care of for you, but if you are working locally, i.e. loading the files directly from your hard disk, you will run into problems due to security restrictions on how JavaScript can read local files.

Loading Files Locally

We’ll load a texture using the TextureLoader. There are a number of alternatives to this, which we’ll look at in Section 4: Materials and Textures, but using the TextureLoader is by far the most common and easiest method.

Once we’ve loaded and set up the texture, we’ll assign it to the .map slot in our material, and the TextureLoader will take care of all the technicalities involved in loading the texture for us.

Before we proceed, let’s make sure that we’re clear on all the technical terms that we’re using here.

What’s the Difference Between an Image and a Texture?

You’ll see the terms texture and image being thrown around a lot in computer graphics literature. What’s the difference?

Basically, an image is a 2D picture designed to be viewed by a human, while a texture is specially prepared data used for various purposes in 3D graphics - nearly always 2D, but sometimes 1D, 3D, or even 4D!

The confusion arises since 2D textures are stored in the same type of file formats as images, such as JPG, PNG, BMP etc.

We are going to use a texture to set the surface color of our object, so it even looks like an image. However, textures can be used for many other purposes, such as bump maps, opacity maps, light emission maps and lots more besides, and not all of these will look like images of anything in particular to our eyes.

Texture, Texture Map, and Map

These terms are all used pretty much interchangeably, although map is most commonly used when applying a texture to a material.

When we add a texture to a material, we say we are assigning a texture to a map slot on a material. Over the rest of this chapter, we are going to assign the UV test texture above to the color map slot of our MeshStandardMaterial.

1. Load A Texture With The TextureLoader

Create a TextureLoader and then use it to load the texture:

const geometry = new THREE.BoxBufferGeometry( 2, 2, 2 );

// create a texture loader.
const textureLoader = new THREE.TextureLoader();

// Load a texture. See the note in chapter 4 on working locally, or the page
// https://threejs.org/docs/#manual/introduction/How-to-run-things-locally
// if you run into problems here
const texture = textureLoader.load( 'textures/uv_test_bw.png' );

// set the "color space" of the texture
texture.encoding = THREE.sRGBEncoding;

We’ll use the TextureLoader to load the texture. textureLoader.load returns an instance of Texture that we can immediately use in our material, even though the texture itself may take some time to load.

Asynchronous File Loading

The TextureLoader loads texture files asynchronously. This means that when you call loader.load( 'textures/uv_test_bw.png' ), even if you are on a very slow internet connection and uv_test_bw.png takes half a minute to load, you can still use the texture variable in the meantime - your app will not have to pause to wait for loading to finish.

Note that, while our app is waiting for the texture to load, your material will display as black.

2. Set the Texture’s Parameters

Set the texture’s encoding and anisotropic filtering level

const texture = textureLoader.load( 'textures/uv_test_bw.png' );

// set the "color space" of the texture
texture.encoding = THREE.sRGBEncoding;

// reduce blurring at glancing angles
texture.anisotropy = 16;

// create a Standard material using the texture we just loaded as a color map
const material = new THREE.MeshStandardMaterial( {

We need to tune a couple of settings on the texture. First up is the texture encoding.

Setting the Texture.encoding

Since textures can represent many things, three.js needs to interpret the data in different ways depending on the intended use.

In general, there are just two ways that we’ll need to interpret textures:

The texture represents colors designed to be seen by human eyes

The texture represents something else, such as bumps on a surface

A texture is made up of lots of individual pixels (when we’re dealing with a texture we call these texels) and each of these pixels represents a single color.

To understand why we need to interpret these colors differently in different situations, we’ll need to digress for a moment and introduce the concept of a color space.

In the field of 3D rendering, we generally have to deal with two different color spaces. The first is called sRGB, and it’s the color space used by the colors that end up on your screen. The second is called linear space and it’s the color space used inside the renderer.

The historical reason for these different color spaces is that, in the early days of television when screens used cathode ray tubes, they did not reproduce colors equally well across the whole visual spectrum, so a gamma correction was applied - generally using a gamma factor of $2.2$.

Newer screens, such as LCD or OLED screens, have better color reproduction ability and no longer need to apply gamma correction. However, they still do - first, for backward compatibility, and second, because due to a happy coincidence, gamma corrected colors actually match up quite well to how our eyes see color, meaning that we can store most of the range of human color vision using less data.

sRGB color space is not exactly the same as gamma corrected color space, but it’s close enough for the purpose of this introduction. It’s a standard specification that was created by Microsoft and HP in the early days of the internet as a way of ensuring that color reproduction across devices and printers was universal, and it’s the color space used by basically every modern electronic device.

However, this presents us with a problem, because these gamma corrected colors don’t follow a linear pattern - that is, if we double the amount of blue in a color, we don’t necessarily end up with a color that is twice as blue. This makes doing math with gamma corrected colors much more complicated than it should otherwise be.
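We can see this non-linearity with a few lines of plain JavaScript. The helper names `toLinear` and `toGamma` are hypothetical, and they use the simple power-law approximation with a gamma factor of $2.2$ mentioned above (the real sRGB transfer function adds a small linear segment near black):

```javascript
// Decode a gamma-encoded channel value (range 0..1) to linear space, and
// re-encode it, using the simple gamma = 2.2 power-law approximation.
const toLinear = ( c ) => Math.pow( c, 2.2 );
const toGamma = ( c ) => Math.pow( c, 1 / 2.2 );

const blue = 0.25; // a gamma-encoded channel value

// naively double the gamma-encoded value, then decode it:
const doubledInGamma = toLinear( 2 * blue );

// decode first, then double the actual (linear) intensity:
const doubledInLinear = 2 * toLinear( blue );

// the two results disagree: doubling in gamma space gives
// much more than double the physical intensity
console.log( doubledInGamma / doubledInLinear ); // ≈ 2.3, not 1
```

This is exactly why the math has to happen in linear space: there, doubling a value really does double the intensity.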

Linear Color Space

The solution is to remove the gamma correction from the colors before we pass them into the renderer. Once we remove gamma correction from a color, it will be in linear color space and we can safely do mathematical operations such as doubling the brightness or tripling the amount of blue and get the results that we expect.

The final step in the rendering pipeline is then to convert the colors back into sRGB space, ready to be displayed on our screens.

This is a very brief and intentionally partial introduction to color spaces. It’s a complex topic and one that’s often neglected since if you forget to take gamma correction into account your colors will be just a little bit off, and if you are not paying close attention to your final renderings you may not even notice any difference.

However, our goal here is to create professional quality renderings, and this is one of the many small pieces of the puzzle that come together to make your final result really stand out.

Going back to our two possibilities above, we can now see why we need to make note of the difference:

The texture represents colors designed to be seen by human eyes: this means that texture is in sRGB color space and needs to be converted to linear space before being used by the renderer

The texture represents something else, such as bumps on a surface: this means that texture is already in linear space and can be used directly

This texture is going to be placed in the color map slot in our materials, so it seems a fair bet that we’re dealing with the first case above.

However, by default textures are assumed to have colors encoded in linear space - so we’ll need to tell the renderer that this texture has colors encoded in sRGB space instead:

texture.encoding = THREE.sRGBEncoding;

Reduce Texture Blurring by Setting the Anisotropic Filtering Level

Next up is a parameter which will improve the appearance of nearly every scene that uses textures, although again, this only needs to be applied to textures representing colors designed to be seen by your eyes. This parameter is the anisotropic filtering level, which is stored in texture.anisotropy.

By default, this is set to $1$, which applies no filtering. We will increase this to $16$, which is the maximum level supported by most graphics cards:

texture.anisotropy = 16;

Anisotropic Filtering

What does this setting do? It’s easiest to describe with a picture - here’s our cube with a more detailed UV test image applied. You’ll quickly see on the left side that the top of the cube looks blurred, while on the right-hand cube, which has anisotropy set to 16, the top is sharp.

We won’t get into the details of how this works here, but the purpose of anisotropic filtering is to make your textures look good at glancing angles.

The maximum value this setting can have depends on your graphics card, but don’t worry if you set it too high. If you set the level to $16$, as we have here, but your graphics card supports a maximum level of $8$, it will automatically be downgraded.

Valid anisotropic filtering levels are powers of two - $1$, $2$, $4$, $8$, up to the maximum level of $16$.

Values higher than $16$ are generally not supported as this technique uses a lot of graphics memory and it’s hard to see any further increase in quality at higher levels.

This high memory usage is also the reason why this is not set to a higher value by default. You should only enable this when needed, or use lower values, especially if you need to support mobile devices or integrated GPUs.

Always experiment with your apps to get the best trade-off between visual quality and performance.

three.js Does Not Provide Warnings If You Enter Invalid Parameters!

For a number of reasons - in particular, the need to keep the size of the three.js file as small as possible - you will generally not get any warnings if you enter invalid parameters anywhere in three.js.

In the case of texture.anisotropy, even though the valid values are $1$, $2$, $4$, $8$, and $16$, you can set the anisotropy to anything that you want and three.js and WebGL will handle it.

It will round odd numbers up so that setting texture.anisotropy = 3 will give the same result as texture.anisotropy = 4, and you can even do crazy things like texture.anisotropy = 'hey there!' and you will still not get any warnings! Everything will still work and the anisotropy will just be set to $1$.
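The behavior just described can be sketched as a small helper. The function `clampAnisotropy` is a hypothetical name invented here, not part of three.js (in three.js you would query the hardware limit via `renderer.capabilities.getMaxAnisotropy()`):

```javascript
// A hypothetical helper mirroring the behavior described above: round a
// requested anisotropy level up to the next power of two, clamp it to the
// maximum the graphics card supports, and fall back to 1 for invalid input.
function clampAnisotropy( requested, maxSupported ) {
  // non-numeric or out-of-range input falls back to the default of 1
  if ( typeof requested !== 'number' || !( requested >= 1 ) ) return 1;

  // round up to the next power of two
  const pow2 = Math.pow( 2, Math.ceil( Math.log2( requested ) ) );

  // never exceed what the hardware supports
  return Math.min( pow2, maxSupported );
}

console.log( clampAnisotropy( 3, 16 ) );            // 4 - odd values round up
console.log( clampAnisotropy( 16, 8 ) );            // 8 - clamped to the hardware max
console.log( clampAnisotropy( 'hey there!', 16 ) ); // 1 - invalid input, no warning
```

As with three.js itself, the helper silently “fixes” bad input rather than throwing - convenient, but a good reminder of why invalid parameters can be hard to track down.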

Most objects in three.js work like this, meaning that it is up to you to make sure that you are entering reasonable parameters. However, in general, if you enter an invalid value for a parameter, your app will break in an inexplicable way and you will have to work hard to track it down.

This is always a good chance to hone your debugging skills (remember, console.log is your friend here), and with a bit of practice you can usually track down the line causing the error within a few minutes.

3. Add the Texture to the Material’s Color Map Slot

Assign the loaded texture to the material’s diffuse color map slot

// reduce blurring at glancing angles
texture.anisotropy = 16;

// create a Standard material using the texture we just loaded as a color map
const material = new THREE.MeshStandardMaterial( {
  map: texture,
} );

// create a Mesh containing the geometry and material

Now that we’ve successfully loaded our texture, we can assign it to the material.map slot. Once we’ve done so, we should see the texture show up on our spinning cube.

material.map uses a texture to describe how the color of the material changes over the surface of the object.

Our material has quite a few map slots, but .map is the most commonly used and important one, so even though it should really be called “colorMap” or something similar, it gets shortened to just .map.

Some of the other important map slots are:

material.normalMap, which can hold a texture saying how bumpy the object is over its surface

material.emissiveMap, which can hold a texture saying how much light an object emits over its surface

material.alphaMap which can hold a texture saying how see-through a material is over its surface

… and many others

Different material types may have different map slots. Make sure to check the docs page for the material you are using to see all the available slots.

Note that all map slots on all material types are optional - you can use any kind of three.js material without loading any textures.

We’ve also removed the material’s .color parameter. Remember, removing this will reset the color back to the default, which is white.

We’re doing this because the material’s color gets combined (multiplied, technically) with the material’s texture, so if we left it as purple, the texture would have a purple tint. However, multiplying a color with white has no effect in the same way that multiplying a number by one has no effect.
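The multiplication happens per channel, which we can sketch in a few lines of plain JavaScript (the function `multiplyColors` is a hypothetical name for illustration; three.js does this internally in the shader):

```javascript
// Sketch of how a material's .color combines with its color map:
// the two colors are multiplied component-wise, each channel in 0..1.
function multiplyColors( [ r1, g1, b1 ], [ r2, g2, b2 ] ) {
  return [ r1 * r2, g1 * g2, b1 * b2 ];
}

const texel = [ 0.8, 0.6, 0.4 ];  // a sample from the texture
const purple = [ 0.5, 0.0, 0.5 ]; // material.color left as purple
const white = [ 1.0, 1.0, 1.0 ];  // the default material.color

console.log( multiplyColors( texel, purple ) ); // [0.4, 0, 0.2] - purple tint
console.log( multiplyColors( texel, white ) );  // [0.8, 0.6, 0.4] - unchanged
```

Multiplying by white leaves every channel untouched, which is why resetting .color to the default lets the texture show through exactly as authored.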

Reduce the Brightness of the Light

Reduce the light’s intensity from 5.0 to 3.0

// add the mesh to the scene object
scene.add( mesh );

// Create a directional light
const light = new THREE.DirectionalLight( 0xffffff, 3.0 );

// move the light back and up a bit

Now that we’ve put a texture on our cube, the light seems very bright, so we’ll reduce its .intensity from 5.0 to 3.0.

We’re back to using an unnamed “bare” parameter here, so to make this change we need to remember (or, more likely, check the
docs to remind ourselves) that the second number passed to the DirectionalLight constructor is the intensity.

Set the Renderer’s gammaFactor and gammaOutput

Set the correct gamma correction factor and color space on the renderer

// create a WebGLRenderer and set its width and height
renderer = new THREE.WebGLRenderer( { antialias: true } );
renderer.setSize( container.clientWidth, container.clientHeight );
renderer.setPixelRatio( window.devicePixelRatio );

// set the gamma correction so that output colors look
// correct on our screens
renderer.gammaFactor = 2.2;
renderer.gammaOutput = true;

// add the automatically created <canvas> element to the page
container.appendChild( renderer.domElement );

We’re nearly done. However, since we’ve set up the texture to use the correct color space, we should also do the same for our WebGLRenderer.

Note that for our current scene, we won’t see any difference since we’re using this simple black and white UV test texture. Black and white are not affected by color correction, so changing these settings has no effect here.

However, once we come to more advanced scenes this will make a difference and we will already be set up and following best practices.

Final Result

Here’s our textured cube, happily spinning away. As we mentioned above, it may look black for a few seconds as you wait for the texture to download, which is normal.

It’s hard to examine closely while it’s constantly tumbling like that, so in the next chapter, we’ll add some interactivity with camera controls. These will allow us to pan, rotate and zoom/dolly the camera to get a view of our scene from any angle.