Simple GLSL Bump-Mapping / Normal-Mapping

I had an hour to kill earlier on so thought I’d take a shot at some bump-mapping/normal-mapping (whatever you want to call it) – and it’s actually not too hard. Well, it’s probably kinda hard to do properly, but it’s pretty easy to get something that works without going into TBN (Tangent/Binormal/Normal) space, like this…

You can use lots of different tools to generate normal maps; I wanted something quick and free, so I used the GIMP, specifically the GIMP Normal-Map Plugin. Just load the texture image, select Filters | Map | Normalmap… from the menu, and save that bad boy for later use. I upped the scale of the normal-map generation to 10.000 from its original 1.000 just to magnify the effect. You can do the same, or you can magnify it in the fragment shader instead, but it's probably going to run quicker if you encode the normal map with the magnitude of normal perturbation you're going to use in your final work.

Hint: If you’re on Ubuntu, just install the package gimp-plugin-registry – it comes with a stack of neat & useful plugins, including Normal Map. I tried to install the normal-map plugin separately and it conflicted with the plugin-registry package – which would be because it’s already part of it, so I had all the tools I needed already!

Original Texture on the left, Normal-Map version on the right

Notice how the normal-mapped version of the texture on the right is mainly blue? This is because the data stored in it isn’t really “colour” anymore – instead of thinking of each value as an RGB triplet, think of it as an XYZ vector. If we look at the data that way, what we’re really seeing when we see “blue” is nothing on the X and Y axes, and positive on the Z axis. Sneaky, huh?

Now that you’ve got your original texture and the normal-mapped version of it, texture your geometry as usual, but add an additional sampler2D uniform in your fragment shader for your normal-map texture. Bind to another texture unit before loading and glTexImage2D-ing it, so you have the texture on one texture image unit and your normal map on another (and can sample from each independently).
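Host-side, that boils down to something like the following sketch – the texture IDs and program handle are placeholders for whatever your own loading code produces, and the sampler names match the ones used in the fragment shader:

```cpp
// Colour texture on texture unit 0
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, colourTexId);
// ...glTexParameteri / glTexImage2D the colour image here...

// Normal map on texture unit 1
glActiveTexture(GL_TEXTURE1);
glBindTexture(GL_TEXTURE_2D, normalTexId);
// ...glTexParameteri / glTexImage2D the normal-map image here...

// Tell the shader program which unit each sampler reads from
glUniform1i(glGetUniformLocation(program, "colourMap"), 0);
glUniform1i(glGetUniformLocation(program, "normalMap"), 1);
```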

Once you’ve got that all sorted, use shaders along the lines of these:

Vertex Shader

GLSL

#version 330

// Incoming per-vertex attribute values
in vec4 vVertex;
in vec3 vNormal;
in vec4 vTexture;

uniform mat4 mvpMatrix;
uniform mat4 mvMatrix;
uniform mat3 normalMatrix;
uniform vec3 vLightPosition;

// Outgoing normal and light direction to fragment shader
smooth out vec3 vVaryingNormal;
smooth out vec3 vVaryingLightDir;
smooth out vec2 vTexCoords;

void main(void)
{
    // Get surface normal in eye coordinates and pass it through to the fragment shader
    vVaryingNormal = normalMatrix * vNormal;

    // Get vertex position in eye coordinates
    vec4 vPosition4 = mvMatrix * vVertex;
    vec3 vPosition3 = vPosition4.xyz / vPosition4.w;

    // Get vector to light source
    vVaryingLightDir = normalize(vLightPosition - vPosition3);

    // Pass the texture coordinates through so they get smoothly interpolated
    vTexCoords = vTexture.st;

    // Transform the geometry through the modelview-projection matrix
    gl_Position = mvpMatrix * vVertex;
}

Fragment Shader

GLSL

#version 330

// Uniforms
uniform vec4 ambientColour;
uniform vec4 diffuseColour;
uniform vec4 specularColour;
uniform sampler2D colourMap; // This is the original texture
uniform sampler2D normalMap; // This is the normal-mapped version of our texture

// Input from our vertex shader
smooth in vec3 vVaryingNormal;
smooth in vec3 vVaryingLightDir;
smooth in vec2 vTexCoords;

// Output fragments
out vec4 vFragColour;

void main(void)
{
    const float maxVariance = 2.0; // Mess around with this value to increase/decrease normal perturbation
    const float minVariance = maxVariance / 2.0;

    // Create a normal which is our standard normal + the normal-map perturbation (which is going to be either positive or negative)
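The fragment shader listing is cut off at that final comment, so the rest of main() never appears. Going by the uniforms and the constants (with maxVariance = 2.0, rgb * maxVariance - minVariance is exactly the usual [0,1] → [-1,1] normal decode), a plausible completion – my sketch, not the original code – is standard ambient/diffuse/specular shading against the perturbed normal:

```glsl
    // Perturb the interpolated normal by the decoded normal-map vector
    vec3 vAdjustedNormal = normalize(vVaryingNormal +
        (texture(normalMap, vTexCoords).rgb * maxVariance - minVariance));

    // Diffuse intensity: angle between perturbed normal and light direction
    float diff = max(0.0, dot(vAdjustedNormal, normalize(vVaryingLightDir)));

    // Diffuse and ambient contributions, both modulated by the colour map
    vFragColour = diff * diffuseColour * texture(colourMap, vTexCoords)
                + ambientColour * texture(colourMap, vTexCoords);

    // Specular highlight (viewer at the eye-space origin)
    if (diff > 0.0)
    {
        vec3 vReflection = normalize(reflect(-normalize(vVaryingLightDir), vAdjustedNormal));
        float spec = max(0.0, dot(vAdjustedNormal, vReflection));
        vFragColour.rgb += specularColour.rgb * pow(spec, 128.0);
    }
}
```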

8 thoughts on “Simple GLSL Bump-Mapping / Normal-Mapping”

Hi, your bump-mapping shader does not work on my old ATI card. However, I have found one somebody wrote and it does work, with one exception: I get flashing textures. This greatly reduces the effect and does not look professional. Can you please take a look and make some minor modifications to the shader programs?

My shaders and source code probably don’t work on your old ATI card because of the GLSL version – it’s likely that your old ATI card doesn’t support GLSL #version 330, but it will probably support #version 120 or 130 or 140 or whatever.

Give me the link to where you got the shaders you provided and I’ll take a look and see what I can do.
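For reference, porting the shaders back to an older GLSL version is mostly keyword swaps. A sketch of what the vertex shader’s interface would become under #version 120 (the fragment shader would similarly swap its in declarations for varying, out vec4 vFragColour for the built-in gl_FragColor, and texture() for texture2D()):

```glsl
#version 120

// GLSL 120 has no in/out qualifiers: use attribute for per-vertex
// inputs and varying for values interpolated to the fragment shader
attribute vec4 vVertex;
attribute vec3 vNormal;
attribute vec4 vTexture;

varying vec3 vVaryingNormal;
varying vec3 vVaryingLightDir;
varying vec2 vTexCoords;
```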

I have attempted to implement this using g-truc’s GLM, and I’ve noticed that this shader draws the geometry only in white. I have adjusted color, etc.

My implementation had to vary slightly by sending the M and V components separately and multiplying them; otherwise it’s precisely the same.

Question regarding VBO binding for GLSL’s in vecs in the vertex shader: is there any trouble with using the glVertexAttrib-related functions and treating this as an attrib? (I am assuming that’s OK.) Another question: do you have to use the older-style client arrays like GL_TEXTURE_COORD_ARRAY, or can you bind to them as above?

The camera is constructed with GLM’s camera used in opengl-tutorial #13. This tutorial seems simpler than that one in its approach, on purpose, but I’m not sure how to make it work with the same math setup.

I’m afraid I don’t know the answer to your attribute questions – I’ve only ever used the helper code from the OpenGL SuperBible to bind attributes and such when I was playing with shaders.

However, I’ve wanted to know what the deal is with GLM for a while, so I’ve basically taken the day to hack around and get it to play ball. As such, you can find a working version of the bump-mapping source which uses GLM for all matrix and vector math here. It still uses the SuperBible shader loader and vertex generation for the torus.

It was hacky to begin with, and then I bolted on the camera… and now it’s really hacky ;-) Use W/S/A/D and the mouse to move around. The light source is in eye coordinates, so it’s attached to the camera – you’ll have to convert the light’s eye-coords into world coords to fix it in place.

The link just didn’t come through in the email, anyway. Thanks for getting back to me – I’ll try this.

Question about the geometry:

I get weird output and the light seems to penetrate the back faces – is there an easy way to avoid that when moving the camera around? The weird output is just that the sides are flipped or otherwise wrong. Do texcoords affect this, or normals, or both? I’ll try using this to see if it fixes the issues.

With the “light penetrating the back faces” – I think this is just due to the light being specified in eye coordinates, so when you move, the light moves too – it’s fixed in relation to the camera.

As for the “sides are flipped or otherwise wrong” – I’m not entirely sure what you mean, sorry – maybe send me a screenshot at mail at r3dux dot org and I’ll take a look.

When implementing this in my engine, it turned out I had cross-mapped my normals to my texcoords when using glVertexAttribPointer, so I was seeing the texcoords as solid colours only. I debugged this by setting the fragment colour to (final.r, UV.x, UV.y). I use that camera I provided, but now it can be simplified. It wasn’t yours that I used, though – I used #13 from opengl-tutorial.org and set it up for #version 120, though 330 would be fine too.