I've spent quite a bit of time over the last few weeks piecing together bits of different articles to figure out how to implement normal mapping (an important task in creating a 3D configurator) in the Lair Engine. The problem is that the methods and terminology used in normal mapping articles vary wildly, which makes it a very confusing topic for non-math-lovers like me. In the following article I will explain three common normal mapping techniques for mathematical laymen like myself.

The following video explains what Normal Mapping actually is:


Originally, I started implementing normal maps without precalculated tangents, which sounded like the easiest way, but it probably wasn't the best place to start. I had significant problems getting it to work. The biggest problem was that the lighting rotated whenever the whole model was rotated.

After a few days of frustrating debugging, I moved on to other things for a few weeks and read articles about normal mapping in my spare time. When I thought I understood it, I decided to integrate this code to calculate tangent vectors for any mesh, plus shader code to perform normal mapping in view space. To my disappointment, this implementation had exactly the same problem as the original one: the lighting rotated with the model. At this point I was a little astonished, so I asked a question in the forum and got two helpful answers:

  1. Nobody pointed out any problems with my math.
  2. The suggestion to implement normal mapping in world space, so that you can render the various intermediate values to the screen while debugging and hopefully find out what's going on.

Coordinate spaces.

When you do 3D rendering, there are many different coordinate spaces you have to deal with. For normal mapping, the chain of transformations looks like this:

Tangent Space → Model Space → World Space → View Space → Clip Space

Tangent space is the one we're interested in today. It is the coordinate space in which the normals in a normal map are defined.

The TBN matrix.

Every article you read about normal mapping talks about the TBN matrix. It is named after the elements it consists of: the Tangent, Bitangent and Normal vectors. The TBN matrix lets you convert normals from a normal map into model space. That is, so to speak, the simplest job it does.

To calculate a TBN matrix from a normal and a tangent, the GLSL code looks like this:

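The original code block did not survive here, so the following is a minimal GLSL sketch of the construction described below; the attribute names are my own assumptions:

```glsl
// Per-vertex attributes (names assumed): a model-space normal, and a vec4
// tangent whose xyz is the model-space tangent and whose w is the handedness.
in vec3 normal;
in vec4 tangent;

mat3 makeTBN()
{
    vec3 n = normalize(normal);
    vec3 t = normalize(tangent.xyz);
    vec3 b = cross(n, t) * tangent.w; // bitangent, flipped by handedness
    return mat3(t, b, n);             // converts tangent space -> model space
}
```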

The normal you have seen a million times: a vector perpendicular to the face, in model space. The tangent points along the surface's positive U texture-coordinate axis. To calculate the bitangent, we take the cross product of the normal and tangent vectors and then multiply it by the constant stored in tangent.w, which is the handedness of the tangent space. The bitangent points along the surface's V texture-coordinate axis. For the following examples I built a texture-mapped cube to debug normal mapping with.

This TBN matrix is not very helpful to us yet, because our lighting happens in world space, while this matrix only converts normals from tangent space into model space. To get more use out of it, we can construct it as follows:

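Again the original block is missing; a sketch of the world-space variant described below (uniform and attribute names assumed):

```glsl
uniform mat4 modelMatrix;

in vec3 normal;
in vec4 tangent;

mat3 makeWorldTBN()
{
    // w = 0.0: direction vectors must not pick up the model matrix's translation.
    vec3 n = normalize((modelMatrix * vec4(normal, 0.0)).xyz);
    vec3 t = normalize((modelMatrix * vec4(tangent.xyz, 0.0)).xyz);
    vec3 b = cross(n, t) * tangent.w;
    return mat3(t, b, n); // converts tangent space -> world space
}
```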

By multiplying each vector by the model matrix, we obtain a TBN that converts from tangent space into world space. You will notice that when we build a vec4 from the normal, tangent and bitangent, we use 0.0 for the w value and not 1.0. The reason is that this discards the translation in the model matrix; it makes no sense to translate direction vectors. Which brings us, of course, to:

World Space Normal Mapping.

The idea with world-space normal mapping is to convert both the normal from the normal map and the direction vector to the light source into world space, so that we can take the dot product of the two to get the magnitude of the diffusely reflected light, or Lambert term.

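The snippet that stood here is gone; in essence it is a fragment-shader excerpt like the following, assuming the TBN matrix and light direction arrive as varyings (names are assumptions):

```glsl
// Unpack the tangent-space normal, bring it into world space, and dot it
// with the normalized world-space light direction to get the Lambert term.
vec3 n = normalize(tbn * (texture(normalMap, texCoord).rgb * 2.0 - 1.0));
float lambert = max(dot(n, normalize(lightDirection)), 0.0);
```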

So in the complete sample code for world-space normal mapping below, I generate a TBN matrix that converts from tangent space to world space, plus a lightDirection vector in world space, in the vertex shader. In the fragment shader I use the TBN matrix to convert the normal from the normal map from tangent space to world space, and dot it with the normalized lightDirection.

You might have noticed this in the fragment shader:

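The line in question is the unpacking step; a sketch of it (sampler and varying names assumed):

```glsl
// The normal map stores components in [0.0, 1.0]; scale back to [-1.0, 1.0].
vec3 normal = normalize(texture(normalMap, texCoord).rgb * 2.0 - 1.0);
```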

Since the data in the normal map is stored in the range [0.0, 1.0], you have to scale it back to the original range [-1.0, 1.0]. If we were confident in the accuracy of our data, we could skip the normalization here.

In the complete example code I only compute the Lambert term of a full ADS (Ambient/Diffuse/Specular) lighting pipeline. I render that value as grayscale so that you can see exactly how the normal map will contribute to the final image. I hope the Lambert term is clear enough that you can work it into full ADS lighting yourself. The viewMatrix, modelMatrix, normalMatrix, modelViewMatrix, etc. are equivalent to the deprecated OpenGL built-ins gl_NormalMatrix, gl_ModelViewProjectionMatrix, etc.

World Space Normal Mapping Vertex Shader.

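The original shader is lost; the following is a minimal reconstruction consistent with the description above. All uniform, attribute and varying names are my own assumptions:

```glsl
#version 330 core

uniform mat4 modelMatrix;
uniform mat4 modelViewProjectionMatrix;
uniform vec3 lightPosition; // world space

in vec3 position;
in vec3 normal;
in vec4 tangent; // xyz = tangent, w = handedness
in vec2 uv;

out mat3 tbn;            // tangent space -> world space
out vec3 lightDirection; // world space
out vec2 texCoord;

void main()
{
    // w = 0.0 so directions are rotated/scaled but not translated.
    vec3 n = normalize((modelMatrix * vec4(normal, 0.0)).xyz);
    vec3 t = normalize((modelMatrix * vec4(tangent.xyz, 0.0)).xyz);
    vec3 b = cross(n, t) * tangent.w;
    tbn = mat3(t, b, n);

    vec3 worldPosition = (modelMatrix * vec4(position, 1.0)).xyz;
    lightDirection = lightPosition - worldPosition;

    texCoord = uv;
    gl_Position = modelViewProjectionMatrix * vec4(position, 1.0);
}
```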

World Space Normal Mapping Fragment Shader.

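A matching reconstruction of the fragment shader, again with assumed names, rendering only the grayscale Lambert term as described:

```glsl
#version 330 core

uniform sampler2D normalMap;

in mat3 tbn;            // tangent space -> world space
in vec3 lightDirection; // world space
in vec2 texCoord;

out vec4 fragColor;

void main()
{
    // Unpack from [0.0, 1.0] to [-1.0, 1.0], then move into world space.
    vec3 n = normalize(texture(normalMap, texCoord).rgb * 2.0 - 1.0);
    n = normalize(tbn * n);
    float lambert = max(dot(n, normalize(lightDirection)), 0.0);
    fragColor = vec4(vec3(lambert), 1.0); // Lambert term as grayscale
}
```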

The great thing about working in world space is that you can check the tangent, bitangent and normal values by passing them to the fragment shader, rendering them to the screen, and looking at what color they are. If a face renders red, the vector you are rendering on that face points along the positive X axis. You can do the same for the normal map values. Note that these vectors can also have negative components, so if you don't scale them back into the [0.0, 1.0] range, you will see some black polygons.

I built a cube model and then spent most of an afternoon checking every single input to the shader to find out why the light rotated with the camera. In the end, I found out that wasn't really what was happening.

Tangent Space Normal Maps.

If you search for normal maps on Google Images, you will see a lot of powder-blue images. These are tangent-space normal maps. The reason for the bluish color is that the up vector of a normal map points along the positive Z axis, which is stored in the blue channel of the bitmap. Viewed as a bitmap, a flat tangent-space normal map is therefore powder blue. Normals tilted upward appear cyan and those tilted downward appear magenta.

There is no real standardization for normal maps: some have only two data channels, and some have the vertical direction of the normals flipped, so take a close look at your normal maps to make sure they are in the format you expect. There are also bump maps, which often look green; these are usually something completely different, and using them as normal maps is not recommended.

View Space Normal Mapping.

Once I had world-space normal mapping working well, I wanted to try view-space normal mapping too. The idea here is the same as with world-space normal mapping, only this time we convert the vectors into view space. The point of doing so is that you can shift more work into the vertex shader and simplify some calculations.

So let's calculate our TBN again:

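The code block is missing; a sketch of the view-space construction described below (names assumed):

```glsl
uniform mat3 normalMatrix; // brings normals into view space

in vec3 normal;
in vec4 tangent;

mat3 makeViewToTangentTBN()
{
    // normalMatrix is already 3x3, so no vec4/0.0 trick is needed here.
    vec3 n = normalize(normalMatrix * normal);
    vec3 t = normalize(normalMatrix * tangent.xyz);
    vec3 b = cross(n, t) * tangent.w;
    // Transposing reverses the matrix: it now converts view space
    // into tangent space instead of tangent space into view space.
    return transpose(mat3(t, b, n));
}
```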

This is a little different. First we multiply by the normalMatrix to convert the normal, tangent and bitangent values into view space. Since the normal matrix is already 3×3, we don't need the 0.0 trick that we used in world space. Next we make a mat3 out of t, b and n, but this time we transpose it. The transposition reverses the effect of the TBN matrix, so that it now converts from view space into tangent space instead of from tangent space into view space. There is a mathematical reason why this works here: the TBN matrix is orthonormal, and for an orthonormal matrix the transpose equals the inverse. This trick does not work for arbitrary matrices.

What we do with this reversed TBN matrix is convert the direction-to-the-light-source vector from world space into view space, and then use the TBN matrix to carry it on into tangent space.

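Roughly, as vertex-shader code (assuming a world-space lightPosition uniform, a view-space vertex position, and the transposed TBN from above):

```glsl
// Direction to the light: world space -> view space -> tangent space.
vec3 lightDirView = (viewMatrix * vec4(lightPosition, 1.0)).xyz - viewPosition;
lightDirection = tbn * lightDirView; // tbn is the transposed (view->tangent) matrix
```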

Tricky! Our LightDirection vector is now in Tangent Space, the same space as our normal map vectors.

Now, you will notice that in the vertex shader below, the first version of the TBN construction code is commented out at the top. That is because there is a way to make this calculation a little simpler:

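I can't recover the exact snippet, but a common simplification (and plausibly the one meant here) exploits GLSL's row-vector multiplication:

```glsl
// Instead of:
//   mat3 tbn = transpose(mat3(t, b, n));
//   lightDirection = tbn * lightDirView;
// multiply the vector on the left: GLSL treats it as a row vector,
// which is the same as multiplying by the transposed matrix.
lightDirection = lightDirView * mat3(t, b, n);
```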

With our tricky LightDirection vector in Tangent Space, the fragment shader is super simple and fast.


View Space Normal Mapping Vertex Shader.

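A reconstruction of the view-space vertex shader based on the description above; every name is an assumption:

```glsl
#version 330 core

uniform mat4 modelViewMatrix;
uniform mat4 modelViewProjectionMatrix;
uniform mat3 normalMatrix;
uniform mat4 viewMatrix;
uniform vec3 lightPosition; // world space

in vec3 position;
in vec3 normal;
in vec4 tangent; // xyz = tangent, w = handedness
in vec2 uv;

out vec3 lightDirection; // tangent space
out vec2 texCoord;

void main()
{
    vec3 n = normalize(normalMatrix * normal);
    vec3 t = normalize(normalMatrix * tangent.xyz);
    vec3 b = cross(n, t) * tangent.w;

    vec3 viewPosition = (modelViewMatrix * vec4(position, 1.0)).xyz;
    vec3 lightDirView = (viewMatrix * vec4(lightPosition, 1.0)).xyz - viewPosition;

    // Row-vector multiply applies the transposed TBN: view -> tangent space.
    lightDirection = lightDirView * mat3(t, b, n);

    texCoord = uv;
    gl_Position = modelViewProjectionMatrix * vec4(position, 1.0);
}
```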

View Space Normal Mapping Fragment Shader.

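And a matching fragment shader, which stays short because the light direction already arrives in tangent space (names assumed):

```glsl
#version 330 core

uniform sampler2D normalMap;

in vec3 lightDirection; // tangent space
in vec2 texCoord;

out vec4 fragColor;

void main()
{
    // The sampled normal is already in the same space as lightDirection.
    vec3 n = normalize(texture(normalMap, texCoord).rgb * 2.0 - 1.0);
    float lambert = max(dot(n, normalize(lightDirection)), 0.0);
    fragColor = vec4(vec3(lambert), 1.0);
}
```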

Normal Mapping without precalculated tangents.

Once I had view-space normal mapping up and running, it took very little extra work to get normal mapping without precalculated tangents working. It does its normal mapping in world space, as in the first example, but it calculates the tangent and bitangent in the fragment shader. I can't see any significant visual difference between this and the precalculated-tangent result, but their performance can differ.

The tangent-free implementation does significantly more calculation on the GPU, but you don't have to transfer a 12-byte tangent attribute per vertex to the GPU, so which approach is faster really depends on your platform and your other rendering load. On some mobile platforms the tangent-free implementation appears to be much slower. I will continue to calculate the tangents offline and pass them to the vertex shader as a vertex attribute, since some of my other shaders already load the GPU heavily. However, I am keeping this implementation around in case I ever have a model with one small normal-mapped element while the rest is not normal mapped.

Normal Mapping without precalculated Tangents Vertex Shader.

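The original shader is gone; since the tangent frame is built in the fragment shader, the vertex shader only needs to forward world-space position, normal, light direction and UVs. A sketch with assumed names:

```glsl
#version 330 core

uniform mat4 modelMatrix;
uniform mat4 modelViewProjectionMatrix;
uniform vec3 lightPosition; // world space

in vec3 position;
in vec3 normal;
in vec2 uv;

out vec3 worldPosition;
out vec3 worldNormal;
out vec3 lightDirection; // world space
out vec2 texCoord;

void main()
{
    worldPosition = (modelMatrix * vec4(position, 1.0)).xyz;
    worldNormal = normalize((modelMatrix * vec4(normal, 0.0)).xyz);
    lightDirection = lightPosition - worldPosition;
    texCoord = uv;
    gl_Position = modelViewProjectionMatrix * vec4(position, 1.0);
}
```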

Normal Mapping without precalculated Tangents Fragment Shader.

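The original fragment shader is also lost. The standard way to derive the tangent and bitangent per fragment is the screen-space-derivative ("cotangent frame") technique, which I assume is what was used here; all names are my own:

```glsl
#version 330 core

uniform sampler2D normalMap;

in vec3 worldPosition;
in vec3 worldNormal;
in vec3 lightDirection; // world space
in vec2 texCoord;

out vec4 fragColor;

// Build a tangent frame from screen-space derivatives of position and UV,
// so no precalculated tangent attribute is needed.
mat3 cotangentFrame(vec3 N, vec3 p, vec2 uv)
{
    vec3 dp1 = dFdx(p);
    vec3 dp2 = dFdy(p);
    vec2 duv1 = dFdx(uv);
    vec2 duv2 = dFdy(uv);

    vec3 dp2perp = cross(dp2, N);
    vec3 dp1perp = cross(N, dp1);
    vec3 T = dp2perp * duv1.x + dp1perp * duv2.x;
    vec3 B = dp2perp * duv1.y + dp1perp * duv2.y;

    float invmax = inversesqrt(max(dot(T, T), dot(B, B)));
    return mat3(T * invmax, B * invmax, N);
}

void main()
{
    vec3 N = normalize(worldNormal);
    mat3 tbn = cotangentFrame(N, worldPosition, texCoord);
    vec3 n = normalize(tbn * (texture(normalMap, texCoord).rgb * 2.0 - 1.0));
    float lambert = max(dot(n, normalize(lightDirection)), 0.0);
    fragColor = vec4(vec3(lambert), 1.0);
}
```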

Mystery solved.

I finally found the solution to my problem with the rotating light.

Suddenly I was getting flat, shaded lighting that looked as expected. The light stopped rotating.

Gamma and Normal Maps.

The normal maps I used were PNG files. The first one I created with Gimp for debugging: a completely flat surface that should have given me flat shading if everything was working correctly. The second one I downloaded from the internet to test against. It turned out that both images had the same problem. Both had a gamma value of 2.2 stored in their PNG files, but the data in the files was actually gamma 1.0. When OpenGL transferred the normal maps to the video card, it automatically converted them from sRGB space to linear space, distorting all the normals they contained. This isn't the first time I've had this problem with PNG files, so it was time to build a tool. I wrote a small utility that loads a PNG, changes the gamma value without touching the pixel data, and writes a new PNG.