A normal map is an image that encodes the surface normal vector at each pixel. The RGB values store the XYZ components of the vectors. The RGB range from 0 to 1 corresponds to the XYZ range from -1 to 1, so RGB 0.5 (neutral gray) stands for the null vector. The following example shows the normal map for a sphere.
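In plain Python (purely as an illustration, not code from Blender or from the game), the mapping between a normal and a color is simply:

# Encode a unit surface normal (XYZ components in -1..1) as an RGB color (0..1) and back.
def encode_normal(nx, ny, nz):
    return (nx * 0.5 + 0.5, ny * 0.5 + 0.5, nz * 0.5 + 0.5)

def decode_normal(r, g, b):
    return (r * 2.0 - 1.0, g * 2.0 - 1.0, b * 2.0 - 1.0)

print(encode_normal(0.0, 0.0, 0.0))  # null vector -> (0.5, 0.5, 0.5), neutral gray
print(encode_normal(0.0, 0.0, 1.0))  # normal pointing straight at the viewer -> (0.5, 0.5, 1.0)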
With these normal maps we can dynamically light our sprites by a process called "DotProduct3 bump mapping" to make them appear more 3D. The term refers to the mathematical operation used to calculate the intensity of the reflected light at each pixel: the dot product of the surface normal vector and the light vector. The dot product is the product of both vectors' lengths and the cosine of the angle between them. The following figure shows the cosine function (90 degrees = pi/2).
So, if the angle between the two vectors is 0, i.e. the light is shining onto the surface directly from above, the intensity of the reflected light at that point is maximal. If the incident angle of the light is 90 degrees (pi/2), the intensity is minimal. This calculation is made for each pixel by DirectX.
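To make that concrete, here is the same arithmetic as a small Python sketch (only an illustration; the real work is done per pixel by DirectX as described above). It decodes a normal-map pixel and computes the reflected intensity for a given light direction:

import math

def dot3(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def normalize(v):
    length = math.sqrt(dot3(v, v))
    return (v[0] / length, v[1] / length, v[2] / length)

def intensity(rgb, light_dir):
    # decode the pixel back into a surface normal
    normal = normalize(tuple(c * 2.0 - 1.0 for c in rgb))
    light = normalize(light_dir)
    # dot product = |N| * |L| * cos(angle); both are unit vectors here,
    # so the result is just the cosine. Negative values (light coming
    # from behind the surface) are clamped to zero.
    return max(0.0, dot3(normal, light))

pixel = (0.5, 0.5, 1.0)              # normal pointing straight at the viewer
print(intensity(pixel, (0, 0, 1)))   # light from the front -> 1.0 (maximal)
print(intensity(pixel, (1, 0, 0)))   # light at 90 degrees -> 0.0 (minimal)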
Enough theory! Let's see how it looks in practice. The following images are screen captures from the DirectX DotProduct3 demo with familiar objects. The first one shows the calculated intensities for light shining from the left-hand side, the second one for light shining from the right-hand side. The third image shows the corresponding normal map.
For the dynamic lighting in our game I need to render a normal map for each sprite. Several approaches are conceivable in Blender.
The easiest way would be a one-click option in Blender to render not the shaded objects but their surface normal vectors at each pixel. Unfortunately, that doesn't exist (as yet). I don't know if a sequence plugin could do this; at least I haven't come across one so far.
What comes close is a Python script called G-Buffer Extractor. It stores the normal vectors as vertex colors and renders them to an image. The problem is that it doesn't take fine detail like bump maps or displacement maps into account.
Another option could be a RenderMan export script plus a shader that renders the normal vectors to RGB colors in one of the freeware RenderMan renderers. Basically, such a shader is simple to program, but since I want it to take bump maps and displacement maps into account, it becomes more difficult.
A good method was suggested by Martin Poirier (aka theeth). He uses three texture channels of a material for red, green and blue "Blend" textures that are mapped to X, Y and Z with the "Nor" mapping type. But this method involves more material editing than mine.
Finally I found a quite simple way that produces very good results. When you look at normal maps, you might notice that they look as if a red light source were to the left of the object, a green one on top and a blue one in front. And that's exactly how I'm creating normal maps at the moment.
Starting from the shaded object that is rendered to a sprite, I make a linked copy of that object with "ALT-D" and move it to another layer. I give this object (not the mesh) a new material that copies all features from the existing one, especially the bump maps and displacement maps. Then I change the color to plain white (RGB 1.0) with maximum diffuse reflection (Ref 1.0) and set everything else to 0.
Now comes the interesting part: the light setup. First, I place the cursor at the origin with "SHIFT-C" and add three lights at the cursor position. With movement restricted to the respective axis, I move the first light 10 units to the left, the second light 10 units to the top and the third one 10 units to the front. Then I change the light type to "Hemi" and make the first light red, the second one green and the third one blue. The following Blender screenshot shows the result with a sphere as the object.
Hemi lights are best suited because their fall-off doesn't stop halfway around the object, as it does with e.g. "Sun" lights, which leads to wrong results. You can see this in the figures below: on the left is the normal map rendered with Sun lights, on the right the resulting wrong lighting.
I don't know whether the Hemi light distribution represents the normal vectors in a mathematically correct way, but from an artist's point of view it works perfectly. You can see this in the figures below, which show the same situation as above, but with a normal map created with Hemi lights.
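For the curious, here is a quick plain-Python sanity check. It assumes that a Hemi lamp lights a white, purely diffuse surface with an intensity of (1 + cos(angle))/2, i.e. a fall-off running over the full 180 degrees, whereas a Sun lamp gives the usual clamped cosine that already reaches 0 at 90 degrees. Under that assumption each color channel comes out as exactly 0.5 + 0.5 times the normal component along the corresponding lamp direction, which is the encoding from the top of this page, while the Sun version clips everything on the far side of each lamp. The lamp directions in the sketch (left = -X, top = +Z, front = -Y) are just placeholders for whatever orientation your own scene uses.

def dot3(a, b):
    return sum(x * y for x, y in zip(a, b))

def hemi(normal, to_lamp):
    # assumed Hemi fall-off: (1 + cos)/2, fading over the full 180 degrees
    return 0.5 * (1.0 + dot3(normal, to_lamp))

def sun(normal, to_lamp):
    # Sun/Lambert fall-off: clamped cosine, already 0 at 90 degrees
    return max(0.0, dot3(normal, to_lamp))

# unit directions from the object towards the red, green and blue lamps
lamps = [(-1.0, 0.0, 0.0), (0.0, 0.0, 1.0), (0.0, -1.0, 0.0)]

def shade(normal, fall_off):
    # RGB of a white, purely diffuse surface point lit by the three colored lamps
    return tuple(fall_off(normal, lamp) for lamp in lamps)

n = (0.6, 0.0, 0.8)    # unit normal pointing up and slightly away from the red lamp
print(shade(n, hemi))  # roughly (0.2, 0.9, 0.5) -> decodes back to the correct components
print(shade(n, sun))   # roughly (0.0, 0.8, 0.0) -> red and blue channels clipped to zero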
Of course, I made a template with a complete light and camera setup for shaded rendering and normal map rendering, so that I don't have to repeat the described steps over and over again for each sprite.
I'd like to finish this "Making Of" with a small gallery of the normal maps for some of the sprites you already know from the previous page.