
I have a normal mapping issue. Each model, loaded via the ASSIMP library, has a diffuse texture and a normal texture, and I calculate the tangent vectors for each object with ASSIMP's help, so those should be fine. The objects work perfectly with normal mapping, but as soon as I start translating one of them (thus influencing the model matrix with a translation) the lighting fails. As you can see in the image, the floor (which is translated down the y axis) loses most of its diffuse lighting, and its specular highlight points in the wrong direction (it should lie between the light bulb and the player position).

[Image: Normal mapping gone wrong]

It might have something to do with the normal matrix (although translations should be discarded by it), or perhaps a wrong matrix is being used in the shaders. I am out of ideas and was hoping you could shed some light on the issue.
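For reference, my understanding is that reducing the 4x4 matrix to a mat3 already drops the translation column, so a pure translation should not change the normal matrix at all (a minimal GLSL sketch of that reasoning):

// mat3() keeps only the upper-left 3x3 of the 4x4 matrix, so the
// translation column of `model` is discarded before the normal is transformed
mat3 normalMatrix = mat3(transpose(inverse(model)));
vec3 worldNormal = normalize(normalMatrix * normal); // unaffected by pure translation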

Vertex shader:

#version 330

layout(location = 0) in vec3 position;
layout(location = 1) in vec3 normal;
layout(location = 2) in vec3 tangent;
layout(location = 3) in vec3 color;
layout(location = 4) in vec2 texCoord;

// fragment pass through
out vec3 Position;
out vec3 Normal;
out vec3 Tangent;
out vec3 Color;
out vec2 TexCoord;

out vec3 TangentSurface2Light;
out vec3 TangentSurface2View;

uniform vec3 lightPos;
uniform vec3 playerPos;

// vertex transformation
uniform mat4 model;
uniform mat4 view;
uniform mat4 projection;

void main()
{
    mat3 normalMatrix = mat3(transpose(inverse(model))); // world-space normal matrix
    Position = vec3(model * vec4(position, 1.0));        // world-space position
    Normal = normalMatrix * normal;
    Tangent = tangent;
    Color = color;
    TexCoord = texCoord;

    gl_Position = projection * view * model * vec4(position, 1.0);

    // Calculate tangent matrix and calculate fragment bump mapping coord space.
    vec3 light = lightPos;
    vec3 n = normalize(normalMatrix * normal);
    vec3 t = normalize(normalMatrix * tangent);
    vec3 b = cross(n, t);
    // build the (transposed) TBN matrix: its rows are t, b, n, so it maps world space to tangent space
    mat3 mat = mat3(t.x, b.x, n.x, t.y, b.y, n.y, t.z, b.z, n.z);
    vec3 vector = normalize(light - Position);
    TangentSurface2Light = mat * vector;
    vector = normalize(playerPos - Position);
    TangentSurface2View = mat * vector;
}

Fragment shader:

#version 330

in vec3 Position;
in vec3 Normal;
in vec3 Tangent;
in vec3 Color;
in vec2 TexCoord;

in vec3 TangentSurface2Light;
in vec3 TangentSurface2View;

out vec4 outColor;

uniform vec3 lightPos;
uniform vec3 playerPos;
uniform mat4 view;
uniform sampler2D texture0;
uniform sampler2D texture_normal; // normal

uniform float repeatFactor = 1;

void main()
{   
    vec4 texColor = texture(texture0, TexCoord * repeatFactor);
    vec4 matColor = vec4(Color, 1.0);
    vec3 light = lightPos; // lightPos is already in world space
    float dist = length(light - Position);
    // float att = 1.0 / (1.0 + 0.01 * dist + 0.001 * dist * dist);
    float att = 1.0;
    // Ambient
    vec4 ambient = vec4(0.2);
    // Diffuse
    // vec3 surface2light = normalize(light - Position);
    vec3 surface2light = normalize(TangentSurface2Light);
    // vec3 norm = normalize(Normal); 
    vec3 norm = normalize(texture(texture_normal, TexCoord * repeatFactor).xyz * 2.0 - 1.0); 
    float contribution = max(dot(norm, surface2light), 0.0);
    vec4 diffuse = contribution * vec4(0.6);
    // Specular
    // vec3 surf2view = normalize(-Position); // Player is always at position 0
    vec3 surf2view = normalize(TangentSurface2View);
    vec3 reflection = reflect(-surface2light, norm); // reflection vector
    float specContribution = pow(max(dot(surf2view, reflection), 0.0), 32);
    vec4 specular = vec4(1.0) * specContribution;

    outColor = (ambient + (diffuse * att) + (specular * pow(att, 3))) * texColor;
    // outColor = vec4(Color, 1.0) * texture(texture0, TexCoord);
}

EDIT

Edited the shader code to calculate everything in world space instead of ping-ponging between world and camera space (easier to understand and less error-prone).

Joey Dewd
  • **Wrong place to ask.** You'll probably get no answers, unless you're *extremely* lucky. Ask on http://gamedev.stackexchange.com or http://www.gamedev.net – gifnoc-gkp Aug 15 '13 at 08:40
  • @TheOtherGuy: Thanks for the suggestion. I'll try posting it there as well, since this is closely game related I will probably have more luck there :) – Joey Dewd Aug 15 '13 at 08:42
  • @TheOtherGuy I have a hard time seeing how this is in any way game-specific. This is the perfect place to ask. – Christian Rau Aug 15 '13 at 14:38
  • @ChristianRau Oh is it? Where are the answers then? It is not directly related to game design but designers use those techniques in the process of making games. Therefore chances are that he'd get better answers there. If you had ***any*** game designing experience you'd understand. – gifnoc-gkp Aug 15 '13 at 15:47
  • @TheOtherGuy Oh no, the question is already 7 hours old and there are still no answers yet? Well, it seems you're right and this is the totally wrong place to post it. In fact I **don't** have **any** game designing experience and still **perfectly understand** what this question is talking about; that's the whole point. Designers use those techniques, they don't implement them; programmers like *Joey Dewd* do, and this site is for them. – Christian Rau Aug 15 '13 at 16:02
  • @TheOtherGuy I'm not arguing that it wouldn't work on [gamedev.se] either. But given that questions like this one are answered here on a daily basis, it's hard to believe that he'll *"probably get no answers"*. – Christian Rau Aug 15 '13 at 16:09
  • @TheOtherGuy They don't need programming experience to tell why mapping normals could possibly interfere with the shader implementation. – gifnoc-gkp Aug 15 '13 at 16:11
  • @ChristianRau By the way, gamedev.stackexchange.com is like a ghost-town. gamedev.net is much more active. – gifnoc-gkp Aug 15 '13 at 16:11
  • @TheOtherGuy *"By the way, gamedev.stackexchange.com is like a ghost-town."* - Oh really? Sorry, but that just get's stranger and stranger. – Christian Rau Aug 15 '13 at 16:13
  • *"They don't need programming experience to tell why mapping normals could possibly interfere with the shader implementation."* - It's not about telling what could possibly be his problem. It's about saying what he does wrong in his shaders or the rest of his code in particular. Otherwise I'd just answer *"it seems your normal mapping doesn't work, man"*. – Christian Rau Aug 15 '13 at 16:14
  • I find the most helpful thing is to start setting the fragment colour to various vectors used in the lighting calculations. Eventually you'll find an inconsistency. I can't spot anything obviously wrong in the code. Can you double check that the light position is definitely in world space (before the conversion to eye space in the shader)? – jozxyqk Aug 15 '13 at 19:55
  • @jozxyqk: You're right, setting the frag color to various vectors helps a lot :) I found out that the norm vector is acting just fine (stays the same, and looks like the normal map). However, when I set surface2light or surf2view as the outColor, the colors change rapidly over the surfaces whenever I move the camera, so I guess something is wrong there. The light coords are hardcoded in world coordinates (no translation/rotation/scaling is done on the light) so they are definitely in world space :) – Joey Dewd Aug 15 '13 at 21:26
  • @jozxyqk: I just found out the surf2view colors should change when you move the camera which they do. I also found out that the surface2light colors don't change and are thus independent of camera transformations which should be correct as well. Debugging these showed me that the results are as they should be? – Joey Dewd Aug 15 '13 at 21:35
  • Instead of transforming all of your other vectors out of tangent space (which they are not in to begin with), why do you not simply transform the normal sampled from your normal map into view space using the TBN matrix? This is how 99.999% of tangent-space normal mapping shaders I have ever seen work :) You can pass the TBN matrix to your fragment shader pretty easily if you use `flat out mat3 TBN;` and then multiply the sampled normal by this matrix to get it into view space. (A sketch of this approach follows these comments.) – Andon M. Coleman Aug 15 '13 at 22:25
  • @AndonM.Coleman: You are absolutely right. I now calculate the TBN matrix in the vertex shader and pass it to the fragment shader, then calculate the tangentSurface2View and tangentSurface2Light vectors in the fragment shader (using the interpolated values of Position), and my lighting seems to work correctly now :) I will do some further testing to make sure everything works correctly. If you make this an answer, I will mark it as the accepted answer to my issue. Thank you – Joey Dewd Aug 16 '13 at 09:08
  • @AndonM.Coleman: The TBN will change from vertex to vertex, so I wouldn't use "flat". A good reason not to do this is to save the expense of a matrix multiply per fragment. If your models are very high poly the difference is less, but using normal mapping to begin with implies the model is not high poly. However, using the TBN inverse (transpose since it's orthonormal) in the fragment shader might be a good option for debugging. What's "playerPos"? If it's the camera's position in eye space it should be (0,0,0) and unnecessary. – jozxyqk Aug 16 '13 at 09:12
  • @jozxyqk: Yeah, my objects are low poly, especially the large floor (it's basically a large cube with only 8 vertices), so it makes sense that the lighting breaks on the floor. Using the TBN in the fragment shader does the trick. After the discussion with Celestis I am now doing all my calculations in world space. Since I am in world space now, using vec3(0.0) as the player position only works in camera space, so I had to add it as a uniform (playerPosition in world space). – Joey Dewd Aug 16 '13 at 09:22
  • @jozxyqk Yeah, I do so much work with deferred shading these days I rarely give much thought to the expense of a matrix multiply in the fragment shader. It comes with the territory, you _have_ to transform the normal map out of tangent space in the fragment shader no matter what in deferred shading :) But it's only done once for the entire scene in deferred shading, so it is a nice trade-off. – Andon M. Coleman Aug 16 '13 at 14:12
  • @AndonM.Coleman: What jozxyqk (and you) are saying is that it's quite expensive to multiply by the TBN matrix for every fragment in the fragment shader, but it is necessary, right? It now works just fine in the fragment shader; or is there some trick/technique to do it more efficiently? – Joey Dewd Aug 16 '13 at 14:23
  • Yes, doing the matrix multiplication in the fragment shader can be quite expensive, particularly in a forward shading engine. There are other ways, you were sort of on the right track with your original shader - if you properly transform all of your other vectors into tangent space then you can use the normal from the tangent space normal map as-is in the fragment shader. But you were using the matrix the wrong way for this purpose, try using the inverse. You want to go from object->tangent instead of tangent->object. – Andon M. Coleman Aug 16 '13 at 17:39
  • Also note that, if your Tangent, Bitangent and Normal are orthogonal, the inverse matrix will also be the transpose. So you can save a lot of work computing the inverse matrix in this special (though quite common) case. – Andon M. Coleman Aug 16 '13 at 17:52
  • @AndonM.Coleman: Okay, awesome. I will do some more reading on tangent matrices/spaces and try to calculate the normal vectors in the vertex shader for performance :) Thanks for the suggestions and explanations! – Joey Dewd Aug 16 '13 at 22:36
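A minimal sketch of the fix Andon M. Coleman and jozxyqk converge on above, reusing the question's uniform and varying names (an illustration, not the exact final code). Because t, b and n are orthonormal, the transpose of the TBN matrix doubles as its inverse:

// Vertex shader (relevant parts; inputs/uniforms as in the question)
out mat3 TBN;      // world -> tangent; not declared flat, since it varies per vertex
out vec3 Position; // world-space position

void main()
{
    mat3 normalMatrix = mat3(transpose(inverse(model)));
    vec3 n = normalize(normalMatrix * normal);
    vec3 t = normalize(normalMatrix * tangent);
    vec3 b = cross(n, t);
    // mat3(t, b, n) has t, b, n as columns (tangent -> world);
    // for an orthonormal basis the inverse is simply the transpose (world -> tangent)
    TBN = transpose(mat3(t, b, n));
    Position = vec3(model * vec4(position, 1.0));
    gl_Position = projection * view * model * vec4(position, 1.0);
}

// Fragment shader (relevant parts)
in mat3 TBN;
in vec3 Position;

uniform vec3 lightPos;
uniform vec3 playerPos;

void main()
{
    // build the vectors per fragment from the interpolated world-space position,
    // then move them into tangent space to match the sampled normal map
    vec3 surface2light = TBN * normalize(lightPos - Position);
    vec3 surf2view = TBN * normalize(playerPos - Position);
    // ...the diffuse/specular terms then stay exactly as in the question
}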

1 Answer


You are making some strange manipulations with your matrices. In your vertex shader you transform the normal (which is in model space) by the inverse view-world matrix; that doesn't make any sense. It may be easier to do the calculations in world space. I've got some working sample code, but it uses slightly different naming.

Vertex shader:

void main_vs(in A2V input, out V2P output) 
{
    output.position = mul(input.position, _worldViewProjection);
    output.normal = input.normal;
    output.binormal = input.binormal;
    output.tangent = input.tangent;
    output.positionWorld = mul(input.position, _world);
    output.tex = input.tex;
}

Here we transform the position to projection (screen) space, while the TBN vectors are left in model space; they will be used later. We also compute the world-space position for the lighting evaluation.
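Translated into the question's GLSL names, this vertex stage would look roughly like the sketch below (an illustration of the same idea, not the exact code; note that the normal and tangent are deliberately passed through untransformed):

// inside main(), with the question's in/out/uniform declarations:
// model-space TBN is passed through untouched;
// only the position is brought to clip space and world space
gl_Position = projection * view * model * vec4(position, 1.0);
Position = vec3(model * vec4(position, 1.0)); // world space, for lighting
Normal   = normal;   // still model space
Tangent  = tangent;  // still model space
TexCoord = texCoord;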

Pixel shader:

void main_ps(in V2P input, out float4 output : SV_Target)
{
    float3x3 tbn = float3x3(input.tangent, -input.binormal, input.normal);

    //extract & decode normal:
    float3 texNormal = _normalTexture.Sample(_normalSampler, input.tex).xyz * 2 - 1;

    //now transform TBN-space texNormal to world space:
    float3 normal = mul(texNormal, tbn);
    normal = normalize(mul(normal, _world));

    float3 lightDirection = -_lightPosition.xyz; // directional
    float3 viewDirection = normalize(input.positionWorld - _camera);
    float3 reflectedLight = reflect(lightDirection, normal);

    float diffuseIntensity = dot(normal, lightDirection);
    float specularIntensity = max(0, dot(reflectedLight, viewDirection)*1.3);

    output = ((_ambient + diffuseIntensity * _diffuse) * _texture.Sample(_sampler, input.tex) 
        + pow(specularIntensity, 7) * float4(1,1,1,1)) * _lightColor;
}

Here I use a directional light; for a point (omni) light you should do something like

float3 lightDirection = normalize(input.positionWorld - _lightPosition.xyz); // omni

Here we start with the normal from the texture, which is in TBN (tangent) space. Then we apply the TBN matrix to transform it to model space, and then the world matrix to transform it to world space, where we already have the light position, eye position, etc.
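In the question's GLSL terms, that fragment-shader step would look roughly like this (a sketch reusing the question's names; it assumes Normal and Tangent arrive in model space as in the vertex stage above, and that the model matrix has no non-uniform scale, otherwise the normal matrix should replace mat3(model)):

// inside the fragment shader's main():
vec3 n = normalize(Normal);
vec3 t = normalize(Tangent);
vec3 b = cross(n, t);                     // binormal/bitangent
mat3 TBN = mat3(t, b, n);                 // columns t, b, n: tangent -> model space
vec3 texNormal = texture(texture_normal, TexCoord).xyz * 2.0 - 1.0; // decode from [0,1]
vec3 worldNormal = normalize(mat3(model) * (TBN * texNormal));      // model -> world
// worldNormal can now be dotted directly against world-space light/view vectors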

Some other shader code, omitted above (DX11, but it's easy to translate):

cbuffer ViewTranforms
{
    row_major matrix _worldViewProjection;
    row_major matrix _world;
    float3 _camera;
};

cbuffer BumpData
{
    float4 _ambient;
    float4 _diffuse;
};

cbuffer Textures
{
    texture2D _texture;
    SamplerState _sampler;

    texture2D _normalTexture;
    SamplerState _normalSampler;
};

cbuffer Light
{
    float4 _lightPosition;
    float4 _lightColor;
};

//------------------------------------

struct A2V
{
    float4 position : POSITION;
    float3 normal : NORMAL;
    float3 binormal : BINORMAL;
    float3 tangent : TANGENT;
    float2 tex : TEXCOORD;
};

struct V2P
{
    float4 position : SV_POSITION;
    float3 normal : NORMAL;
    float3 binormal : BINORMAL;
    float3 tangent : TANGENT;
    float3 positionWorld : NORMAL1;
    float2 tex : TEXCOORD;
};

Also, here I use a precomputed binormal; you can keep your code that computes it (via cross(normal, tangent)). Hope this helps.
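For completeness, that derivation is a one-liner in the question's GLSL (the handedness factor in the comment below is an assumption some pipelines use, not something present in the question's vertex layout):

vec3 b = cross(n, t); // the question's convention
// if a handedness sign is available (some importers pack it as tangent.w):
// vec3 b = cross(n, t) * handedness;

ASSIMP can also precompute bitangents directly (aiProcess_CalcTangentSpace fills both mTangents and mBitangents), which sidesteps the handedness question entirely.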

Celestis
  • Thanks for posting some shader code, it could help me find differences in the normal mapping calculations. However, I cannot see any normal matrix calculations on your normal vectors in the vertex shader. AFAIK you should use a normal matrix (the transpose-inverse of the model (view) matrix, without the translation component) or am I missing something? Currently I'm using the model AND view components for the normal matrix, but I'm still not sure if it should be just the model matrix or the modelview matrix (the modelview matrix gives good results when not translating, the model matrix doesn't). – Joey Dewd Aug 15 '13 at 16:39
  • My comment is too long, so I'll split it. Do you understand WHY you need the inverse M(V) matrix? The thing I tried to explain is that you can perform calculations in different spaces (world, model, TBN, view or even screen). To do so, you need to transform your data so that all components (eye, light direction, vertex pos, etc.) are in the same space, or it will return rubbish. – Celestis Aug 15 '13 at 17:23
  • Example: you have a light in world space and a vertex and normal in model space. What do you do to calculate diffuse lighting? You multiply both the normal and the vertex by the model matrix - this transforms them from model space to world space. Only then do you compute dot(light, normal). The inverse transform does it backwards. You could also transform the light from world to model space by multiplying the light position by the inverse model matrix. It's just a different space in which to compute all the values. (A sketch of this example follows these comments.) – Celestis Aug 15 '13 at 17:24
  • But you use inverse(view * model) and do Normal = normalMatrix * normal; What does this mean? Simple: you assume that the normal comes from model-view (camera) space, and you use the matrix to transform it to... hmm... model space??? What's the point of that? It makes no sense. Hope that reveals something :) – Celestis Aug 15 '13 at 17:25
  • I understand the logic. I'm sorry if I'm confusing from time to time; I'm still not used to visualizing all the coordinate spaces. If I got things right: I want to do my calculations in camera space, so the light and position vectors should be fine (the light coordinates are already in world space). So the only thing that should change is the normal matrix, turning it into `mat3 normalMatrix = mat3(transpose(inverse(model)));`. Am I right? The specular lighting, however, now seems to react strangely to camera movement. – Joey Dewd Aug 15 '13 at 17:45
  • Wrong: camera space is not world space. Camera space is world space multiplied by the camera's view matrix. It is not a good place to perform calculations, because I can hardly imagine vectors that are in that space initially. Use world space. – Celestis Aug 15 '13 at 17:53
  • I know what camera space is :) I meant that the light coordinates were supplied as world space coordinates and then transformed to camera space via the viewing matrix. I will try to do all the calculations in model space instead of camera/eye-space and see what will happen :) – Joey Dewd Aug 15 '13 at 17:59
  • I am now calculating everything in world space and added the camera position (playerPos) as a uniform in world space to aid my calculations. The only place I should use playerPos is in `vector = normalize(playerPos - Position); TangentSurface2View = mat * vector;`. However, it's still not working properly (same lighting bugs). I'll edit the question with the new shader code; maybe it will shed some light on why the lighting is bugging. – Joey Dewd Aug 15 '13 at 18:08
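To make Celestis's diffuse-lighting example from the comments concrete, a minimal GLSL sketch (names follow the question's shaders; purely illustrative):

// inside the vertex or fragment stage, with the question's declarations:
vec3 worldPos    = vec3(model * vec4(position, 1.0));                   // model -> world
vec3 worldNormal = normalize(mat3(transpose(inverse(model))) * normal); // model -> world
float diffuse    = max(dot(normalize(lightPos - worldPos), worldNormal), 0.0);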