
I am rendering a point-based terrain from loaded heightmap data, but the points change their texturing depending on the camera position. To demonstrate the bug (and the fact that this isn't occurring from a z-buffering problem) I have taken screenshots with the points rendered at a fixed 5-pixel size from very slightly different camera positions (same angle), shown below:

PS: The images are large enough if you drag them into a new tab; I didn't realise Stack would scale them down this much.

State 1: [State 1 image]

State 2: [State 2 image]

The code to generate the points is relatively simple, so I'm posting it merely to rule it out as the cause. mapArray is a one-dimensional float array that is copied to a VBO:

k = 0;  // k advances by 3 per point, in step with the interleaved XYZ output
        // (and, presumably, a 3-byte-per-pixel heightmap bitmap)
for(j = 0; j < mHeight; j++)
{
    for(i = 0; i < mWidth; i++)
    {
        height = bitmapImage[k];

        mapArray[k++] = 5 * i;    // X
        mapArray[k++] = height;   // Y: sampled height
        mapArray[k++] = 5 * j;    // Z
    }
}

I find it more likely that I need to adjust my fragment shader, because I'm not great with shaders, although I'm unsure where I could have gone wrong with such simple code and guess it's probably just not fit for purpose (with point-based rendering). Below is my frag shader:

varying vec2 TexCoordA;             // interpolated UV from the vertex shader
uniform sampler2D myTextureSampler;

void main(){
    // modulate the sampled texel by the fixed-function colour
    gl_FragColor = texture2D(myTextureSampler, TexCoordA.st) * gl_Color;
}

Edit (requested info): OpenGL version 4.4, no texture flags used.

TexCoordA is passed into the fragment shader directly from my vertex shader with no alterations at all. The UVs are self-calculated using this:

float* UVs = new float[mNumberPoints * 2];
k = 0;
for(j = 0; j < mHeight; j++)
{
    for(i = 0; i < mWidth; i++)
    {
        UVs[k++] = (1.0f/(float)mWidth) * i;    // U: texel corner, in [0, 1)
        UVs[k++] = (1.0f/(float)mHeight) * j;   // V: texel corner, in [0, 1)
    }
}
jProg2015
    Aside from the warped aspect ratio, what exactly is supposed to be different between the two images? – Andon M. Coleman Jan 27 '14 at 23:43
  • If you look in the highlighted red circle area, some of the points in one image change their colour compared to the second image, taken directly from a texture. UVs don't change, point position doesn't change. The only thing that moves is the camera, and it seems to affect the texture sampling. – jProg2015 Jan 27 '14 at 23:45
  • Well, you are using point primitives. If you were actually filling this with triangles the shift would not be noticeable. It is perfectly normal, just a consequence of how points are rasterized. You might be able to lessen the appearance if you stopped using smoothed points or increased the point size... – Andon M. Coleman Jan 27 '14 at 23:51
  • I feel like I'm missing something obvious, but why is the shift occurring? The UV coords don't change (and in the example above nor does the size of the point), so why should it be sampling a different colour from the texture? – jProg2015 Jan 27 '14 at 23:53
  • It is occurring because the camera is moving. Consider the (bottom) diagram [here](https://www.opengl.org/documentation/specs/version1.1/glspec1.1/node44.html#SECTION00630000000000000000) and what happens when you shift the position in sub-pixel increments. If your coverage area included the entire pixel, this shift would not be nearly as noticeable. – Andon M. Coleman Jan 27 '14 at 23:56
  • I see. This article suggests ST are adjusted by Q (Q?) - is there a way to access Q from within my shader and sample the intended area? I understand it wishes to move the point to be aligned to the screen pixels but is there no way to stop it from also adjusting the texture coords? – jProg2015 Jan 28 '14 at 00:09
  • Perspective texture correction (division by Q) occurs automatically. That is not even the issue here, to be honest. The issue is that the coverage area in that diagram will shift by one whole pixel if you move the vertex that generated the point even slightly upwards in screen-space (the bottom X at (2.5, 0.5) will drop out). The center might not cross a pixel boundary, but the coverage area will. – Andon M. Coleman Jan 28 '14 at 00:20
  • I guess I have two options: work out the colour myself and pass it into the shader in another buffer, or use extremely high-resolution heightmaps and LOD functionality and hope it isn't noticeable. Maybe both. Once again @AndonM.Coleman - many thanks! – jProg2015 Jan 28 '14 at 00:40
  • You might be able to use a `flat` interpolation qualifier for your texture coordinates (`flat varying vec2 TexCoordA`) to use the same texture coordinate at every pixel in your rasterized point in the fragment shader. If you combine that with some function to snap the coordinate to an integer location you just might get some more consistent behavior (see the sketch after these comments). `flat` interpolation is only supported by GLSL 1.40 / EXT_gpu_shader4 though, so a function to snap the coordinates to an integer may be necessary. – Andon M. Coleman Jan 28 '14 at 00:59
  • @Tom Burbeck: can you please update the question with: the vertex shader (I'm interested in the TexCoordA calculation), the loading of the texture assigned to the myTextureSampler uniform, and the sampling flags used for that texture (if none are used, please let me know the OpenGL version) – Raxvan Jan 31 '14 at 18:37
  • A value which is calculated from some constants and "TexCoordA" changes depending on camera position. My conclusion is that your TexCoordA changes depending on camera position. It would kind of help if you showed us where that came from... – Andy Newman Jan 31 '14 at 22:22
  • Try using a sampler object with the MIN/MAG filters set to GL_NEAREST – Sebastian Cabot Feb 01 '14 at 15:07
  • @Tom Burbeck: I was interested in the way you are loading the texture because I suspect the problem is mipmapping with nearest mipmap selection. This effect appears right at the edge where pixels from the next mipmap level are selected: because each mipmap is half the size, half the information is stored for each colour, so some pixels right at the edge of your colour can be selected from the other colour. Try to find out whether you are using mipmapping, and remove it to confirm whether this is the problem. – Raxvan Feb 03 '14 at 13:50
  • I suspect that it's z-fighting that causes the problem. Maybe the triangles you generated have some duplication or something. Could you please show me the code you use to generate the triangle index array? Did you generate it on the CPU or using a geometry shader? – Wood Feb 20 '14 at 06:52
  • I'm not generating any triangles; they're being rendered as points. – jProg2015 Feb 20 '14 at 10:06
  • Can you even set UV coordinates to points? I'd retry the same experiment assigning colors to points and see what happens. If colors are still changing then you know it's not the texture mapping. – memecs Feb 20 '14 at 17:24
  • Any sort of lighting is disabled right? – memecs Feb 20 '14 at 17:27
  • Lights disabled. Colours remain the same, so it's definitely the texture mapping. – jProg2015 Feb 20 '14 at 20:00
  • Are you doing anything in the shader that takes the camera angle into consideration? Probably not. So this is actually not a problem. Can we close this out now? Trust me, video cards work. Honestly I don't see what the problem is to begin with. The two images shown are identical except for the camera angle. – Dan Feb 21 '14 at 04:23
  • The two images are not identical, and I assure you I have a problem here. My colleagues are also stumped on this. What camera calculations do you suggest for correcting texture coordinates, exactly? I found nothing online because most people tend to use polygons, not points. @Dan instead of being rude and implying my question is stupid, and especially that the solution is easy, maybe you could suggest an answer and I would mark it correct. – jProg2015 Feb 21 '14 at 18:20
  • @Tom. I'm not being rude. You were told what the reason for the difference is and you still remain "stumped". If you give the shader UVs and the shader passes them to the sampler, then you get the colour you sampled. It does not vary. You're interpolating across the texture, and the point at which you advance from one texel to the next is the only thing that changes. This is how rasterization works. It's not broken. You simply choose to remain "stumped". Being rude is what you're doing by rejecting the obvious and expecting people to throw more time at a non-problem. – Dan Feb 22 '14 at 05:31
  • I didn't say the behaviour was unexpected; I accept the reason it happens, but that isn't a solution for getting the desired behaviour. – jProg2015 Feb 22 '14 at 08:44
  • What precision do you use in the fragment shader? Does this occur on other graphics cards? – Teivaz Feb 27 '14 at 00:32
  • Try to zoom in and see if the problem persists – Ahmed U3 Feb 28 '14 at 17:09
  • You need to turn off smoothing, turn off antialiasing, and then understand that OpenGL only guarantees a point size of 1; any other "support" comes from the hardware vendor in the form of drivers. When the "point" is rasterized it is interpolated from the start of the point to the end of the point for a given scan line, and those values come from some place. The ratio of on-screen pixels to texels changes with respect to camera position. You believe that the values are passed directly from the vertex shader to the pixel shader, but that's not the case. Run it on another GPU and see what you get. – Dan Mar 17 '14 at 07:03
  • have you tried disabling point smooth? (already asked, sorry....) – j-p Mar 21 '14 at 11:56
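
A minimal sketch of the `flat` qualifier plus texel-centre snapping suggested in the comments above. This is an assumption-laden illustration, not code from the question: `texSize` is a hypothetical uniform holding the texture dimensions, the `gl_Color` modulation is dropped for clarity, and the matching vertex shader would need to declare `flat out vec2 TexCoordA;`:

#version 140

flat in vec2 TexCoordA;                // flat: one unvarying UV per point
uniform sampler2D myTextureSampler;
uniform vec2 texSize;                  // hypothetical uniform: texture width/height

out vec4 fragColor;

void main(){
    // Snap the UV to the centre of its texel so that sub-pixel camera
    // movement can never push the sample across a texel boundary.
    vec2 snapped = (floor(TexCoordA * texSize) + 0.5) / texSize;
    fragColor = texture(myTextureSampler, snapped);
}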

2 Answers


Just a guess: how are your point rendering parameters set? Perhaps the distance attenuation (GL_POINT_DISTANCE_ATTENUATION) along with GL_POINT_SIZE_MIN and GL_POINT_SIZE_MAX are causing different fragment sizes depending on camera position. On the other hand, I seem to remember that when using a vertex shader this functionality is disabled and the vertex shader must decide on the size. I did it once by using

//point size calculation based on z-value as done by distance attenuation
float psFactor = sqrt( 1.0 / (pointParam[0] + pointParam[1] * camDist + pointParam[2] * camDist * camDist) );
gl_PointSize   = pointParam[3] * psFactor; 

where pointParam holds the three coefficients and the min point size:

uniform vec4 pointParam;   // parameters for point size calculation [a b c min]

You may play around by setting your point size in the vertex shader directly with `gl_PointSize = value`.
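
For illustration, here is a hedged sketch of how that might look as a complete vertex shader; the uniform and attribute names other than `pointParam` are assumptions, not code from this answer:

uniform vec4 pointParam;           // [a b c min], as described above
uniform mat4 modelViewMatrix;      // assumed uniform name
uniform mat4 projectionMatrix;     // assumed uniform name

attribute vec3 position;           // assumed attribute name

void main(){
    vec4 eyePos   = modelViewMatrix * vec4(position, 1.0);
    float camDist = length(eyePos.xyz);

    // distance attenuation: points shrink with distance from the camera
    float psFactor = sqrt(1.0 / (pointParam[0]
                               + pointParam[1] * camDist
                               + pointParam[2] * camDist * camDist));
    gl_PointSize = pointParam[3] * psFactor;

    gl_Position = projectionMatrix * eyePos;
}

Note that in a core OpenGL profile you also need glEnable(GL_PROGRAM_POINT_SIZE) (GL_VERTEX_PROGRAM_POINT_SIZE in older headers) on the CPU side for the gl_PointSize write to take effect.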

cweigel

This looks just like a subpixel-accurate texture mapping side effect. The problem is that the texture mapping implementation needs to interpolate the texture coordinates over the actual rasterized pixels (fragments). When your camera moves, the roundoff from the real position to the integer pixel position shifts, and that affects the texture mapping; this subpixel accuracy is normally required for jitter-free animation (otherwise all the textures would jump by seemingly random subpixel amounts as the camera moves). There was a great tutorial on this topic by Paul Nettle.

You can try to fix this by sampling texel centers rather than texel corners (add half the size of a texel to your point texture coordinates).
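
A small sketch of that half-texel adjustment applied to the question's fragment shader; `texSize` is a hypothetical uniform holding the texture dimensions in texels, not something from the question:

varying vec2 TexCoordA;
uniform sampler2D myTextureSampler;
uniform vec2 texSize;   // hypothetical uniform: texture width/height in texels

void main(){
    // shift from the texel corner (as generated by the UV loop in the
    // question) to the texel centre
    gl_FragColor = texture2D(myTextureSampler, TexCoordA + 0.5 / texSize) * gl_Color;
}

The same offset could equally be baked into the CPU-side UV generation loop instead.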

Another thing you can try is to compensate for the subpixel-accurate rendering by calculating the difference between the rasterized integer coordinate (which you need to calculate yourself in a shader) and the real position. That could be enough to make the sampled texels more stable.

Finally, size matters. If your texture is large, errors in the interpolation of the finite-precision texture coordinates can introduce these kinds of artifacts. Why not use GL_TEXTURE_2D_ARRAY with a separate layer for each color tile? You could also clamp the S and T texcoords to the edge of the texture to avoid this more elegantly.
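
A fragment-shader sketch of the texture array idea, under assumed names (`tileTextures` and `tileLayer` are illustrations, not code from the question), for the case where each tile is a single solid color:

#version 130

uniform sampler2DArray tileTextures;   // hypothetical: one layer per color tile
flat in float tileLayer;               // hypothetical per-point layer index

out vec4 fragColor;

void main(){
    // S and T address within the chosen layer; the third coordinate selects
    // the layer, so a sample can never bleed into a neighbouring tile
    fragColor = texture(tileTextures, vec3(0.5, 0.5, tileLayer));
}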

the swine