
I got a segfault when I tried to load a 771x768 image.

Tried with a 24x24 and 768x768 image and they worked, no problem.

Is this expected? Why wouldn't it just fail gracefully with a GL Error?

The segmentation fault occurs in the glTexImage2D call. I am loading a binary PPM file, so the data is tightly packed at 24 bits per pixel. That odd pixel size combined with an odd width probably produces rows that are not 4-byte (or even 2-byte) aligned, and a read past the end of my exactly-sized buffer may be the cause of the error, but gdb does not show me a memory address (which I could use to find out whether this is what causes it).

glTexImage2D(GL_TEXTURE_2D, 0, 3, width, height, 0, GL_RGB, GL_UNSIGNED_BYTE, dataptr);
// in this specific case of failure, width = 771, height = 768,
// dataptr contains 1776384 bytes of binary RGB image data (771*768*3 = 1776384)
Steven Lu
  • Where does the segfault occur? What does your code look like? – Goz Sep 11 '11 at 19:47
  • updated question with some details. I think I have an idea about what's gone wrong. Need to test more. – Steven Lu Sep 11 '11 at 19:51
  • To try your theory about it referencing memory outside what is allocated, can you allocate something like 16 bytes extra and see if the error still occurs? – Ville Krumlinde Sep 11 '11 at 19:53
  • @Ville, a most excellent suggestion! It does not seem to fix it if I allocate a bit more space. I will test different resolutions. If it works for all even ones (or all power of two, even) I can't really expect more. It's just the segfault that's strange. – Steven Lu Sep 11 '11 at 19:55
  • I would suspect the non-square nature of the image as being the cause of the problem. – ChrisF Sep 11 '11 at 20:01
  • Another possibility is that the PPM file reader you use is buggy on uneven sizes. – Ville Krumlinde Sep 11 '11 at 21:21
  • I implemented the PPM reader this afternoon, actually. Turns out it's rock solid :) So in the end, the answer is that OpenGL assumes the scanlines of the texture are 4-byte aligned, so I think it was skipping one byte (or more) every line, and I had over 700 lines. Maybe if I had allocated 1 KB extra it would have prevented the crash. – Steven Lu Sep 11 '11 at 22:50

2 Answers


This odd number combined with an odd dimension probably produces a not-4-byte (or even 2-byte) aligned structure (and referencing outside of my exactly enough allocated buffer may be the cause of the error

This is likely the cause. Luckily, you can set the alignment OpenGL uses when reading pixel data. Right before calling glTexImage…(…), do:

glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
glPixelStorei(GL_UNPACK_ROW_LENGTH, 0);
glPixelStorei(GL_UNPACK_SKIP_PIXELS, 0);
glPixelStorei(GL_UNPACK_SKIP_ROWS, 0);
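For a tightly packed PPM, GL_UNPACK_ALIGNMENT is the setting that matters. A minimal configuration fragment showing it in context (assuming a current GL context and a bound texture, with width, height, and dataptr as in the question; internalformat written as GL_RGB rather than the legacy 3):

```c
/* Tell GL the source rows are tightly packed (no row padding),
 * upload, then restore the default of 4 so later uploads that
 * expect the default behave as before. */
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, width, height, 0,
             GL_RGB, GL_UNSIGNED_BYTE, dataptr);
glPixelStorei(GL_UNPACK_ALIGNMENT, 4);
```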
datenwolf
  • Is there a link where I could read more about the 4/2-byte aligned structure? – Benedikt Bock Mar 14 '15 at 18:37
  • @BenediktBock: There's not really much to it. n-alignment just means that the start of a particular set of data (a row of pixels, a pixel itself) happens to start at a memory address that's an integer multiple of n. Most CPU architectures have certain alignment restrictions one must follow; for example, 32-bit ARM wants everything aligned at 16-bit = 2-byte or 32-bit = 4-byte boundaries (depending on the ARM architecture version). Intel CPUs can do atomic operations only at 4-byte alignment. So you'll find structure formats that align data to some n = 2, 4 or 8 to make things simpler. – datenwolf Mar 14 '15 at 19:30
  • @BenediktBock https://www.khronos.org/opengl/wiki/Common_Mistakes#Texture_upload_and_pixel_reads – hacksoi Mar 30 '19 at 21:25

I've read this in the OpenGL forums:

width must be 2^m + 2(border) for some integer m.
height must be 2^n + 2(border) for some integer n. 

(source)

I found this, which I believe clarifies what's happening:

1. What should this extension be called?

  STATUS: RESOLVED

  RESOLUTION:  ARB_texture_non_power_of_two.  Conventional OpenGL
  textures are restricted to size dimensions that are powers of two.

from GL_ARB_texture_non_power_of_two

Pablo Ariel
  • This is relevant for OpenGL-1.x only. Since OpenGL-2.x, arbitrary texture sizes are allowed. – datenwolf Sep 11 '11 at 20:42
  • Since I solved the crashing issue I wrote a little test that iterates through all pairs of integer dimensions from 64 to 1024, loading a texture with each (e.g. 64x64, 64x65, ..., 64x1024, 65x64, ...). So far it is almost halfway through the 921,600 possible resolutions in this set, with no crashes and no segfaults yet. So, yes, the implementation of OpenGL I am running appears to accommodate arbitrary texture sizes, and I can vouch for it because I've loaded over 500 thousand different ones. – Steven Lu Sep 11 '11 at 22:45