Java opengl : glDrawElements() with >32767 vertices

I have a complex model with more than 32767 vertices. Indices can only be passed to OpenGL ES as GL_UNSIGNED_BYTE or GL_UNSIGNED_SHORT. Java has no concept of unsigned, so the unsigned-short option maps to a plain (signed) short, which is 16 bits with a maximum positive value of 32767. To draw the model I need to pass a short[] to OpenGL, where each value in the array points to a vertex in the vertex array. However, if there are more than 32767 vertices, the index value will not fit in a short.

Is there any other way to specify the indices? The code fragment is as follows:

    short[] indices = ... read the indices ...;
    ...
    // Wrap the indices in a direct, native-order buffer for OpenGL.
    ByteBuffer ibb = ByteBuffer.allocateDirect(indices.length * Short.SIZE / 8);
    ibb.order(ByteOrder.nativeOrder());
    ShortBuffer indicesBuffer = ibb.asShortBuffer();
    indicesBuffer.put(indices);
    indicesBuffer.position(0);
    ...
    gl.glDrawElements(GL10.GL_TRIANGLES, numOfIndices, GL10.GL_UNSIGNED_SHORT, indicesBuffer);
    ...

Solution

I'm not using OpenGL from Java, so I'm speculating here, but there's a good chance all you need is the negative numbers whose binary representation is the same as the unsigned positive numbers you really want. You hand GL a sequence of byte pairs and tell it to interpret them as unsigned; as long as they have the correct values when interpreted that way, it will work. It doesn't matter that Java thinks the bits mean something different while they sit in memory.
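
As a minimal illustration of that bit-pattern equivalence (the value 40000 is just an example index above the signed-short range):

    short index = (short) 40000;          // stored bit pattern is 0x9C40
    System.out.println(index);            // prints -25536 (Java's signed view)
    System.out.println(index & 0xFFFF);   // prints 40000 (what GL_UNSIGNED_SHORT sees)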

If you're generating the indices by iterating, just ignore the wrap-around and keep incrementing: the counter runs past 32767 to -32768 and on toward -1, and -1 (all 16 bits set) is unsigned 65535, the last usable index. When you reach -1, you're done. A sketch of that pattern follows.
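
A minimal sketch of the iterating case, assuming you want a sequential index buffer covering the full unsigned-short range; it uses the same java.nio classes as the fragment in the question:

    // Fill a buffer with indices 0..65535 using a wrapping short counter.
    ShortBuffer seq = ByteBuffer.allocateDirect(65536 * 2)
            .order(ByteOrder.nativeOrder())
            .asShortBuffer();
    for (short i = 0; ; i++) {
        seq.put(i);
        if (i == -1) {   // -1 is 0xFFFF, i.e. unsigned 65535 -- the last index
            break;
        }
    }
    seq.position(0);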

If you calculate the index as an int (which does not have this range problem) and then need to convert it to a short, subtract 65536 from any value greater than 32767. In Java a plain cast to short does exactly that, because the cast keeps only the low 16 bits.
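
A sketch of that conversion, assuming the indices are first produced as ints (the name intIndices is only illustrative):

    // Assumed: intIndices holds values in the range 0..65535.
    int[] intIndices = ... compute or load the indices as ints ...;
    short[] indices = new short[intIndices.length];
    for (int i = 0; i < intIndices.length; i++) {
        // The cast keeps the low 16 bits: 40000 becomes -25536 (0x9C40),
        // which GL_UNSIGNED_SHORT reads back as 40000.
        indices[i] = (short) intIndices[i];
    }
    // indices can now be wrapped in a ShortBuffer and drawn exactly as in
    // the question, with GL10.GL_UNSIGNED_SHORT.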
