[Public WebGL] glReadPixels (to buffer object) and endianness

Kevin Rogovin ([email protected])
Mon Jul 20 06:39:07 PDT 2020

Hi all,

 This is a question about the guaranteed behaviour related to
endianness. The situation: I plan to generate a LARGE index buffer,
and using transform feedback is not suitable. The plan is essentially
to rasterize the indices. For desktop GL this is easy: the render
target would be GL_R32UI and glReadPixels would be passed
GL_RED_INTEGER with GL_UNSIGNED_INT. However, GLES3, and thus WebGL2,
does not allow that combination in glReadPixels(); indeed, reading
from such a buffer would require GL_RGBA_INTEGER, which would force
one to read the index buffer with a stride of 4. What I'd like to do
instead is rasterize to a GL_RGBA8 fixed-point buffer, do the right
thing in the shader to split each 32-bit index into 8-bit chunks
forming a vec4 tuple, and then call glReadPixels with GL_RGBA,
GL_UNSIGNED_BYTE. (Note that because of the rules associated with
GL_ELEMENT_ARRAY_BUFFER, glReadPixels will write to a staging buffer
which is then copied to the index buffer.) The question is: will the
"bit-casting" of the RGBA8-tuple data to GL_UNSIGNED_INT be platform
independent? I would really like to avoid the idea of reading the
32-bit values as (GL_RGBA_INTEGER, GL_UNSIGNED_BYTE) and issuing
transform feedback, as that adds a lot more data copying and
bandwidth.
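
For what it's worth, here is a sketch of the host-side half of the scheme. It assumes (my assumption, not stated above) that the shader packs the index with the low byte in R; glReadPixels with GL_RGBA/GL_UNSIGNED_BYTE then writes the channels to client memory in R, G, B, A byte order on every platform, so a shift-and-or reassembly per channel is platform independent, while a raw bit-cast through a Uint32Array view matches it only on a little-endian host:

```javascript
// Sketch: assumes the fragment shader packs a 32-bit index i as
//   vec4(float(i & 0xFFu), float((i >> 8u) & 0xFFu),
//        float((i >> 16u) & 0xFFu), float((i >> 24u) & 0xFFu)) / 255.0
// so that R holds the least significant byte.

// Pretend this is the readback for one pixel, encoding index 0x00012345.
const pixels = new Uint8Array([0x45, 0x23, 0x01, 0x00]); // R, G, B, A

// Endianness-independent reassembly: combine channels by shift-and-or.
function indexFromRGBA8(bytes, offset) {
  return (bytes[offset] |
          (bytes[offset + 1] << 8) |
          (bytes[offset + 2] << 16) |
          (bytes[offset + 3] << 24)) >>> 0;
}

const viaShift = indexFromRGBA8(pixels, 0);
console.log(viaShift.toString(16)); // "12345"

// The "bit-cast" the question asks about: reinterpret the same bytes
// as a Uint32Array. This equals viaShift only on a little-endian host.
const viaView = new Uint32Array(pixels.buffer, 0, 1)[0];
const littleEndian = new Uint8Array(Uint32Array.of(1).buffer)[0] === 1;
console.log(littleEndian ? viaView === viaShift : "big-endian host");
```

The shift-and-or form is what I would reach for if the spec does not pin down the cast; it costs one pass over the staging data either way.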

Best Regards,
 -Kevin Rogovin
