[Public WebGL] gl.enable(gl.TEXTURE_2D)

Kenneth Russell [email protected]
Wed Jan 20 15:48:09 PST 2010

On Tue, Jan 19, 2010 at 8:37 AM, Giles Thomas <[email protected]> wrote:
> Hi all,
> A quick follow-up question:
> 2010/1/13 Kenneth Russell <[email protected]>
>> TEXTURE_2D is not a valid enable bit in OpenGL ES 2.0 or, consequently,
>> WebGL. The enum was in the EnableCap section only because it was there in
>> the OpenGL ES 2.0 headers. I've moved its definition to the TextureTarget
>> section.
> Have there been versions of Chrome (or perhaps WebKit?) where
> gl.enable(gl.TEXTURE_2D) was erroneously required to enable textures?  A
> reader of my blog reports that Chrome 4.0.295 seems to need it with a
> specific OS/graphics card combination -- details here:
> http://learningwebgl.com/blog/?p=684&cpage=1#comment-787

If there is a sample that only works with that call in place, then it
isn't Chrome or WebKit imposing the requirement but the OpenGL
driver. It should not be necessary to enable the TEXTURE_2D bit on a
particular texture unit in order to sample it in a shader. You should
report this issue to the graphics card vendor.
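For reference, the correct ES 2.0-style setup binds the texture to a unit
and points the shader's sampler uniform at that unit; no enable call is
involved. A minimal sketch follows (the function and variable names are
illustrative, and a tiny stub stands in for a real WebGL context obtained
from a canvas, so the sketch runs standalone):

// Stub recording GL calls; in a page, use canvas.getContext("experimental-webgl").
const calls = [];
const gl = {
  TEXTURE0: 0x84C0,
  TEXTURE_2D: 0x0DE1,
  activeTexture: (unit) => calls.push(["activeTexture", unit]),
  bindTexture: (target, tex) => calls.push(["bindTexture", target, tex]),
  uniform1i: (loc, value) => calls.push(["uniform1i", loc, value]),
};

// Bind `texture` to texture unit `unit` and aim the sampler uniform at it.
function useTexture(gl, texture, unit, samplerLocation) {
  gl.activeTexture(gl.TEXTURE0 + unit);   // select the texture unit
  gl.bindTexture(gl.TEXTURE_2D, texture); // attach the texture to that unit
  gl.uniform1i(samplerLocation, unit);    // sampler reads from that unit
  // Note: no gl.enable(gl.TEXTURE_2D) here -- TEXTURE_2D is not a valid
  // enable bit in OpenGL ES 2.0, so WebGL rejects it with INVALID_ENUM.
}

useTexture(gl, "myTexture", 0, "uSamplerLoc");

Fixed-function GL used glEnable(GL_TEXTURE_2D) to switch texturing on, but
in a programmable pipeline the fragment shader alone decides whether a
texture is sampled.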

By the way, because the code is evolving quickly, at this point I only
recommend using WebGL inside of the latest Chromium builds. See
http://khronos.org/webgl/wiki/Getting_a_WebGL_Implementation for
instructions on downloading and running them.


You are currently subscribed to [email protected]
To unsubscribe, send an email to [email protected] with
the following command in the body of your email:

More information about the public_webgl mailing list