[Public WebGL] Why does the set of pname arguments for which the behavior of getShaderParameter is defined not match GL ES?
Wed Apr 18 19:07:24 PDT 2012
On Wed, Apr 18, 2012 at 8:40 PM, Gregg Tavares (勤) <[email protected]> wrote:
> Um, no, this would not be reasonable IMO. I think you'll find that some
> browsers have a lot of overhead with errors, from logging them to help
> developers find their mistakes to propagating them through multiple
> subsystems, and that checking 65536 values will be far too slow, especially
> for every function that takes an enum. Some take 2-5 enums. Testing every
> combination would likely be too slow.
Firefox's console is catastrophically broken; on my system, it takes about
4ms *per log*. That's purely a bug in Firefox that needs to be fixed.
(Frankly, I don't know how they rationalized shipping a feature in that
condition--it's too slow to even use during development, much of the time.
Chrome's console is reasonable, logging 64k values in about three
seconds.) Just close the console while you run tests until this is fixed.
(The tests can be moved to an isolated unit test, of course, perhaps with a
sanity check to abort if it takes far too long, to avoid hosing browsers if
people forget and leave the console open.)
(I'm on Firefox 9; this may well be fixed in the current version, though I
wouldn't put my own money on it.)
With the console closed, Firefox runs a 65536-enum test in 500ms; Chrome
does it in 3ms. It's completely reasonable to test [0, 0xFFFF] for all enum
parameters. This will help ensure, as far as is practical, that no unexpected
values are unintentionally accepted, for example via unknown vendor-specific
ES extensions leaking through.
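A minimal sketch of such a sweep, as I'd run it from a unit test: call
getShaderParameter for every 16-bit value, then use getError to see which
pnames the implementation accepted. The whitelist contents and the
`sweepShaderPnames` name are my own; only the loop structure reflects what's
proposed above.

```javascript
// Sweep every 16-bit enum value against getShaderParameter and collect
// any pname the implementation accepts beyond an expected whitelist.
// `gl` is a WebGL context (or a stub), `shader` a compiled shader object,
// `expected` a Set of pnames the spec defines behavior for.
function sweepShaderPnames(gl, shader, expected) {
  const unexpected = [];
  for (let pname = 0; pname <= 0xFFFF; ++pname) {
    gl.getShaderParameter(shader, pname);
    // getError also clears the error flag, so each iteration starts clean.
    const err = gl.getError();
    if (err === gl.NO_ERROR && !expected.has(pname)) {
      unexpected.push(pname);
    }
  }
  return unexpected; // empty if only whitelisted pnames are accepted
}
```

Run with the console closed, per the timings above; an abort-if-too-slow
sanity check can wrap the loop if the test might run with the console open.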