[Public WebGL] Why does the set of pname arguments for which the behavior of getShaderParameter is defined not match GL ES?

Boris Zbarsky [email protected]
Tue Apr 17 21:20:24 PDT 2012


On 4/18/12 12:16 AM, Glenn Maynard wrote:
> It could be specified in each function that takes an enum argument, eg.
> replace the getShaderParameter definition with something like:
>
> "If /pname/ is not present in the following table, generate
> GL_INVALID_ENUM.  Otherwise, return the value for /pname/ given
> /shader/, using the specified type."

I think this would be most user-friendly, yes.
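
For what it's worth, here's roughly what I'd picture that table-driven
wording turning into inside an implementation.  The pname set below is
just WebGL's getShaderParameter set; the error plumbing and storage are
made up for illustration:

  var INVALID_ENUM = 0x0500;

  // The "following table" from the proposed wording: pname -> type.
  var GET_SHADER_PARAMETER_TABLE = {
    0x8B4F: "enum",     // SHADER_TYPE
    0x8B80: "boolean",  // DELETE_STATUS
    0x8B81: "boolean"   // COMPILE_STATUS
  };

  function getShaderParameter(ctx, shader, pname) {
    // "If pname is not present in the following table, generate
    // GL_INVALID_ENUM."
    if (!(pname in GET_SHADER_PARAMETER_TABLE)) {
      ctx.pendingError = INVALID_ENUM;  // stand-in error machinery
      return null;
    }
    // "Otherwise, return the value for pname given shader, using the
    // specified type."
    return shader.params[pname];
  }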

> Not exactly: to test this fully, you'd need to test every value which
> isn't explicitly supported.

Hmm... Yeah, fair.  Good catch on extensions; an enabled extension can 
turn an otherwise-invalid pname into a valid one, so the set isn't fixed.

> It's probably reasonable to test every unsupported [0, 0xFFFF] value,
> though, for every function that takes an enum.

Yeah, probably.  65000 is just not that big a number nowadays.  ;)
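
The brute-force test is simple enough, too.  A sketch, again using 
getShaderParameter's valid set (a real test would do this for every 
enum-taking entry point, and would have to account for whatever 
extensions are enabled):

  var gl = document.createElement("canvas").getContext("webgl");
  var shader = gl.createShader(gl.VERTEX_SHADER);

  var valid = {};
  valid[gl.SHADER_TYPE] = true;
  valid[gl.DELETE_STATUS] = true;
  valid[gl.COMPILE_STATUS] = true;

  // Try every [0, 0xFFFF] value; unsupported ones should generate
  // INVALID_ENUM, supported ones no error at all.
  for (var pname = 0; pname <= 0xFFFF; ++pname) {
    gl.getShaderParameter(shader, pname);
    var err = gl.getError();
    var expected = valid[pname] ? gl.NO_ERROR : gl.INVALID_ENUM;
    if (err !== expected)
      console.log("pname 0x" + pname.toString(16) + ": got error 0x" +
                  err.toString(16) + ", expected 0x" +
                  expected.toString(16));
  }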

-Boris
