[Public WebGL] Why does the set of pname arguments for which the behavior of getShaderParameter is defined not match GL ES?

Gregg Tavares (勤) [email protected]
Wed Apr 18 18:40:09 PDT 2012


On Tue, Apr 17, 2012 at 9:20 PM, Boris Zbarsky <[email protected]> wrote:

>
> On 4/18/12 12:16 AM, Glenn Maynard wrote:
>
>> It could be specified in each function that takes an enum argument, eg.
>> replace the getShaderParameter definition with something like:
>>
>> "If /pname/ is not present in the following table, generate
>> GL_INVALID_ENUM.  Otherwise, return the value for /pname/ given
>> /shader/, using the specified type."
>>
>
> I think this would be most user-friendly, yes.
>
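
For concreteness, a check against that wording might look roughly like the
sketch below. This is not real conformance-suite code; it assumes an
existing WebGLRenderingContext named gl and a compiled shader, and the
three pnames listed are the ones the WebGL spec currently defines for
getShaderParameter.

    // The pnames for which getShaderParameter is defined in WebGL.
    var validPnames = [gl.SHADER_TYPE, gl.DELETE_STATUS, gl.COMPILE_STATUS];

    // True if an undefined pname generates INVALID_ENUM, per the
    // proposed wording.
    function generatesInvalidEnum(pname) {
      gl.getShaderParameter(shader, pname);
      return gl.getError() === gl.INVALID_ENUM;
    }
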
>
>  Not exactly: to test this fully, you'd need to test every value which
>> isn't explicitly supported.
>>
>
> Hmm... Yeah, fair.  Good catch on extensions.
>
>
>  It's probably reasonable to test every unsupported [0, 0xFFFF] value,
>> though, for every function that takes an enum.
>>
>
Um, no, this would not be reasonable IMO. I think you'll find that in some
browsers each GL error carries a lot of overhead, from logging it to help
developers find their mistakes to propagating it through multiple
subsystems, so checking all 65536 values will be far too slow, especially
if it's done for every function that takes an enum. Some functions take
2-5 enums, and testing every combination will almost certainly be too
slow.
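
To put a number on it, the brute-force test for even a single-enum entry
point is a loop like the following sketch (it reuses the validPnames list
from the sketch above; reportFailure is a hypothetical helper):

    // Exhaustively probe every value in [0, 0xFFFF] for one function.
    for (var pname = 0; pname <= 0xFFFF; ++pname) {
      if (validPnames.indexOf(pname) !== -1) continue; // skip defined pnames
      gl.getShaderParameter(shader, pname);
      if (gl.getError() !== gl.INVALID_ENUM) {
        reportFailure('getShaderParameter', pname); // hypothetical helper
      }
    }

That is 65536 calls for one parameter of one function; covering every
combination for a function that takes two enums is already 65536^2, about
4.3 billion calls.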


>
> Yeah, probably.  65000 is just not that big a number nowadays.  ;)
>
>
> -Boris
>