[Public WebGL] WEBGL_debug_shader_precision extension proposal

Florian Bösch [email protected]
Thu Nov 6 06:35:57 PST 2014


I don't have a preference either way; all my shaders go through a bit of
utility code that tacks things on or replaces some bits. The same probably
goes for most users of any kind of framework.
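The kind of utility-code wrapping mentioned above could be sketched like this; the helper name and the transform are illustrative assumptions, not part of any proposal in this thread:

```javascript
// Sketch only: wrap shaderSource on a prototype so every shader passes
// through a user-supplied transform before it reaches the driver. Taking
// the prototype as a parameter lets the same helper patch
// WebGLRenderingContext.prototype in a browser, or a mock object in tests.
function patchShaderSource(proto, transform) {
  const original = proto.shaderSource;
  proto.shaderSource = function (shader, source) {
    return original.call(this, shader, transform(source));
  };
}

// In a browser one might do (before compiling any shaders):
// patchShaderSource(WebGLRenderingContext.prototype, mySourceRewrite);
```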

On Thu, Nov 6, 2014 at 3:30 PM, Olli Etuaho <[email protected]> wrote:

>  My only misgiving about using #extension rather than #pragma is that it
> seems incompatible with the idea that only calling getExtension() would be
> enough to enable the functionality. If #extension was used, I suppose it
> would mean that adding #extension WEBGL_debug_shader_precision : enable;
> would also be required in all shaders to enable the emulation. But maybe
> that would be better for consistency.
>  ------------------------------
> *From:* Florian Bösch <[email protected]>
> *Sent:* Thursday, November 6, 2014 4:05 PM
> *To:* Olli Etuaho
> *Cc:* [email protected]
> *Subject:* Re: [Public WebGL] WEBGL_debug_shader_precision extension
> proposal
>
>   On Thu, Nov 6, 2014 at 2:56 PM, Olli Etuaho <[email protected]> wrote:
>
>>  WEBGL_shader_ast is a neat idea, but that would require a large spec if
>> it was a WebGL extension, and it would add possibly unwanted constraints on
>> how the parsing infrastructure in browsers should work.
>>
> Maybe vendor folks could chime in here. I'd really like something like it,
> how feasible is it?
>
>
>>  Implementing that as a JS library might actually be less effort. In
>> this case the additions to the API are minimal. Also, one important goal
>> here is to make using the emulation as easy as possible. To that end, just
>> a few added lines of code are much better than integrating a big library
>> into a JS app.
>>
> That's true, but it's only true if you don't have WEBGL_shader_ast :). If
> you had it, it'd be a few dozen lines of JS you could just drop in and have
> it modify the WebGLContext.prototype.shaderSource.
>
>
>>  I came up with an alternative for how toggling the emulation on a
>> shader-by-shader basis should work, by the way - using a "#pragma
>> webgl_disable_precision_emulation;" directive in shaders could be simpler
>> to both understand and implement. Thoughts on this?
>>
>
>  Since it is an extension how about "#extension
> WEBGL_debug_shader_precision : disable" ?
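
For reference, here is how the two proposed per-shader toggles would look in shader source, along with a trivial check a tool could use to detect them. Both directives are illustrative; neither had been specified at the time of this exchange:

```javascript
// Illustrative only: neither directive below is part of any published spec.
const fragmentShader = [
  "precision mediump float;",
  "// Olli's proposal:",
  "#pragma webgl_disable_precision_emulation",
  "// Florian's alternative:",
  "// #extension WEBGL_debug_shader_precision : disable",
  "void main() {",
  "  gl_FragColor = vec4(1.0);",
  "}",
].join("\n");

// Returns true if the source opts out of precision emulation via either
// of the proposed directives.
function precisionEmulationDisabled(source) {
  return /#pragma\s+webgl_disable_precision_emulation/.test(source) ||
         /#extension\s+WEBGL_debug_shader_precision\s*:\s*disable/.test(source);
}
```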
>