[Public WebGL] WEBGL_debug_shader_precision extension proposal

Kenneth Russell [email protected]
Fri Nov 7 16:50:14 PST 2014

On Thu, Nov 6, 2014 at 6:05 AM, Florian Bösch <[email protected]> wrote:
> On Thu, Nov 6, 2014 at 2:56 PM, Olli Etuaho <[email protected]> wrote:
>> WEBGL_shader_ast is a neat idea, but that would require a large spec if it was a WebGL extension, and it would add possibly unwanted constraints on how the parsing infrastructure in browsers should work.
> Maybe vendor folks could chime in here. I'd really like something like it, how feasible is it?

It's infeasible. Basically you'd have to cause JavaScript to run
during the (synchronous) shader compilation step. Even if the data
types could all be defined at the JavaScript level and serialized up
from the shader compiler, the upcall to JavaScript would break all
sorts of invariants in Chrome, at least, if not also other browsers.

On Thu, Nov 6, 2014 at 6:30 AM, Olli Etuaho <[email protected]> wrote:
> My only misgiving on using #extension rather than #pragma is that it seems
> incompatible with the idea that only calling getExtension() would be enough
> to enable the functionality. If #extension was used, I suppose it would mean
> that adding #extension WEBGL_debug_shader_precision : enable; would also be
> required in all shaders to enable the emulation. But maybe that would be
> better for consistency.

For consistency with other #pragmas, the #pragma should actually be
something like:

#pragma webgl_debug_shader_precision(off)

and it would be on by default if the WEBGL_debug_shader_precision
extension were enabled.

No strong feelings either way. It does seem better if the shaders
don't have to be modified in order to take advantage of this
extension.
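As context for what the extension would do regardless of how it is enabled: WEBGL_debug_shader_precision emulates the reduced precision of lowp/mediump qualifiers on hardware that computes at higher precision. A minimal sketch in Python of that kind of rounding, assuming mediump float corresponds to roughly 10 mantissa bits (the GLSL ES spec's minimum relative precision of 2^-10); the function name and bit count are illustrative, not from the proposal:

```python
import math

def quantize(x, mantissa_bits=10):
    """Round x to the given number of mantissa bits, emulating a
    reduced-precision float such as a mediump value (assumption:
    mediump ~ 10 mantissa bits, per GLSL ES minimum requirements)."""
    if x == 0.0 or math.isinf(x) or math.isnan(x):
        return x
    # Split into mantissa in [0.5, 1) and exponent, round the
    # mantissa onto a coarser grid, then reassemble.
    m, e = math.frexp(x)
    scale = 1 << mantissa_bits
    return math.ldexp(round(m * scale) / scale, e)

# pi survives only at reduced precision, much like an intermediate
# value computed in an emulated mediump shader expression.
print(quantize(math.pi))
```

The emulation in the actual extension would insert such rounding after shader operations at the GLSL level; this sketch only illustrates the numeric effect of dropping mantissa bits.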

