[Public WebGL] gamma/color profile and sRGB

Florian Bösch [email protected]
Wed Nov 19 22:40:58 PST 2014


A frequent problem I encounter when rendering things in WebGL is that it's
difficult to produce a scene that looks approximately the same across a
wide variety of displays.

I identify several reasons for this:

   - The user's brightness and contrast settings
   - The display's color response curves
   - The operating system's color profile (if any)

I don't think there's much to be done about brightness and contrast
settings other than making sure to test at medium settings. Likewise for
the operating system's color profile specifically, there isn't much to do,
because the assumption is that whatever it does, it combines with the
monitor's response curve to produce a better picture.

This leaves us with the display's color response curve, which is a product
of the hardware itself, manufacturing deviations, and the operating
system's color profile setting. Some operating systems allow users to
change the color profile from a default (but most don't), while other
systems (such as mobile devices) usually ship with a factory-set color
calibration profile that cannot be changed. It's intended to account for
manufacturing differences, but since the calibration is done manually, it
often isn't perfect.

Ideally it would be sufficient to apply a simple gamma correction on
output (gl_FragColor.rgb = pow(color, vec3(1.0/2.2))), a method advocated
by this GPU Gem: http://http.developer.nvidia.com/GPUGems3/gpugems3_ch24.html
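
For reference, a minimal corrected fragment shader would look something
like this (uColor is just a stand-in for whatever linear color the shader
computes):

    precision mediump float;
    uniform vec3 uColor;  // linear-space color from the lighting math
    void main() {
        // encode with a plain 2.2 power curve; GLSL's pow wants matching types
        gl_FragColor = vec4(pow(uColor, vec3(1.0/2.2)), 1.0);
    }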

But this isn't exactly sRGB, and most consumer display hardware is intended
to display sRGB/Rec.709. I'm currently using an alternative function based
on the sRGB standard (http://www.color.org/srgb.pdf), but I'm not convinced
it's any better.
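
For comparison, the transfer function the standard specifies is piecewise:
linear near black, and a 1/2.4 power with an offset and scale above a small
threshold. A sketch of it in GLSL (input assumed in [0,1]):

    // piecewise sRGB encoding from the spec, applied per channel
    vec3 linearToSRGB(vec3 c) {
        vec3 lo = c * 12.92;                              // linear segment
        vec3 hi = 1.055 * pow(c, vec3(1.0/2.4)) - 0.055;  // power segment
        // step() selects the linear segment where c <= 0.0031308
        return mix(hi, lo, step(c, vec3(0.0031308)));
    }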

A convenient way to sidestep the problem of producing sRGB-conformant
output would be to render to an sRGB output target. This is currently not
possible: WebGL 1.0 contains no sRGB support. EXT_sRGB allows rendering to
an sRGB render target, but my understanding is that at the final blit to
screen, the shader would read out the linear values from the sRGB texture
and put them on screen, so no improvement there. WebGL 2.0 decouples the
context and drawing buffer somewhat, but neither its context attributes
nor its drawing buffer interface allows specifying the drawing buffer as
an sRGB surface. In any case, it's unclear that the right thing would
happen even if you could render into an sRGB drawing buffer in WebGL, as
the browser compositor is likely to just read out that texture and
composite its linear values with the rest of the page.
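
To illustrate the EXT_sRGB point: sampling an sRGB texture returns values
already decoded to linear, and the default drawing buffer applies no
encoding on write, so a blit shader like the sketch below (uScene and vUV
are illustrative names) would put linear data on screen:

    precision mediump float;
    uniform sampler2D uScene;  // color attachment with an sRGB internal format
    varying vec2 vUV;
    void main() {
        // texture2D decodes sRGB -> linear; nothing re-encodes on output
        gl_FragColor = texture2D(uScene, vUV);
    }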

I have recently done some work with GLFW, which has a function called
getGammaRamp (for a given display). It returns an array of values per
channel that represents the monitor's color response curve. These values
can be used in a shader, via a texture, for conversion.
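
As a sketch, assuming the ramp has been normalized to [0,1] and uploaded
as an Nx1 RGB lookup texture (the uRamp name and layout are my own choice):

    uniform sampler2D uRamp;  // Nx1 LUT built from the gamma ramp
    vec3 applyRamp(vec3 color) {
        // look up each channel in the corresponding channel of the LUT
        return vec3(
            texture2D(uRamp, vec2(color.r, 0.5)).r,
            texture2D(uRamp, vec2(color.g, 0.5)).g,
            texture2D(uRamp, vec2(color.b, 0.5)).b);
    }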

This way of handling color response curves strikes me as rather flexible,
because it isn't tied to a particular standard but can represent any
conceivable one.

My question is: do you think something like getGammaRamp would make sense
for inclusion in WebGL 2.0?