[Public WebGL] Rendering to HDR displays (10bits per color component)

Florian Bösch [email protected]
Thu Jul 12 04:16:00 PDT 2018


On Thu, Jul 12, 2018 at 12:33 PM, Javi Agenjo <[email protected]> wrote:

> but if Chrome supports HDR video rendering (as far as they say), there has
> to be some sort of pipeline outputting 10 bits, unless it all happens
> outside the pipeline through some sort of decoding chip inside the GPU.
>

My guess is that it's a feature of the hardware-accelerated video decoder.
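
For what it's worth, here's a rough (untested) sketch of how to check what the
default framebuffer reports from script; as far as I know it comes back as 8
bits per channel on current browsers:

    var canvas = document.createElement('canvas');
    var gl = canvas.getContext('webgl');
    // Bit depths of the default (canvas) framebuffer's color channels.
    console.log(
      gl.getParameter(gl.RED_BITS),
      gl.getParameter(gl.GREEN_BITS),
      gl.getParameter(gl.BLUE_BITS)
    ); // expect 8 8 8 on today's configurations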


> I'm asking because I'm working on a European project related to HDR (HDR4EU
> <https://www.upf.edu/web/hdr4eu>), and companies are pushing HDR
> displays for consumers, so there is reason to expect changes in the near
> future, with better quality and wider gamuts. So it would be interesting to
> see some suggestions about how browsers can adapt to that change over the
> next few years.
>
>

I would absolutely love HDR capability through the pipeline. The 8-bit-per-
channel convention is ridiculous nowadays because the actual display
hardware (especially OLED panels) is capable of far more gradations
(even if the decoder chips in the monitors aren't). Linear color space
floating point rendering is becoming the norm, only for the result to be
squashed into 8 bits per gamma-encoded channel. It's nuts.
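
To make that concrete, here's a rough sketch (WebGL 2, untested) of the usual
setup: render the scene in linear space into an RGBA16F attachment, then do a
tonemap/gamma pass into the default framebuffer, which is where everything
gets crushed down to 8 bits per channel:

    var gl = canvas.getContext('webgl2');
    // Needed to render into floating point color attachments.
    gl.getExtension('EXT_color_buffer_float');

    // Linear, half-float render target for the scene pass.
    var tex = gl.createTexture();
    gl.bindTexture(gl.TEXTURE_2D, tex);
    gl.texStorage2D(gl.TEXTURE_2D, 1, gl.RGBA16F, canvas.width, canvas.height);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);

    var fbo = gl.createFramebuffer();
    gl.bindFramebuffer(gl.FRAMEBUFFER, fbo);
    gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0,
                            gl.TEXTURE_2D, tex, 0);
    // ... draw the scene in linear space into the 16F target ...

    // Final pass: tonemap/gamma into the default framebuffer, which is
    // 8 bits per channel; this is where all the extra precision is lost.
    gl.bindFramebuffer(gl.FRAMEBUFFER, null);
    // ... draw a fullscreen quad sampling the 16F texture ...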
