From kbr...@ Thu Sep 2 19:01:36 2010
From: kbr...@ (Kenneth Russell)
Date: Thu, 2 Sep 2010 19:01:36 -0700
Subject: [Public WebGL] Gamma correction and texImage2D/texSubImage2D
Message-ID:

In the WebGL specification we've added pixelStorei parameters UNPACK_FLIP_Y_WEBGL and UNPACK_PREMULTIPLY_ALPHA_WEBGL to handle common use cases when uploading images from the browser via texImage2D and texSubImage2D. It appears that we are going to need another one to determine whether or not gamma correction needs to be applied during the image upload. When displaying ordinary images on a web page, the browser always performs gamma correction on the image data.

WebGL applications that want to encode non-image data in images definitely do not want gamma correction to be applied; they want the verbatim pixel data uploaded. For the display of random web images in WebGL applications it is less clear. I don't know whether most 3D applications would want the gamma-corrected or non-gamma-corrected pixels uploaded, or in particular how gamma correction would interact with shading.

Questions:

1. What should the name of the new (boolean) pixelStorei parameter be? The name which would most closely match the other parameters would probably be UNPACK_CORRECT_GAMMA_WEBGL, where "correct" is a verb. However, this name is probably confusing (why would you ever want "incorrect" gamma?). UNPACK_PERFORM_GAMMA_CORRECTION_WEBGL?

2. What should the default value of this flag be? If it were false, then for images uploaded from the browser to WebGL, the default behavior would be for the pixels to be completely untouched. However, this might be surprising behavior to applications displaying images on screen with very simple shader code (no lighting) and expecting them to look the same as surrounding browser content. Still, I am inclined to suggest that the default for the new flag be false to match most other OpenGL behavior, where anything other than a pass-through of data is optional and disabled by default.
-Ken

-----------------------------------------------------------
You are currently subscribed to public_webgl...@
To unsubscribe, send an email to majordomo...@ with the following command in the body of your email:

From ced...@ Thu Sep 2 19:34:37 2010
From: ced...@ (Cedric Vivier)
Date: Fri, 3 Sep 2010 10:34:37 +0800
Subject: [Public WebGL] Gamma correction and texImage2D/texSubImage2D
In-Reply-To:
References:
Message-ID:

On Fri, Sep 3, 2010 at 10:01, Kenneth Russell wrote:
> Questions:
>
> 1. What should the name of the new (boolean) pixelStorei parameter be?
> The name which would most closely match the other parameters would
> probably be UNPACK_CORRECT_GAMMA_WEBGL, where "correct" is a verb.
> However, this name is probably confusing (why would you ever want
> "incorrect" gamma?). UNPACK_PERFORM_GAMMA_CORRECTION_WEBGL?

The latter certainly sounds less confusing.

> 2. What should the default value of this flag be? If it were false,
> then for images uploaded from the browser to WebGL, the default
> behavior would be for the pixels to be completely untouched. However,
> this might be surprising behavior to applications displaying images on
> screen with very simple shader code (no lighting) and expecting them
> to look the same as surrounding browser content.

IMHO this use case would only be likely with WebGL-based image editing; in most other applications (games, object viewers, etc.) the final pixels might be too transformed through perspective, filtering, mipmapping, lighting, normal maps, light maps and so on for slight gamma correction to really matter, so the default should be false for least surprise when using non-image data.

However, how does the browser typically handle gamma correction? Does it perform it depending on image metadata? The display color profile? A mixture of both?

Regards,
From ste...@ Thu Sep 2 20:51:40 2010
From: ste...@ (Steve Baker)
Date: Thu, 02 Sep 2010 22:51:40 -0500
Subject: [Public WebGL] Gamma correction and texImage2D/texSubImage2D
In-Reply-To:
References:
Message-ID: <4C8070CC.9030702@sjbaker.org>

Gamma correction is a tricky business. For typical 3D applications you want your images with NO gamma correction. That's because you're going to go on to perform lighting, fogging and a number of other operations on the texture before it's displayed.

The equation for gamma is:

  Vout = pow ( Vin, gamma ) ;

...where gamma is around 1/2.2 for CRTs and CRT-emulators such as LCDs and plasma displays. It is clearly the case that in general:

  pow ( Vin * light, gamma ) != pow ( Vin, gamma ) * light

So gamma-correcting the input is in no way a substitute for gamma-correcting the output.

Put non-mathematically - the main thing that gamma correction does is to increase the contrast in dim areas and reduce it in bright areas as a better match for the non-linearities inherent in CRTs. Gamma-correcting the input to the renderer can do nothing to increase the brightness in areas where it is dark because there is only a little light being cast. So you still get overly dark areas in the resulting rendering - even though you pre-gamma'd the texture.

You could (in principle) build a really complicated lighting and fogging algorithm that applied light in a non-linear way to preserve the gamma correction...but the math is ugly and it has to be done in the fragment shader. However, there are other things going on in the graphics pipeline such as magnification and minification, antialiasing, alpha blending and compositing - all of which are inherently linear operations over which we have no software control whatever.

The RIGHT thing is therefore to provide linear textures, to do your rendering in the linear domain - and then to apply gamma correction to the FINAL image.
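Steve's inequality is easy to check numerically. A minimal sketch (the 1/2.2 exponent and the sample texel/light values here are purely illustrative):

```javascript
// Gamma-encode a linear value with the usual display exponent of 1/2.2.
const gamma = 1 / 2.2;
const encode = (v) => Math.pow(v, gamma);

const texel = 0.5;  // linear-space texel value
const light = 0.25; // linear-space light intensity

// Light in linear space, then gamma-encode the result (correct):
const litThenEncoded = encode(texel * light);
// Gamma-encode the texture first, then apply lighting (wrong):
const encodedThenLit = encode(texel) * light;

// The two differ substantially: for these values the pre-corrected
// path comes out less than half as bright as the correct one.
console.log(litThenEncoded, encodedThenLit);
```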
Any other way of doing it is mathematically wrong - and looks noticeably nasty. Hence, the default should be (as it is with OpenGL) to NOT mess with the texel data...at least not by default.

There are actually three distinct cases to consider:

1) Your source texture came from a camera or something else that applies gamma correction before the image is saved. In this case, you need to apply reverse gamma-correction to that image in an effort to get a linear texture - then do your lighting - then gamma-correct the final rendering.

2) Your source texture is already in linear space - you'll do lighting in linear space - and then you'll need to do gamma correction on the final rendering.

3) You are doing no lighting/blending/mipmapping/fog/etc and (for some reason) you have also chosen not to do gamma correction at the end. In that case and ONLY in that case, you should gamma-correct your textures on input.

I maintain that very few WebGL applications will do (3).

IMHO, the option should be to direct the browser's compositor to apply a gamma-correcting shader as it does final image composition with the output of the application's canvas. That way everything prior to that is in linear color space, where life is easy.

Doing inverse gamma-correction of images that have somehow been gamma-corrected already (JPEGs, mostly) or of things grabbed off-screen that have already been gamma-corrected once is perhaps defensible as "the best you can do under trying circumstances" in case #1, above - but we shouldn't design the system to do that by default because going into and out of gamma produces lots of roundoff error.

-- Steve

Cedric Vivier wrote:
> On Fri, Sep 3, 2010 at 10:01, Kenneth Russell wrote:
>
> Questions:
>
> 1. What should the name of the new (boolean) pixelStorei parameter be?
> The name which would most closely match the other parameters would
> probably be UNPACK_CORRECT_GAMMA_WEBGL, where "correct" is a verb.
> However, this name is probably confusing (why would you ever want
> "incorrect" gamma?). UNPACK_PERFORM_GAMMA_CORRECTION_WEBGL?
>
> The latter certainly sounds less confusing.
>
> 2. What should the default value of this flag be? If it were false,
> then for images uploaded from the browser to WebGL, the default
> behavior would be for the pixels to be completely untouched. However,
> this might be surprising behavior to applications displaying images on
> screen with very simple shader code (no lighting) and expecting them
> to look the same as surrounding browser content.
>
> IMHO this use case would only be likely with WebGL-based image
> editing, in most other applications (games, object viewers, etc) the
> final pixels might be too transformed through perspective, filtering,
> mipmapping, lighting, normal maps, light maps and so on, for slight
> gamma correction to really matter, so default should be false for
> least surprise when using non-image data.
>
> However how does the browser typically handle gamma correction? Does
> it perform it depending on image metadata? Display color profile? A
> mixture of both?
>
> Regards,

From cal...@ Thu Sep 2 22:53:24 2010
From: cal...@ (Mark Callow)
Date: Fri, 03 Sep 2010 14:53:24 +0900
Subject: [Public WebGL] Gamma correction and texImage2D/texSubImage2D
In-Reply-To:
References:
Message-ID: <4C808D54.2050306@hicorp.co.jp>

Given that most web images are already "gamma corrected" for CRTs and CRT-emulating LCDs, exactly what "gamma correction"* are web browsers doing? Do they pay attention to the ICC profile or color space information of those formats where it can be embedded (.png, EXIF .jpg, etc)?
The correct pipeline for 3D graphics when using an image for texturing is 'convert from image color-space to physically linear space' -> 'apply texture, lights etc.' -> 'convert from physically linear to display color-space'.

I am 99.9% sure that the "gamma correction" being done by browsers is not to a physically linear space. If it were, the images would look dreadful on an sRGB display, i.e. the vast majority of displays.

OpenGL has traditionally ignored color spaces. However, recent versions have support for sRGB textures and sRGB framebuffers, which implements the above pipeline. These features will almost certainly appear in a future version of OpenGL ES, so WebGL needs to be very careful about what it does in this area.

* There is a school of thought that says CRTs have nothing that needs to be corrected. They are the perfect decoders for a perceptually-linear encoding of the image data, i.e. an encoding that uses the bits/bandwidth only for changes that are visible to a human eye, such as sRGB. "Gamma correctors" are in fact encoders.

Regards

-Mark

On 03/09/2010 11:01, Kenneth Russell wrote:
> ... When displaying ordinary images on a web page, the
> browser always performs gamma correction on the image data.

From cal...@ Thu Sep 2 23:01:08 2010
From: cal...@ (Mark Callow)
Date: Fri, 03 Sep 2010 15:01:08 +0900
Subject: [Public WebGL] Gamma correction and texImage2D/texSubImage2D
In-Reply-To: <4C8070CC.9030702@sjbaker.org>
References: <4C8070CC.9030702@sjbaker.org>
Message-ID: <4C808F24.6070805@hicorp.co.jp>

On 03/09/2010 12:51, Steve Baker wrote:
> ...
> 3) You are doing no lighting/blending/mipmapping/fog/etc and (for some
> reason) you have also chosen not to do gamma correction at the end.
> In that case and ONLY in that case, you should gamma-correct your
> textures on input.
>
> I maintain that very few WebGL applications will do (3).

I think that the number of mobile devices which have "gamma correctors" is approaching 0 and, with the exception of doing the "correction" in a shader (which will screw up blending), control of any such "correctors" is outside OpenGL. So I suspect the number of applications doing 3 is quite large.

Regards

-Mark

From ste...@ Fri Sep 3 06:15:11 2010
From: ste...@ (ste...@)
Date: Fri, 3 Sep 2010 06:15:11 -0700
Subject: [Public WebGL] Gamma correction and texImage2D/texSubImage2D
In-Reply-To: <4C808F24.6070805@hicorp.co.jp>
References: <4C8070CC.9030702@sjbaker.org> <4C808F24.6070805@hicorp.co.jp>
Message-ID: <75999fb813488db76c8bb94325e4c07f.squirrel@webmail.sjbaker.org>

> On 03/09/2010 12:51, Steve Baker wrote:
>> ...
>> 3) You are doing no lighting/blending/mipmapping/fog/etc and (for some
>> reason) you have also chosen not to do gamma correction at the end. In
>> that case and ONLY in that case, you should gamma-correct your textures
>> on input.
>>
>> I maintain that very few WebGL applications will do (3).
> I think that the number of mobile devices which have "gamma correctors"
> is approaching 0 and, with the exception of doing the "correction" in a
> shader (which will screw up blending), control of any such "correctors"
> is outside OpenGL. So I suspect the number of applications doing 3 is
> quite large.

I strongly disagree with every single thing you just said!

1) EVERY mobile device that can support WebGL is capable of rendering the final image to the screen (the "compositing" stage) by drawing a textured quadrilateral using a gamma-correcting shader. There is no need for custom gamma-correcting CLUT hardware anymore...that's what we have shaders for.
2) If you decided to put the gamma correction at the end of the shader(s) that you use for rendering your 3D scene (which I most certainly don't advocate!), it would indeed "screw up blending" - but less so than applying the gamma to the textures before the shader runs. Gamma is a non-linear effect, and as such has to come after all of the linear effects in the rendering pipeline.

3) You say that control of external gamma correctors is outside of OpenGL - that's true, but I didn't suggest that we have to use an external gamma corrector. I specifically said that we can fix gamma using a shader in the compositing stage.

4) You can't say how many applications are "doing 3" because there are (by definition) no finished WebGL applications yet (because the specification isn't 100% finished). The only applications that might fall into class (3) are the ones that don't do ANY lighting/anti-aliasing/MIPmapping/texture-magnification/fogging/alpha-blending or translucent-canvas compositing. Basically, every single 3D application is class (1) or (2) - and preferably class (2), because (1) is an ugly kludge. True class (3) applications should probably be using the 2D canvas directly.

If the specification were to say that the compositor does gamma correction by default (possibly with the option to turn that off for people who don't want it for some very specific reason) then everyone should be happy and we do things correctly without any nasty kludges hardwired into the system.
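Steve's proposal amounts to one extra full-screen textured-quad pass at composite time. A hypothetical sketch of what such a compositing fragment shader could look like (the uniform and varying names are invented for illustration; this is not taken from any browser's actual compositor):

```javascript
// Hypothetical fragment shader for a gamma-correcting compositing pass:
// sample the application's linear-space canvas and encode for the display.
const compositeFragmentShader = `
precision mediump float;
uniform sampler2D u_canvas; // the WebGL canvas, in linear color space
uniform float u_gamma;      // display gamma, e.g. 2.2 for a typical monitor
varying vec2 v_texCoord;
void main() {
  vec4 c = texture2D(u_canvas, v_texCoord);
  gl_FragColor = vec4(pow(c.rgb, vec3(1.0 / u_gamma)), c.a);
}`;
```

Because the exponent is a uniform, the same pass could be retargeted at a device with very different gamma (Steve's inkjet-printer example later in the thread) without touching any textures.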
-- Steve

From cma...@ Fri Sep 3 08:47:59 2010
From: cma...@ (Chris Marrin)
Date: Fri, 03 Sep 2010 08:47:59 -0700
Subject: [Public WebGL] Gamma correction and texImage2D/texSubImage2D
In-Reply-To:
References:
Message-ID: <26BED01D-1031-4075-A054-16EF2D7B7BBD@apple.com>

On Sep 2, 2010, at 7:01 PM, Kenneth Russell wrote:
> In the WebGL specification we've added pixelStorei parameters
> UNPACK_FLIP_Y_WEBGL and UNPACK_PREMULTIPLY_ALPHA_WEBGL to handle
> common use cases when uploading images from the browser via texImage2D
> and texSubImage2D. It appears that we are going to need another one to
> determine whether or not gamma correction needs to be applied during
> the image upload. When displaying ordinary images on a web page, the
> browser always performs gamma correction on the image data.
>
> WebGL applications that want to encode non-image data in images
> definitely do not want gamma correction to be applied; they want the
> verbatim pixel data uploaded. For the display of random web images in
> WebGL applications it is less clear. I don't know whether most 3D
> applications would want to have the gamma-corrected or
> non-gamma-corrected pixels uploaded, and in particular any
> interactions with shading.
>
> Questions:
>
> 1. What should the name of the new (boolean) pixelStorei parameter be?
> The name which would most closely match the other parameters would
> probably be UNPACK_CORRECT_GAMMA_WEBGL, where "correct" is a verb.
> However, this name is probably confusing (why would you ever want
> "incorrect" gamma?). UNPACK_PERFORM_GAMMA_CORRECTION_WEBGL?

I think UNPACK_GAMMA_CORRECTION_WEBGL is clear enough. Even though it's not a verb, it's shorter and I think the idea gets across.

> 2. What should the default value of this flag be?
> If it were false,
> then for images uploaded from the browser to WebGL, the default
> behavior would be for the pixels to be completely untouched. However,
> this might be surprising behavior to applications displaying images on
> screen with very simple shader code (no lighting) and expecting them
> to look the same as surrounding browser content. Still, I am inclined
> to suggest that the default for the new flag be false to match most
> other OpenGL behavior, where anything other than a pass-through of
> data is optional and disabled by default.

I think it would be useful to have the unlit case behave the same as rendering to a 2D canvas, which would gamma correct. I believe the differences in the lit case would be subtle and it's only if authors are trying to be very precise that they will care. In that case, they can turn it off. But my opinion on this is not strong.

-----
~Chris
cmarrin...@

From cma...@ Fri Sep 3 08:51:28 2010
From: cma...@ (Chris Marrin)
Date: Fri, 03 Sep 2010 08:51:28 -0700
Subject: [Public WebGL] Gamma correction and texImage2D/texSubImage2D
In-Reply-To:
References:
Message-ID: <6C663C21-FEE2-4CD6-852A-3EE4C28457B1@apple.com>

On Sep 2, 2010, at 7:34 PM, Cedric Vivier wrote:
> On Fri, Sep 3, 2010 at 10:01, Kenneth Russell wrote:
> Questions:
>
> 1. What should the name of the new (boolean) pixelStorei parameter be?
> The name which would most closely match the other parameters would
> probably be UNPACK_CORRECT_GAMMA_WEBGL, where "correct" is a verb.
> However, this name is probably confusing (why would you ever want
> "incorrect" gamma?). UNPACK_PERFORM_GAMMA_CORRECTION_WEBGL?
>
> The latter certainly sounds less confusing.
>
> 2. What should the default value of this flag be?
> If it were false,
> then for images uploaded from the browser to WebGL, the default
> behavior would be for the pixels to be completely untouched. However,
> this might be surprising behavior to applications displaying images on
> screen with very simple shader code (no lighting) and expecting them
> to look the same as surrounding browser content.
>
> IMHO this use case would only be likely with WebGL-based image editing, in most other applications (games, object viewers, etc) the final pixels might be too transformed through perspective, filtering, mipmapping, lighting, normal maps, light maps and so on, for slight gamma correction to really matter, so default should be false for least surprise when using non-image data.

But I'm not sure how "surprising" it would be to use a gamma-corrected image in these cases. With all those manipulations on the texture, I'm not sure if most authors would even notice whether the image is gamma corrected or not. So it seems like having it true gives you the more basic case, and switching it to false is a more advanced usage.

> However how does the browser typically handle gamma correction? Does it perform it depending on image metadata? Display color profile? A mixture of both?

AFAIK, gamma correction is done to make images look right on the selected display. It has nothing to do with data in the source image. I believe some images might have color correction information in them, but that's different from gamma correction.

-----
~Chris
cmarrin...@

From ste...@ Fri Sep 3 10:04:26 2010
From: ste...@ (ste...@)
Date: Fri, 3 Sep 2010 10:04:26 -0700
Subject: [Public WebGL] Gamma correction and texImage2D/texSubImage2D
In-Reply-To: <26BED01D-1031-4075-A054-16EF2D7B7BBD@apple.com>
References: <26BED01D-1031-4075-A054-16EF2D7B7BBD@apple.com>
Message-ID:

>> 2. What should the default value of this flag be?
>> If it were false,
>> then for images uploaded from the browser to WebGL, the default
>> behavior would be for the pixels to be completely untouched. However,
>> this might be surprising behavior to applications displaying images on
>> screen with very simple shader code (no lighting) and expecting them
>> to look the same as surrounding browser content. Still, I am inclined
>> to suggest that the default for the new flag be false to match most
>> other OpenGL behavior, where anything other than a pass-through of
>> data is optional and disabled by default.
>
> I think it would be useful to have the unlit case behave the same as
> rendering to a 2D canvas, which would gamma correct. I believe the
> differences in the lit case would be subtle and it's only if authors are
> trying to be very precise that they will care. In that case, they can turn
> it off. But my opinion on this is not strong.

Look, I hate to keep going on about this - but this is REALLY important and you're going down completely the wrong track. I have to make yet another impassioned plea to NOT fuck this up for web rendering for all time.

If you gamma-correct in the compositor, as I recommend, then everything works right - if you do it ANYWHERE else then you've provided a default behavior that is mathematically and visually incorrect. That would be a ridiculous decision. I've seen some bad misunderstandings and misimplementations of gamma correction in the 25 years I've been doing 3D - but the idea of deliberately building incorrect gamma math into a major new graphics standard is quite indefensible...particularly since the "right" solution is easier and more efficient!

If you "correct" the gamma on the input images - then that implies that you're NOT going to correct it on the output canvas.
If that's the case, then even correctly written software that turns off that input processing will fail to have the right gamma on output...to fix that, I'd have to do an additional post-effect rendering pass on my final output. But that's a total waste because the browser is going to do a final-final pass when it composites my image into the final screen. THAT is where the gamma should be corrected. It's mathematically the right place - it's the right place for good performance, because I don't have to run another pass over the screen to correct the gamma - and it means that every application (even the badly written ones by uncaring users) will get nicely corrected gamma.

The differences are far from subtle. Go play any video game with a dark gloomy setting without corrected gamma and you'll immediately see the difference, because you'll pretty much be staring at a black screen!

You seem to be under the misapprehension that lighting is the only issue here. That couldn't be further from the truth. Even in the simplest, "unlit" case - if you specify a 50% blend of two textures (A and B) or a 50% alpha-blend overlay of one pre-corrected image over another - the answer will be flat out WRONG because:

  pow(A/2+B/2,gamma) != pow(A/2,gamma)+pow(B/2,gamma)

In what way is that "subtle"? It's basic arithmetic - and if you plug in a realistic set of values for A, B and gamma, you'll see that the numerical errors are extreme over some ranges of A and B. If you want, I'll do the analysis for you - but to be honest it's pretty damned obvious! And it's not just to do with lighting. ANY linear operator (add, subtract, multiply, divide, lerp, texture-interpolation, etc) has to happen in linear space before gamma, because that's what the hardware does and we can't change that.
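Plugging sample numbers into the 50% blend case makes the point concrete. A small sketch (the values are illustrative; the pre-corrected blend is written the way blending hardware would actually compute it, enc(A)/2 + enc(B)/2):

```javascript
// Gamma-encode with the display exponent 1/2.2.
const enc = (v) => Math.pow(v, 1 / 2.2);

const A = 0.8, B = 0.1; // two linear-space texel values

// Blend in linear space, then encode once at the end (correct):
const blendThenEncode = enc(A / 2 + B / 2);
// Encode each texture up front, then let the hardware blend (wrong):
const encodeThenBlend = enc(A) / 2 + enc(B) / 2;

// The error is around 0.07 on a 0..1 scale for these inputs, and it
// grows as one of the inputs approaches black.
console.log(blendThenEncode, encodeThenBlend);
```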
Pick an even simpler example: If your WebGL code does nothing more than rotate your image 10 degrees or stretch it by 10% before drawing it, then because the bilinear texel blend that the hardware does operates in linear color space, if you pre-gamma the image then it'll alias.

Since both WebKit and Firefox are moving to use OpenGL for compositing, sticking the gamma fix into the compositing shader is a freebie. Now you don't have to mess with the incoming textures, and all of the math works out correctly in both the simple case AND the lit case.

Suppose we're rendering to a device that needs very different gamma (an inkjet printer, for example). The JavaScript code has no way to find out what the correct gamma for the device is - and even if it could, with the proposed 'fix' it would have to re-download and have WebGL re-convert all of the textures. But doing it in compositing means that the browser can provide a user option to set the desired gamma - and for it to default to 2.2 for the CRT and whatever it needs to be for printing.

Doing it in the compositor is pretty much free - and mathematically correct - and convenient for printing - and applies to every application by default. Doing what you are proposing is just bad mathematics, inept graphics and an incredibly poor standardization decision. We can allow fucked-up math as an option - but there is no way it should be the default.

Please, let's do this right.
-- Steve

From cma...@ Fri Sep 3 10:16:14 2010
From: cma...@ (Chris Marrin)
Date: Fri, 03 Sep 2010 10:16:14 -0700
Subject: [Public WebGL] Proposed change to WebGL Event definition
In-Reply-To: <10C92111-C1E5-429F-AF48-D4737C07BCEE@apple.com>
References: <33F15A1B-FF87-44F1-ABA3-632494FA649A@apple.com> <817D1DD8-125C-44B0-95FF-C288EECB06E0@apple.com> <10C92111-C1E5-429F-AF48-D4737C07BCEE@apple.com>
Message-ID:

I've revised the event section (5.14). Please review.

-----
~Chris
cmarrin...@

From enn...@ Fri Sep 3 10:42:45 2010
From: enn...@ (Adrienne Walker)
Date: Fri, 3 Sep 2010 10:42:45 -0700
Subject: [Public WebGL] Gamma correction and texImage2D/texSubImage2D
In-Reply-To: <26BED01D-1031-4075-A054-16EF2D7B7BBD@apple.com>
References: <26BED01D-1031-4075-A054-16EF2D7B7BBD@apple.com>
Message-ID:

On 3 September 2010 08:47, Chris Marrin wrote:
>
> I think it would be useful to have the unlit case behave the same as
> rendering to a 2D canvas, which would gamma correct. I believe the
> differences in the lit case would be subtle and it's only if authors are
> trying to be very precise that they will care. In that case, they can
> turn it off. But my opinion on this is not strong.

I agree that the unlit case should ideally behave the same as rendering to a 2D canvas. However, as Steve points out, this would be much better implemented as a context creation attribute that the compositor could respect. It could default to having gamma correction turned on.
Additionally, if you need a packing flag for texture loads, I think the most useful operation is the opposite of the one proposed--to transform non-linear input textures into the appropriate linear space for lighting. Using non-linear textures as storage and input arguably gives you more color resolution in the dark part of the spectrum, so it might be useful to support that. D3DSAMP_SRGBTEXTURE is an example of this sort of texture load flag.

-enne

From ste...@ Fri Sep 3 11:39:09 2010
From: ste...@ (ste...@)
Date: Fri, 3 Sep 2010 11:39:09 -0700
Subject: [Public WebGL] Gamma correction and texImage2D/texSubImage2D
In-Reply-To:
References: <26BED01D-1031-4075-A054-16EF2D7B7BBD@apple.com>
Message-ID: <29c135daaf4793a5e64da07193a81e15.squirrel@webmail.sjbaker.org>

> On 3 September 2010 08:47, Chris Marrin wrote:
>>
>> I think it would be useful to have the unlit case behave the same as
>> rendering to a 2D canvas, which would gamma correct. I believe the
>> differences in the lit case would be subtle and it's only if authors are
>> trying to be very precise that they will care. In that case, they can
>> turn it off. But my opinion on this is not strong.
>
> I agree that the unlit case should ideally behave the same as
> rendering to a 2D canvas. However, as Steve points out, this would be
> much better implemented as a context creation attribute that the
> compositor could respect. It could default to having gamma correction
> turned on.
>
> Additionally, if you need a packing flag for texture loads, I think
> the most useful operation is the opposite of the one proposed--to
> transform non-linear input textures into the appropriate linear space
> for lighting.
> Using non-linear textures as storage and input arguably
> gives you more color resolution in the dark part of the spectrum, so
> it might be useful to support that. D3DSAMP_SRGBTEXTURE is an example
> of this sort of texture load flag.
>
> -enne

Yes - the reverse operation (to turn a pre-gamma-corrected image into a linear color space texture) is much more useful - especially in an environment where JPEG images are common and we might wish to take as input other things that the browser has generated that might already be gamma corrected.

At first sight, fixing pre-gamma'd images back to linear seems do-able in the shader. (Since the gamma operation is Vout=pow(Vin,1.0/2.2), the inverse of that is Vin = pow(Vout,2.2)...which you can approximate as pow(Vout,2.0) - which is Vin=Vout*Vout.)

However, you can only do that after texture lookup - and because that entails a bunch of linear interpolations, you shouldn't really be doing those in gamma-space. So there is certainly justification for reversing the gamma correction as the texture is loaded. Moreover, many image file formats actually tell you what gamma they were stored with - so the loader could do a really excellent job by honoring that number.
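The quality of the pow(Vout, 2.0) shortcut is easy to gauge numerically. A sketch with one illustrative value (the exact decode uses the 2.2 exponent quoted above):

```javascript
// What an image file stores (gamma-encoded), and two ways to decode it.
const encode = (v) => Math.pow(v, 1 / 2.2);
const decodeExact = (v) => Math.pow(v, 2.2);
const decodeApprox = (v) => v * v; // the cheap pow(v, 2.0) shader shortcut

const linear = 0.3;                  // original linear-space value
const stored = encode(linear);       // what ends up in the file
const exact = decodeExact(stored);   // recovers the original value
const approx = decodeApprox(stored); // close, but slightly too bright

console.log(exact, approx);
```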
-- Steve

From bja...@ Fri Sep 3 11:51:57 2010
From: bja...@ (Benoit Jacob)
Date: Fri, 3 Sep 2010 11:51:57 -0700 (PDT)
Subject: [Public WebGL] Gamma correction and texImage2D/texSubImage2D
In-Reply-To: <29c135daaf4793a5e64da07193a81e15.squirrel@webmail.sjbaker.org>
Message-ID: <894264702.470377.1283539917988.JavaMail.root@cm-mail03.mozilla.org>

----- Original Message -----
> > On 3 September 2010 08:47, Chris Marrin wrote:
> >>
> >> I think it would be useful to have the unlit case behave the same as
> >> rendering to a 2D canvas, which would gamma correct. I believe the
> >> differences in the lit case would be subtle and it's only if authors
> >> are trying to be very precise that they will care. In that case, they
> >> can turn it off. But my opinion on this is not strong.
> >
> > I agree that the unlit case should ideally behave the same as
> > rendering to a 2D canvas. However, as Steve points out, this would be
> > much better implemented as a context creation attribute that the
> > compositor could respect. It could default to having gamma correction
> > turned on.
> >
> > Additionally, if you need a packing flag for texture loads, I think
> > the most useful operation is the opposite of the one proposed--to
> > transform non-linear input textures into the appropriate linear space
> > for lighting. Using non-linear textures as storage and input arguably
> > gives you more color resolution in the dark part of the spectrum, so
> > it might be useful to support that. D3DSAMP_SRGBTEXTURE is an example
> > of this sort of texture load flag.
> > > > -enne > > Yes - the reverse operation (to turn a pre-gamma-corrected image into > a > linear color space texture) is much more useful - especially in an > environment where JPEG images are common and we might wish to take as > input other things that the browser has generated that might already > be > gamma corrected. > > At first sight, fixing pre-gamma'd images back to linear seems do-able > in > the shader. > > (Since the gamma operation is Vout=pow(Vin,1.0/2.2) - the inverse of > that > is Vin = pow(Vout,2.2)...which you can approximate as pow(Vout,2.0) - > which is Vin=Vout*Vout). Doesn't GLSL ES have pow(x,y) ? Even if we couldn't rely on a pow(x,y) function being available we could still approximate it with basic arithmetic without having to do such an approximation as 2.2 ~= 2.0. For example, could do pow(x,y) = exp(y*log(x)) and if needed implement exp() and log() using polynomial approximations (taylor series for exp and Pade approximant for log). Benoit > > However, you can only do that after texture-lookup - and because that > entails a bunch of linear interpolations, you shouldn't really be > doing > that in gamma-space. So there is certainly justification for reversing > the gamma correction as the texture is loaded. Moreover, many image > file > formats actually tell you what gamma they were stored with - so the > loader > could do a really excellent job by honoring that number. 
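[Editor's note: Benoit's identity can be checked directly. A small sketch, in ordinary Python standing in for a shading language that (hypothetically) lacked a built-in pow, implementing pow(x, y) as exp(y * log(x)) for x > 0:]

```python
import math

# Sketch of Benoit's point: for x > 0, pow(x, y) == exp(y * log(x)),
# so a shading language only needs exp() and log() (or polynomial
# approximations of them) to undo a 2.2 gamma exactly, with no need
# for the 2.2 ~= 2.0 approximation.

def pow_via_exp_log(x, y):
    return math.exp(y * math.log(x))

if __name__ == "__main__":
    for x in (0.1, 0.25, 0.5, 0.9):
        exact = x ** 2.2               # reverse gamma: gamma-space -> linear
        rewritten = pow_via_exp_log(x, 2.2)
        print("%.2f -> %.6f (library pow: %.6f)" % (x, rewritten, exact))
```

(As Benoit notes, GLSL ES does in fact provide pow(); the rewrite only matters if one cannot rely on it.)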
> > -- Steve > > > -- Steve > > > -- Steve > > > ----------------------------------------------------------- > You are currently subscribed to public_webgl...@ > To unsubscribe, send an email to majordomo...@ with > the following command in the body of your email: ----------------------------------------------------------- You are currently subscribed to public_webgl...@ To unsubscribe, send an email to majordomo...@ with the following command in the body of your email: From kbr...@ Fri Sep 3 13:58:27 2010 From: kbr...@ (Kenneth Russell) Date: Fri, 3 Sep 2010 13:58:27 -0700 Subject: [Public WebGL] Gamma correction and texImage2D/texSubImage2D In-Reply-To: <29c135daaf4793a5e64da07193a81e15.squirrel@webmail.sjbaker.org> References: <26BED01D-1031-4075-A054-16EF2D7B7BBD@apple.com> <29c135daaf4793a5e64da07193a81e15.squirrel@webmail.sjbaker.org> Message-ID: On Fri, Sep 3, 2010 at 11:39 AM, wrote: >> El 3 de septiembre de 2010 08:47, Chris Marrin >> escribi?: >>> >>> I think it would be useful to have the unlit case behave the same as >>> rendering to a 2D canvas, which would gamma correct. I believe the >>> differences in the lit case would be subtle and it's only if authors are >>> trying to be very precise that they will care. In that case, they can >>> turn it off. But my opinion on this is not strong. >> >> I agree that the unlit case should ideally behave the same as >> rendering to a 2D canvas. However, as Steve points out, this would be >> much better implemented as a context creation attribute that the >> compositor could respect. ?It could default to having gamma correction >> turned on. >> >> Additionally, if you need a packing flag for texture loads, I think >> the most useful operation is the opposite of the one proposed--to >> transform non-linear input textures into the appropriate linear space >> for lighting. 
>> Using non-linear textures as storage and input arguably
>> gives you more color resolution in the dark part of the spectrum, so
>> it might be useful to support that. D3DSAMP_SRGBTEXTURE is an example
>> of this sort of texture load flag.
>>
>> -enne
>
> Yes - the reverse operation (to turn a pre-gamma-corrected image into a
> linear color space texture) is much more useful - especially in an
> environment where JPEG images are common and we might wish to take as
> input other things that the browser has generated that might already be
> gamma corrected.
>
> At first sight, fixing pre-gamma'd images back to linear seems do-able in
> the shader.
>
> (Since the gamma operation is Vout=pow(Vin,1.0/2.2) - the inverse of that
> is Vin = pow(Vout,2.2)...which you can approximate as pow(Vout,2.0) -
> which is Vin=Vout*Vout).
>
> However, you can only do that after texture-lookup - and because that
> entails a bunch of linear interpolations, you shouldn't really be doing
> that in gamma-space. So there is certainly justification for reversing
> the gamma correction as the texture is loaded. Moreover, many image file
> formats actually tell you what gamma they were stored with - so the loader
> could do a really excellent job by honoring that number.

Based on your above descriptions and the above discussion I'm well
convinced that the default behavior should not be to apply gamma
correction to images uploaded via tex{Sub}Image2D. I don't yet
understand what we'll need to do in order to support this though.

For RGB(A) PNGs, is the desired behavior to simply pass through the
pixel values in the file without regard to any gamma information in
the file or the screen gamma? Or is the conversion to a linear color
space more complex?

I don't know where all of the places are in WebKit code which may end
up modifying pixel values during image loading. Here's the code from
WebKit's PNGImageDecoder.cpp that sets up the gamma in the PNG reader.

// Gamma constants.
const double cMaxGamma = 21474.83;
const double cDefaultGamma = 2.2;
const double cInverseGamma = 0.45455;

// Deal with gamma and keep it under our control.
double gamma;
if (png_get_gAMA(png, info, &gamma)) {
    if ((gamma <= 0.0) || (gamma > cMaxGamma)) {
        gamma = cInverseGamma;
        png_set_gAMA(png, info, gamma);
    }
    png_set_gamma(png, cDefaultGamma, gamma);
} else
    png_set_gamma(png, cDefaultGamma, cInverseGamma);

If we want to pass through the data unmodified, would we want to call
png_set_gamma(png, 1.0, 1.0)? Similarly, to convert to linear space,
would we want to pass 1.0 instead of cDefaultGamma?

I see nothing in the JPEGImageDecoder related to gamma. Is anything
needed for this file format? I suspect people will not use JPEGs for
anything they expect to be passed through verbatim to WebGL, such as
encoding non-color information in the color channels of a texture.

Do we need three values for this pixel storage attribute (pass through
data verbatim, convert to linear space, and perform gamma correction)?

Similarly, it sounds like we need another context creation attribute
to optionally gamma correct WebGL's rendering results before placing
them on the page?

-Ken

P.S. Steve, your earlier email is my favorite ever.

-----------------------------------------------------------
You are currently subscribed to public_webgl...@
To unsubscribe, send an email to majordomo...@ with
the following command in the body of your email:

From cma...@ Fri Sep 3 15:06:08 2010
From: cma...@ (Chris Marrin)
Date: Fri, 03 Sep 2010 15:06:08 -0700
Subject: [Public WebGL] Gamma correction and texImage2D/texSubImage2D
In-Reply-To:
References: <26BED01D-1031-4075-A054-16EF2D7B7BBD@apple.com> <29c135daaf4793a5e64da07193a81e15.squirrel@webmail.sjbaker.org>
Message-ID: <4F138B1C-0A7C-4955-81AA-8906294266BA@apple.com>

First of all let me say a couple of things:

1) Steve, how do you REALLY feel about gamma?
2) Ken, (regarding the question of whether anyone on the list actually cares about gamma), told you so. :-) Now on to the topic at hand. First let's try to narrow the scope of this discussion. We're not talking about printers or anything else. We're talking about rendering imagery into a WebGL canvas for later compositing with the rest of the page. I think we should take our lead from what the 2D Canvas says. Section 4.8.11.2 of http://www.whatwg.org/specs/web-apps/current-work/multipage/the-canvas-element.html talks about this. I find it hard to read, but I believe it says that the pixels stored in the canvas are in the color space of the canvas, which is sRGB. So if you read back those pixels using getImageData(), they will be in the sRGB color space. And when you call toDataURL(), the encoded pixels values will be the same as those returned in getImageData(). In fact, the 2D Canvas spec doesn't really speak in terms of gamma correction at all. It speaks in terms of color spaces. It says that the color space can be transformed in exactly 2 places: 1) going from whatever the incoming image's color space is to sRGB for insertion into the canvas, and 2) going from sRGB in the canvas to whatever the color space of the display happens to be. I think gamma correction is just a detail of the display's color space, so we probably shouldn't even be using that term. I think it would be better if we simply say whether we want an image to be in the sRGB color space in texture memory, or unchanged from the original image. We should speak in terms of the original image's color space, because there are image formats which specify it. All that is a pretty clear indication that the pixels in the canvas are expected to be in the sRGB color space and when they are composited they are transformed into the display's color space. 
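[Editor's note: "in the sRGB color space" has a precise meaning here. The sRGB standard (IEC 61966-2-1) defines a piecewise transfer function that is close to, but not exactly, a pure 2.2 gamma. A small Python sketch of the two directions an author would use to convert between canvas (sRGB) values and linear light:]

```python
# The sRGB transfer functions (IEC 61966-2-1). "Pixels are in the sRGB
# color space" means each channel is encoded with linear_to_srgb below.
# Note the linear toe segment near black; sRGB is not a pure 2.2 curve.

def srgb_to_linear(s):
    """Decode an sRGB-encoded channel value (0..1) to linear light."""
    return s / 12.92 if s <= 0.04045 else ((s + 0.055) / 1.055) ** 2.4

def linear_to_srgb(l):
    """Encode linear light (0..1) as an sRGB channel value."""
    return 12.92 * l if l <= 0.0031308 else 1.055 * l ** (1 / 2.4) - 0.055

if __name__ == "__main__":
    for v in (0.0, 0.002, 0.18, 0.5, 1.0):
        assert abs(srgb_to_linear(linear_to_srgb(v)) - v) < 1e-9
    # Encoded mid-gray 0.5 is only about 0.21 in linear light.
    print("sRGB 0.5 decodes to %.4f linear" % srgb_to_linear(0.5))
```

This is the calculation an author "who really cares" would apply when moving image data between the canvas's sRGB space and the linear space used for shading.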
An author who really cares, can render textures into the WebGL canvas knowing the image is in the sRGB space and that the final image in the canvas should be in the sRGB space, and apply the appropriate factors to make that so. So my proposal is to call the flag something like IMAGE_COLORSPACE_WEBGL with the values IMAGE_COLORSPACE_SRGB_WEBGL and IMAGE_COLORSPACE_RAW_WEBGL. I think using enumerations make it the most clear. And given the argument above, I think the default should clearly be IMAGE_COLORSPACE_SRGB_WEBGL. If the author is dealing with textures as images (as opposed to some other type of data, like normal maps or floats) then all you have to know is the source and destination color spaces and you can make the proper calculations. As far as giving the ability to control the compositing of the output (like we do for premultiplied alpha), I don't think we need to. We just need to say that the pixels in the drawing buffer have to be sRGB. We can see if I have been convincing by the number of expletives in Steve's response :-) ----- ~Chris cmarrin...@ ----------------------------------------------------------- You are currently subscribed to public_webgl...@ To unsubscribe, send an email to majordomo...@ with the following command in the body of your email: From ste...@ Fri Sep 3 16:14:23 2010 From: ste...@ (Steve Baker) Date: Fri, 03 Sep 2010 18:14:23 -0500 Subject: [Public WebGL] Gamma correction and texImage2D/texSubImage2D In-Reply-To: References: <26BED01D-1031-4075-A054-16EF2D7B7BBD@apple.com> <29c135daaf4793a5e64da07193a81e15.squirrel@webmail.sjbaker.org> Message-ID: <4C81814F.4050504@sjbaker.org> Kenneth Russell wrote: > On Fri, Sep 3, 2010 at 11:39 AM, wrote: > >>> El 3 de septiembre de 2010 08:47, Chris Marrin >>> escribi?: >>> >>>> I think it would be useful to have the unlit case behave the same as >>>> rendering to a 2D canvas, which would gamma correct. 
I believe the >>>> differences in the lit case would be subtle and it's only if authors are >>>> trying to be very precise that they will care. In that case, they can >>>> turn it off. But my opinion on this is not strong. >>>> >>> I agree that the unlit case should ideally behave the same as >>> rendering to a 2D canvas. However, as Steve points out, this would be >>> much better implemented as a context creation attribute that the >>> compositor could respect. It could default to having gamma correction >>> turned on. >>> >>> Additionally, if you need a packing flag for texture loads, I think >>> the most useful operation is the opposite of the one proposed--to >>> transform non-linear input textures into the appropriate linear space >>> for lighting. Using non-linear textures as storage and input arguably >>> gives you more color resolution in the dark part of the spectrum, so >>> it might be useful to support that. D3DSAMP_SRGBTEXTURE is an example >>> of this sort of texture load flag. >>> >>> -enne >>> >> Yes - the reverse operation (to turn a pre-gamma-corrected image into a >> linear color space texture) is much more useful - especially in an >> environment where JPEG images are common and we might wish to take as >> input other things that the browser has generated that might already be >> gamma corrected. >> >> At first sight, fixing pre-gamma'd images back to linear seems do-able in >> the shader. >> >> (Since the gamma operation is Vout=pow(Vin,1.0/2.2) - the inverse of that >> is Vin = pow(Vout,2.2)...which you can approximate as pow(Vout,2.0) - >> which is Vin=Vout*Vout). >> >> However, you can only do that after texture-lookup - and because that >> entails a bunch of linear interpolations, you shouldn't really be doing >> that in gamma-space. So there is certainly justification for reversing >> the gamma correction as the texture is loaded. 
Moreover, many image file >> formats actually tell you what gamma they were stored with - so the loader >> could do a really excellent job by honoring that number. >> > > Based on your above descriptions and the above discussion I'm well > convinced that the default behavior should not be to apply gamma > correction to images uploaded via tex{Sub}Image2D. I don't yet > understand what we'll need to do in order to support this though. > > For RGB(A) PNGs, is the desired behavior to simply pass through the > pixel values in the file without regard to any gamma information in > the file or the screen gamma? Or is the conversion to a linear color > space more complex? > > I don't know where all of the places are in WebKit code which may end > up modifying pixel values during image loading. Here's the code from > WebKit's PNGImageDecoder.cpp that sets up the gamma in the PNG reader. > > // Gamma constants. > const double cMaxGamma = 21474.83; > const double cDefaultGamma = 2.2; > const double cInverseGamma = 0.45455; > // Deal with gamma and keep it under our control. > double gamma; > if (png_get_gAMA(png, info, &gamma)) { > if ((gamma <= 0.0) || (gamma > cMaxGamma)) { > gamma = cInverseGamma; > png_set_gAMA(png, info, gamma); > } > png_set_gamma(png, cDefaultGamma, gamma); > } else > png_set_gamma(png, cDefaultGamma, cInverseGamma); > > If we want to pass through the data unmodified, would we want to call > png_set_gamma(png, 1.0, 1.0)? Similarly, to convert to linear space, > would we want to pass 1.0 instead of cDefaultGamma? > > I see nothing in the JPEGImageDecoder related to gamma. Is anything > needed for this file format? I suspect people will not use JPEGs for > anything they expect to be passed through verbatim to WebGL, such as > encoding non-color information in the color channels of a texture. > > Do we need three values for this pixel storage attribute (pass through > data verbatim, convert to linear space, and perform gamma correction)? 
> > Similarly, it sounds like we need another context creation attribute > to optionally gamma correct WebGL's rendering results before placing > them on the page? > > -Ken > > P.S. Steve, your earlier email is my favorite ever. > Well the current rules are something like this: * The PNG file format stores things in linear color space. If you plan to display them on a NON gamma corrected medium - then you need to apply gamma to it...which (I presume) is what that snippet of code that you presented actually does. * The JPEG format stores things in gamma space (because it allow denser lossy-compression). When you simply display a JPEG (as is typically the case in a browser), you don't do anything more to it...which is why you can't find anything in the JPEG decoder. However, that's only true when you're going to do NOTHING whatever to the image on its way to the display. If you plan to do (linear) math on it (blending, MIPmapping, lighting, etc) then you have to have everything in linear color space because our hardware can't do that stuff in gamma space. So what we need to do is to pass things in linear color space to the shaders - let the graphics pipeline do it's thing in linear color space - and then, at the very end of the process - perform gamma correction. In the case of WebGL - doing gamma correction in the compositor is virtually a freebie (providing everyone carries through with plans to do compositing using the GPU). Hence, for 3D rendering (or anything else that requires processing of the image data) we have to reverse those two rules: * PNG files will not need any processing...they are in linear color space, they go through the shader as linear...then are gamma corrected (I propose) in the final compositor stage. Very clean, efficient and with minimal roundoff issues and mathematically correct gamma handling. 
* Sadly, JPEG files are now a problem...they are stored in gamma space - so we must REVERSE-gamma-correct them as we load them: Vout=pow(Vin,2.2); ...to turn them back into linear space - then pass them through the shader in linear space - and finally, gamma correct the result in the compositor using Vout=pow(Vin,1/2.2); This preserves the existing behavior - but does the gamma correction AFTER all of the linear processing and not before...HOORAY!! That's a massive quality win for PNG (and most other image file formats) - but it's not so great for JPEG. Doing reverse-gamma then rendering then doing forward-gamma is not nice. But (roundoff error aside) this does preserve the current "canvas" behavior. However, vanilla JPEG is highly ill-suited for use as 3D texture maps - for many other reasons. (My ancient rant on this subject is here: http://www.sjbaker.org/steve/omniv/jpegs_are_evil_too.html) Other file formats are trickier. GIF (IIRC) makes no comment about gamma - you can't tell whether it's gamma-corrected or not. BMP is just an unholy mess - a BMP can be just a wrapper for a JPEG or a PNG or some Microsoft-specific mess. Without a lot of messy decoding it could be just about anything...a typical Microsoftian pig's breakfast! TGA files are normally in linear space - but there is an extension that supports gamma-corrected files. I don't think I've ever seen one that was. I think if you reverse-gamma JPEG files and leave everything else alone, you'll be OK. 
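[Editor's note: Steve's two rules can be summarized in a short sketch. This assumes his simplified pure pow-2.2 model of gamma (real formats and displays vary, as the thread goes on to discuss): gamma-encoded sources are linearized on upload, all rendering math happens in linear space, and the compositor applies forward gamma exactly once at the end.]

```python
# Sketch of the pipeline described above (simplified pow-2.2 model):
#   - JPEG-style gamma-encoded sources are reverse-gamma'd on load
#   - PNG-style linear sources pass through untouched
#   - the compositor gamma-corrects once, at the very end

GAMMA = 2.2

def load_texel(v, source_is_gamma_encoded):
    """Upload step: linearize gamma-encoded sources, pass others through."""
    return v ** GAMMA if source_is_gamma_encoded else v

def composite(v):
    """Final compositor step: one forward gamma correction for the display."""
    return v ** (1.0 / GAMMA)

if __name__ == "__main__":
    jpeg_texel = 0.5                        # gamma-encoded sample from a JPEG
    linear = load_texel(jpeg_texel, True)   # back to linear light
    # ... shading / blending / mipmapping would happen here, in linear space ...
    shown = composite(linear)
    # With no shading in between, the round trip reproduces the original
    # value - which is the "preserves existing canvas behavior" point above.
    print("round trip: %.6f" % shown)
```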
-- Steve

-----------------------------------------------------------
You are currently subscribed to public_webgl...@
To unsubscribe, send an email to majordomo...@ with
the following command in the body of your email:

From oli...@ Fri Sep 3 16:26:50 2010
From: oli...@ (Oliver Hunt)
Date: Fri, 3 Sep 2010 16:26:50 -0700
Subject: [Public WebGL] Gamma correction and texImage2D/texSubImage2D
In-Reply-To: <4C81814F.4050504@sjbaker.org>
References: <26BED01D-1031-4075-A054-16EF2D7B7BBD@apple.com> <29c135daaf4793a5e64da07193a81e15.squirrel@webmail.sjbaker.org> <4C81814F.4050504@sjbaker.org>
Message-ID:

The way browsers currently handle colour space is fairly simple, the
actual format does not matter:

* The image includes a colour profile: then use that colour profile
* The image does not include a colour profile: then assume the image is in sRGB

This is the model that WebGL texture loading should follow in order to
be consistent with other web content.

Obviously WebGL will need an API to load raw (un "corrected") data into
a texture in order to handle a few use cases, but other than that
additional API there shouldn't be any other logic necessary.

Basically the complete (image to display) model is:

    Image               WebGL Context            Display
          -> match ->    Linear RGB   -> match ->

With an option to do

    Image               WebGL Context            Display
          -> nop ->      Linear RGB   -> match ->

The actual file formats, and the actual mechanism of matching isn't
relevant, all that we need to do is guarantee the colour space of the
GL context.

--Oliver

-------------- next part --------------
An HTML attachment was scrubbed...
URL: From ste...@ Fri Sep 3 16:53:59 2010 From: ste...@ (Steve Baker) Date: Fri, 03 Sep 2010 18:53:59 -0500 Subject: [Public WebGL] Gamma correction and texImage2D/texSubImage2D In-Reply-To: <4F138B1C-0A7C-4955-81AA-8906294266BA@apple.com> References: <26BED01D-1031-4075-A054-16EF2D7B7BBD@apple.com> <29c135daaf4793a5e64da07193a81e15.squirrel@webmail.sjbaker.org> <4F138B1C-0A7C-4955-81AA-8906294266BA@apple.com> Message-ID: <4C818A97.6070303@sjbaker.org> Chris Marrin wrote: > First of all let me say a couple of things: > > 1) Steve, how do you REALLY feel about gamma? > > 2) Ken, (regarding the question of whether anyone on the list actually cares about gamma), told you so. > > :-) > > Now on to the topic at hand. First let's try to narrow the scope of this discussion. We're not talking about printers or anything else. We're talking about rendering imagery into a WebGL canvas for later compositing with the rest of the page. > Why aren't you talking about printing? People print web pages all the time - that will (in future) include WebGL canvasses - and since the gamma of a printer is WAY different to that of a screen - you can't ignore that. > I think we should take our lead from what the 2D Canvas says. Section 4.8.11.2 of http://www.whatwg.org/specs/web-apps/current-work/multipage/the-canvas-element.html talks about this. I find it hard to read, but I believe it says that the pixels stored in the canvas are in the color space of the canvas, which is sRGB. So if you read back those pixels using getImageData(), they will be in the sRGB color space. And when you call toDataURL(), the encoded pixels values will be the same as those returned in getImageData(). > Yeah - that's quite some incomprehensible piece of writing! Let's break apart the "color spaces and color correction" section (4.8.11.2). 
/The |canvas | APIs must perform color correction at only two points:
when rendering images with their own gamma correction and color space
information onto the canvas, to convert the image to the color space used
by the canvas (e.g. using the 2D Context's |drawImage() | method with an
|HTMLImageElement | object), and when rendering the actual canvas bitmap
to the output device./

So two kinds of correction:

1) "images with their own gamma correction" (ie JPEG) - convert the image
to the color space of the canvas.

2) "...when rendering the actual canvas bitmap to the output device"

This is PRECISELY what I'm asking for here. If we say that our WebGL
canvas is in linear space then (1) says convert gamma-corrected JPEGs to
the color space of the canvas (which is linear - so we can do lighting,
etc) - no need to convert PNGs because they are already linear. (2) says
we do gamma correction in the compositor. Perfect! Exactly what I've been
telling everyone we should do! Hooray!

Then it says:

/In user agents that support CSS, the color space used by a |canvas |
element must match the color space used for processing any colors for
that element in CSS./

So this says that the CSS system can impose some other color space on the
canvas? It's not really clear what that means...but if somehow CSS told
the canvas to be a gamma-space canvas - then you'd have to preconvert
PNGs into gamma space then NOT do gamma correction in the compositor. But
I don't see how CSS imposes itself on WebGL. Colors from CSS for our
rendering surface surely aren't relevant? That leaves us free to choose
our color space - and because we're not totally insane - we pick
"linear".

Then it concludes:

/The gamma correction and color space information of images must be
handled in such a way that an image rendered directly using an |img |
element would use the same colors as one painted on a |canvas | element
that is then itself rendered.
/ That's fine - if you take PNG, and DO NOT gamma correct it - do a straightforward linear-space rendering and then gamma-correct the output, you get exactly the right answer. If you take a JPEG, reverse-gamma correct it, do a linear-space rendering and then gamma-correct the output, you get (barring roundoff issues) the right thing. /Furthermore, the rendering of images that have no color correction information (such as those returned by the |toDataURL() | method) must be rendered with no color correction. / That I don't understand....? > In fact, the 2D Canvas spec doesn't really speak in terms of gamma correction at all. It speaks in terms of color spaces. True - but there are (typically) only two color spaces that we care about...gamma and linear. Sure, you might be dealing with printers and have a CMYK color space canvas...that would be kinda silly though. > It says that the color space can be transformed in exactly 2 places: 1) going from whatever the incoming image's color space is to sRGB for insertion into the canvas, and 2) going from sRGB in the canvas to whatever the color space of the display happens to be. > I think gamma correction is just a detail of the display's color space, so we probably shouldn't even be using that term. I think it would be better if we simply say whether we want an image to be in the sRGB color space in texture memory, or unchanged from the original image. We should speak in terms of the original image's color space, because there are image formats which specify it. > We take the color space of the image - we convert it to whatever our canvas needs then we convert that to whatever the display needs at the output. That's what I've been proposing all along - and that seems to be what the canvas spec says. The color space of the WebGL canvas needs to be linear RGB because our hardware can't process anything else correctly - and that would break the canvas specification. 
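[Editor's note: the "hardware can't process anything else correctly" point is concrete: GPU blending, filtering, and mipmapping are linear averages, which give the wrong answer on gamma-encoded data. A small numeric sketch, assuming a simple 2.2 power curve (the sRGB curve is close to this), averaging a black and a white texel both ways:]

```python
# Why filtering/blending must happen in linear space: average a black
# texel and a white texel, comparing the two orders of operations.
# (Assumes a simple pow-2.2 transfer for illustration.)

GAMMA = 2.2

def to_gamma(linear):
    return linear ** (1.0 / GAMMA)

def to_linear(encoded):
    return encoded ** GAMMA

black, white = 0.0, 1.0  # linear light

# Correct: blend in linear space, encode once at the end.
correct = to_gamma((black + white) / 2.0)

# Wrong: blend the gamma-encoded values directly - which is what
# GL_LINEAR filtering of a gamma-encoded texture effectively does.
wrong = (to_gamma(black) + to_gamma(white)) / 2.0

# The naive blend comes out visibly darker than the correct one.
print("correct %.3f vs naive %.3f" % (correct, wrong))
```

The gap (roughly 0.73 vs 0.50 encoded) is the same effect behind the image-scaling artifacts mentioned later in the thread.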
The images could be in who-knows-what space - but we should correct them to linear for OUR canvas. Then we convert our canvas into who-knows-what that the display may need. In practical terms - JPEG gets reverse-gamma'd - PNGs are left alone. We gamma correct for the display in the compositor - if the output is a screen. If the output is a printer then we apply different gamma - or we convert to CMYK space or whatever. > All that is a pretty clear indication that the pixels in the canvas are expected to be in the sRGB color space and when they are composited they are transformed into the display's color space. An author who really cares, can render textures into the WebGL canvas knowing the image is in the sRGB space and that the final image in the canvas should be in the sRGB space, and apply the appropriate factors to make that so. But our hardware can't process sRGB. So that's a complete non-starter - but fortunately, the canvas spec allows us to choose the color space of our canvas providing we convert on input (where necessary...ie JPEGS) - and providing we gamma correct on the output (which we MUST do in order to make things we render like lighting etc compatible with the color space we have to use). > So my proposal is to call the flag something like IMAGE_COLORSPACE_WEBGL with the values IMAGE_COLORSPACE_SRGB_WEBGL and IMAGE_COLORSPACE_RAW_WEBGL. I think using enumerations make it the most clear. And given the argument above, I think the default should clearly be IMAGE_COLORSPACE_SRGB_WEBGL. If the author is dealing with textures as images (as opposed to some other type of data, like normal maps or floats) then all you have to know is the source and destination color spaces and you can make the proper calculations. > > As far as giving the ability to control the compositing of the output (like we do for premultiplied alpha), I don't think we need to. We just need to say that the pixels in the drawing buffer have to be sRGB. 
> ...and thereby make every single GPU on the planet non-compliant. Good move...very clever! GPU's can't even read sRGB textures without going to GL_NEAREST filtering because GL_LINEAR (et al) requires non-linear blending operations that absolutely nobody implements. -- Steve ----------------------------------------------------------- You are currently subscribed to public_webgl...@ To unsubscribe, send an email to majordomo...@ with the following command in the body of your email: From ste...@ Fri Sep 3 17:54:21 2010 From: ste...@ (Steve Baker) Date: Fri, 03 Sep 2010 19:54:21 -0500 Subject: [Public WebGL] Gamma correction and texImage2D/texSubImage2D In-Reply-To: References: <26BED01D-1031-4075-A054-16EF2D7B7BBD@apple.com> <29c135daaf4793a5e64da07193a81e15.squirrel@webmail.sjbaker.org> <4C81814F.4050504@sjbaker.org> Message-ID: <4C8198BD.5060309@sjbaker.org> Oliver Hunt wrote: > The way browsers currently handle colour space is fairly simple, the > actual format does not matter: > > * The image includes a colour profile: then use that colour profile > * The image does not include a colour profile: then assume the image > is in sRGB > > This is the model that WebGL texture loading should follow in order to > be consistent with other web content. > > Obviously WebGL will need an API to load raw (un "corrected") data > into a texture in order to handle a few use cases, but other than that > additional API there shouldn't be any other logic necessary. > > Basically the complete (image to display) model is: > > Image WebGL Context > Display > -> match -> Linear RGB -> match -> > > > With an option to do > Image WebGL Context > Display > -> nop -> Linear RGB -> match -> > > > The actual file formats, and the actual mechanism of matching isn't > relevant, all that we need to do is guarantee the colour space of the > GL context. > > --Oliver Yes - that's workable. 
I'm a little concerned about the meaning of if the "image does not include a color profile: then assume the image is in sRGB"...that's going to cause a bunch of grief to people who don't have an advanced degree in specification-parsing. But so long as we can override the Image=>WebGL conversion painlessly, I'll just do that unconditionally and all will be well. The critical parts are that we assume that the WebGL context is ALWAYS a linear color space (because that's all our hardware will ever be) - and that (as a consequence) something after that in the pipeline applies a decent gamma value...which would most efficiently be a GPU-based compositing step. Great! -- Steve ----------------------------------------------------------- You are currently subscribed to public_webgl...@ To unsubscribe, send an email to majordomo...@ with the following command in the body of your email: From oli...@ Fri Sep 3 17:54:57 2010 From: oli...@ (Oliver Hunt) Date: Fri, 3 Sep 2010 17:54:57 -0700 Subject: [Public WebGL] Gamma correction and texImage2D/texSubImage2D In-Reply-To: <4C8198BD.5060309@sjbaker.org> References: <26BED01D-1031-4075-A054-16EF2D7B7BBD@apple.com> <29c135daaf4793a5e64da07193a81e15.squirrel@webmail.sjbaker.org> <4C81814F.4050504@sjbaker.org> <4C8198BD.5060309@sjbaker.org> Message-ID: <1299E654-AF8E-410C-960D-4F0E4FCB4F1D@apple.com> On Sep 3, 2010, at 5:54 PM, Steve Baker wrote: > Oliver Hunt wrote: >> The way browsers currently handle colour space is fairly simple, the >> actual format does not matter: >> >> * The image includes a colour profile: then use that colour profile >> * The image does not include a colour profile: then assume the image >> is in sRGB >> >> This is the model that WebGL texture loading should follow in order to >> be consistent with other web content. 
>>
>> Obviously WebGL will need an API to load raw (un "corrected") data
>> into a texture in order to handle a few use cases, but other than that
>> additional API there shouldn't be any other logic necessary.
>>
>> Basically the complete (image to display) model is:
>>
>>     Image               WebGL Context            Display
>>           -> match ->    Linear RGB   -> match ->
>>
>> With an option to do
>>
>>     Image               WebGL Context            Display
>>           -> nop ->      Linear RGB   -> match ->
>>
>> The actual file formats, and the actual mechanism of matching isn't
>> relevant, all that we need to do is guarantee the colour space of the
>> GL context.
>>
>> --Oliver
>
> Yes - that's workable.
>
> I'm a little concerned about the meaning of if the "image does not
> include a color profile: then assume the image is in sRGB"...that's
> going to cause a bunch of grief to people who don't have an advanced
> degree in specification-parsing. But so long as we can override the
> Image=>WebGL conversion painlessly, I'll just do that unconditionally
> and all will be well.
>
> The critical parts are that we assume that the WebGL context is ALWAYS a
> linear color space (because that's all our hardware will ever be) - and
> that (as a consequence) something after that in the pipeline applies a
> decent gamma value...which would most efficiently be a GPU-based
> compositing step.

Gamma correction is (from the PoV of the GL context) simply an aspect of
the display's colour profile so doesn't affect the WebGL spec.
--Oliver ----------------------------------------------------------- You are currently subscribed to public_webgl...@ To unsubscribe, send an email to majordomo...@ with the following command in the body of your email: From ste...@ Fri Sep 3 18:34:35 2010 From: ste...@ (stephen white) Date: Sat, 4 Sep 2010 11:04:35 +0930 Subject: [Public WebGL] Gamma correction and texImage2D/texSubImage2D In-Reply-To: <4C818A97.6070303@sjbaker.org> References: <26BED01D-1031-4075-A054-16EF2D7B7BBD@apple.com> <29c135daaf4793a5e64da07193a81e15.squirrel@webmail.sjbaker.org> <4F138B1C-0A7C-4955-81AA-8906294266BA@apple.com> <4C818A97.6070303@sjbaker.org> Message-ID: <896A8D29-A9A7-43B5-B421-2916093A3B5A@adam.com.au> On 04/09/2010, at 9:23 AM, Steve Baker wrote: > That's fine - if you take PNG, and DO NOT gamma correct it - do a > straightforward linear-space rendering and then gamma-correct the > output, you get exactly the right answer. If you take a JPEG, > reverse-gamma correct it, do a linear-space rendering and then > gamma-correct the output, you get (barring roundoff issues) the right thing. It seems to me that this issue is surfacing because browsers can select to apply gamma or not on a per image basis, but that ability is lost in WebGL due to multiple textures being used before the page is composited (with gamma correction). I'm a little confused on a point though, as I thought it was the screen that is gamma corrected. There's a further complicating factor in that image scaling also involves gamma: http://www.4p8.com/eric.brasseur/gamma.html So yes, Chris Marrin is right to point out that we need to talk about colour spaces rather than gamma. So I'm wondering if PNG vs JPEG colour spaces need to be on a per-texture basis, rather than glReadPixels after everything is done? 
Eg, as Oliver said: >> * The image includes a colour profile: then use that colour profile >> * The image does not include a colour profile: then assume the image >> is in sRGB So to handle the per-texture issue, the transform would need to happen on draw rather than load? This sounds more correct, but also a hell of a lot more painful to actually implement (as well as the speed cost for a frequent operation). I'm not saying anything useful here, just trying to get a picture (ha) of the issue. :) -- steve...@ ----------------------------------------------------------- You are currently subscribed to public_webgl...@ To unsubscribe, send an email to majordomo...@ with the following command in the body of your email: From kbr...@ Fri Sep 3 18:40:00 2010 From: kbr...@ (Kenneth Russell) Date: Fri, 3 Sep 2010 18:40:00 -0700 Subject: [Public WebGL] Proposed change to WebGL Event definition In-Reply-To: References: <33F15A1B-FF87-44F1-ABA3-632494FA649A@apple.com> <817D1DD8-125C-44B0-95FF-C288EECB06E0@apple.com> <10C92111-C1E5-429F-AF48-D4737C07BCEE@apple.com> Message-ID: On Fri, Sep 3, 2010 at 10:16 AM, Chris Marrin wrote: > > I've revised the event section (5.14). Please review. Sorry for not realizing this before, but the "NOT_AVAILABLE" status code seems pretty useless, because if the web browser is so old that it doesn't support WebGL, then it definitely won't support delivery of the webglcontextcreationerror event. If you buy that argument, then the only two status codes are NOT_SUPPORTED and OTHER_ERROR, and I'd argue that nobody is going to take programmatic action based on one or the other. Therefore we could perhaps dump the status codes completely and just leave the status message. 
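[Editorial note: in the absence of a reliable event or media query, the only detection path content can count on is the null return from getContext(). A hedged sketch of that fallback (all names are illustrative, not proposed API; the canvas argument is anything with a getContext() method, so it can be stubbed outside a browser):]

```javascript
// Hypothetical helper: try to create a WebGL context and classify the
// failure as coarsely as the browser lets us.
function tryCreateWebGL(canvas) {
  if (!canvas || typeof canvas.getContext !== 'function') {
    return { ctx: null, status: 'NOT_SUPPORTED' }; // no canvas support at all
  }
  // Early implementations exposed the context as 'experimental-webgl'.
  const ctx = canvas.getContext('webgl') || canvas.getContext('experimental-webgl');
  if (ctx) {
    return { ctx, status: 'OK' };
  }
  // A null return: without a webglcontextcreationerror event we cannot
  // distinguish "not implemented" from "temporarily unavailable".
  return { ctx: null, status: 'UNKNOWN' };
}
```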
-Ken > > ----- > ~Chris > cmarrin...@ > > > > > ----------------------------------------------------------- > You are currently subscribed to public_webgl...@ > To unsubscribe, send an email to majordomo...@ with > the following command in the body of your email: > > ----------------------------------------------------------- You are currently subscribed to public_webgl...@ To unsubscribe, send an email to majordomo...@ with the following command in the body of your email: From ced...@ Fri Sep 3 21:54:57 2010 From: ced...@ (Cedric Vivier) Date: Sat, 4 Sep 2010 12:54:57 +0800 Subject: [Public WebGL] Gamma correction and texImage2D/texSubImage2D In-Reply-To: <6C663C21-FEE2-4CD6-852A-3EE4C28457B1@apple.com> References: <6C663C21-FEE2-4CD6-852A-3EE4C28457B1@apple.com> Message-ID: On Fri, Sep 3, 2010 at 23:51, Chris Marrin wrote: > AFAIK, gamma correction is done to make images look right on the selected > display. It has nothing to do with data in the source image. I believe some > images might have color correction information in them, but that's different > from gamma correction. > I think this contradicts the related paragraph in the canvas 2D context spec : http://www.whatwg.org/specs/web-apps/current-work/multipage/the-canvas-element.html#color-spaces-and-color-correction Canvas 2D is clearly supposed to perform gamma correction only on images that have their own color correction information, I assume WebGL should only do color/gamma correction when unpacking textures under the same rule. This would actually render the UNPACK_* parameter almost useless as it could (and probably should) be the default. If developers do not want gamma correction they just have to use images without color correction information in them (which would already be the case for any non-diffuse texture anyways). Regards, -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ced...@ Fri Sep 3 22:08:31 2010 From: ced...@ (Cedric Vivier) Date: Sat, 4 Sep 2010 13:08:31 +0800 Subject: [Public WebGL] Proposed change to WebGL Event definition In-Reply-To: References: <33F15A1B-FF87-44F1-ABA3-632494FA649A@apple.com> <817D1DD8-125C-44B0-95FF-C288EECB06E0@apple.com> <10C92111-C1E5-429F-AF48-D4737C07BCEE@apple.com> Message-ID: On Sat, Sep 4, 2010 at 09:40, Kenneth Russell wrote: > On Fri, Sep 3, 2010 at 10:16 AM, Chris Marrin wrote: > > > > I've revised the event section (5.14). Please review. > > Sorry for not realizing this before, but the "NOT_AVAILABLE" status > code seems pretty useless, because if the web browser is so old that > it doesn't support WebGL, then it definitely won't support delivery of > the webglcontextcreationerror event. > NOT_AVAILABLE "WebGL is not implemented or not enabled in this browser" I think the current meaning is confusing (and indeed useless). IIRC, the original intent was for this status code's meaning to be something like: "WebGL is currently disabled or temporarily unavailable because of high resource usage by other tabs/programs." That makes it a bit more useful, as it allows apps to present the user a more helpful message (and to ask them to try again - something you would not do for NOT_SUPPORTED). Regards, -------------- next part -------------- An HTML attachment was scrubbed... URL: From ced...@ Fri Sep 3 23:21:11 2010 From: ced...@ (Cedric Vivier) Date: Sat, 4 Sep 2010 14:21:11 +0800 Subject: [Public WebGL] Gamma correction and texImage2D/texSubImage2D In-Reply-To: References: <26BED01D-1031-4075-A054-16EF2D7B7BBD@apple.com> Message-ID: On Sat, Sep 4, 2010 at 01:42, Adrienne Walker wrote: > I agree that the unlit case should ideally behave the same as > rendering to a 2D canvas. However, as Steve points out, this would be > much better implemented as a context creation attribute that the > compositor could respect. It could default to having gamma correction > turned on.
> > The canvas 2D context does both actually afaik. > Additionally, if you need a packing flag for texture loads, I think > the most useful operation is the opposite of the one proposed--to > transform non-linear input textures into the appropriate linear space > for lighting. Using non-linear textures as storage and input arguably > gives you more color resolution in the dark part of the spectrum, so > it might be useful to support that. D3DSAMP_SRGBTEXTURE is an example > of this sort of texture load flag. I agree; the more I look at it, the more the pixel unpacking parameter seems a weird place to do this. If you have a look at the EXT_texture_sRGB extension ( http://www.opengl.org/registry/specs/EXT/texture_sRGB.txt ), it has been resolved that a texture format is the right way to do it. As Mark already warned, WebGL should not diverge from what OpenGL does, so I'd think it would be preferable, for instance, to support (a subset of) that extension and allow users to specify the SRGB8 format on textures they want color correction on (a WebGL implementation can perform the conversion from sRGB to linear color space before sending actual texture data to ES 2.0 [without that extension] - similar to what it does already when converting to RGB565 and others). Regards, -------------- next part -------------- An HTML attachment was scrubbed... URL: From cma...@ Sat Sep 4 06:42:41 2010 From: cma...@ (Chris Marrin) Date: Sat, 04 Sep 2010 06:42:41 -0700 Subject: [Public WebGL] Proposed change to WebGL Event definition In-Reply-To: References: <33F15A1B-FF87-44F1-ABA3-632494FA649A@apple.com> <817D1DD8-125C-44B0-95FF-C288EECB06E0@apple.com> <10C92111-C1E5-429F-AF48-D4737C07BCEE@apple.com> Message-ID: On Sep 3, 2010, at 6:40 PM, Kenneth Russell wrote: > On Fri, Sep 3, 2010 at 10:16 AM, Chris Marrin wrote: >> >> I've revised the event section (5.14). Please review.
> > Sorry for not realizing this before, but the "NOT_AVAILABLE" status > code seems pretty useless, because if the web browser is so old that > it doesn't support WebGL, then it definitely won't support delivery of > the webglcontextcreationerror event. > > If you buy that argument, then the only two status codes are > NOT_SUPPORTED and OTHER_ERROR, and I'd argue that nobody is going to > take programmatic action based on one or the other. Therefore we could > perhaps dump the status codes completely and just leave the status > message. Maybe. I was hoping we could come up with more status codes though. It's nicer to have codes to respond to than just a message which doesn't have any defined format. My thought about the difference between NOT_AVAILABLE and NOT_SUPPORTED was supposed to be that the former indicated that the browser was not allowing you to use WebGL, while the latter said that your hardware just couldn't do it. It allows two different messages to be displayed: "your browser does not allow WebGL content, try enabling it" and "your hardware doesn't allow WebGL, get better hardware". ----- ~Chris cmarrin...@ ----------------------------------------------------------- You are currently subscribed to public_webgl...@ To unsubscribe, send an email to majordomo...@ with the following command in the body of your email: From cma...@ Sat Sep 4 06:57:52 2010 From: cma...@ (Chris Marrin) Date: Sat, 04 Sep 2010 06:57:52 -0700 Subject: [Public WebGL] Proposed change to WebGL Event definition In-Reply-To: References: <33F15A1B-FF87-44F1-ABA3-632494FA649A@apple.com> <817D1DD8-125C-44B0-95FF-C288EECB06E0@apple.com> <10C92111-C1E5-429F-AF48-D4737C07BCEE@apple.com> Message-ID: <044681C5-76B1-4D4B-AF5C-4BD34203350B@apple.com> On Sep 3, 2010, at 10:08 PM, Cedric Vivier wrote: > On Sat, Sep 4, 2010 at 09:40, Kenneth Russell wrote: > On Fri, Sep 3, 2010 at 10:16 AM, Chris Marrin wrote: > > > > I've revised the event section (5.14). Please review.
> > Sorry for not realizing this before, but the "NOT_AVAILABLE" status > code seems pretty useless, because if the web browser is so old that > it doesn't support WebGL, then it definitely won't support delivery of > the webglcontextcreationerror event. > > > NOT_AVAILABLE "WebGL is not implemented or not enabled in this browser" > > I think the current meaning is confusing (and indeed useless). > IIRC, the original intent was for this status code's meaning to be something like: > > "WebGL is currently disabled or temporarily unavailable because of high resource usage by other tabs/programs." I think that's another status code. Perhaps reasons for failure are: - WebGL is not implemented (the event will never fire) - WebGL is disabled by the browser - WebGL is disabled by request of the user - Hardware is insufficient for running WebGL content - System is unable to run WebGL content because of other system constraints. Maybe it is a bad idea to have status codes. There could be many more reasons for not being able to use WebGL content. So maybe we should do as Ken suggests and only have a status message. That way the user agent can put any necessary recommendations into the string. The browsers would have L10N issues, but that is probably manageable. The unfortunate thing is that, if you don't have WebGL support at all, this event will never fire. You will simply get a null return from getContext(). You'd have to set up a timer, and if it fires without any event being generated you know you don't have WebGL. That's a hack. I think the right way to handle that is to add a media query for WebGL. We've made a proposal to add media queries for CSS animations, transitions, and transforms. These are all supported in WebKit today. It would be easy to add one for 'webgl'. You can run the query from JS and know you don't have WebGL without having to call getContext(). But the really nice thing about media queries is that you can use them in CSS style sheets.
If WebGL is missing you can style the page differently to, for instance, not take up the space for the WebGL canvas. So the proposal is to get rid of statusCode and add a media query for 'webgl'. I will talk to dino, who is pushing the media query extensions spec, about this as well. ----- ~Chris cmarrin...@ -------------- next part -------------- An HTML attachment was scrubbed... URL: From cma...@ Sat Sep 4 08:40:25 2010 From: cma...@ (Chris Marrin) Date: Sat, 04 Sep 2010 08:40:25 -0700 Subject: [Public WebGL] Gamma correction and texImage2D/texSubImage2D In-Reply-To: References: <6C663C21-FEE2-4CD6-852A-3EE4C28457B1@apple.com> Message-ID: On Sep 3, 2010, at 9:54 PM, Cedric Vivier wrote: > On Fri, Sep 3, 2010 at 23:51, Chris Marrin wrote: > AFAIK, gamma correction is done to make images look right on the selected display. It has nothing to do with data in the source image. I believe some images might have color correction information in them, but that's different from gamma correction. > > I think this contradicts the related paragraph in the canvas 2D context spec : > http://www.whatwg.org/specs/web-apps/current-work/multipage/the-canvas-element.html#color-spaces-and-color-correction > > Canvas 2D is clearly supposed to perform gamma correction only on images that have their own color correction information, I assume WebGL should only do color/gamma correction when unpacking textures under the same rule. > This would actually render the UNPACK_* parameter almost useless as it could (and probably should) be the default. If developers do not want gamma correction they just have to use images without color correction information in them (which would already be the case for any non-diffuse texture anyways). I'd really like to avoid the term "gamma correction" because I don't think it's correct. It's a term used to describe a color space conversion used to adapt to the nonlinearities of displays.
That correction will happen whether we want it to or not, after we place pixels into the WebGL canvas. I think Ollie's picture is correct, and is the concept used by the 2D canvas. You get a chance to do color space conversion of incoming images, and again as the canvas is composited. I hope we are only talking about the former. I don't think we should be giving the option of changing how color space conversion is done in the compositor. We should simply define what the color space of the WebGL canvas is. I believe we have two reasonable choices for the format in the canvas: sRGB, which is what the 2D Canvas uses, and linear. With sRGB, we match what the 2D canvas does. But it seems like using that would cause issues when combining pixels with alpha blending etc. So maybe a linear color space is better. Converting between linear and sRGB is easy. If the compositor expects sRGB and our canvas is linear, we just need to do a gamma function to convert it (apply a gamma of 2.2 according to one website). I believe the default image format should match the canvas format. If we choose a linear canvas then images should be linear. If the incoming image is sRGB, we need to convert it. Again, going from sRGB to linear is a simple conversion. One final issue is what color space pixels are in when they are read back, either with toDataURL() or readPixels(). This issue also appears indirectly when using HTMLCanvasElement with WebGL content as the source for a 2D Canvas drawImage() call. It would be really nice to match what 2D does just to make all these issues simpler. If the WebGL canvas is sRGB, then it composites the same as 2D Canvas, toDataURL() works the same, and readPixels() returns sRGB, which is what the 2D Canvas getImage() call returns. Does doing that complicate the rendering? ----- ~Chris cmarrin...@ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ste...@ Sat Sep 4 11:09:46 2010 From: ste...@ (Steve Baker) Date: Sat, 04 Sep 2010 13:09:46 -0500 Subject: [Public WebGL] Gamma correction and texImage2D/texSubImage2D In-Reply-To: References: <6C663C21-FEE2-4CD6-852A-3EE4C28457B1@apple.com> Message-ID: <4C828B6A.3040501@sjbaker.org> So can we agree on this? 1) The WebGL color space shall be clearly defined to be a linear color space. This is essential for things like cross-platform shader code compatibility - and it's what all GPU's do anyway so it's no extra imposition. 2) Textures that are loaded into WebGL have an *optional* conversion from the color space of the image file into linear color space and where the color space of the file is ill-defined, it shall be assumed to be sRGB with a gamma of 2.2. This implies a need to reverse-gamma correct formats like JPEG and some careful reading of the PNG and GIF specifications to see how the color spaces of those files are described. But no matter what, we allow the application to disable this conversion on a per-file basis. 3) It is essential for WebGL applications to be able to render an image in linear color space and to subsequently use that image as a linear color space texture with no additional processing steps. There has to be a high-efficiency, zero-messing-around-with-my-data path for render-to-texture. Since we're going from linear to linear color spaces, that's not a tough proposition. 4) There is an *optional* color space conversion step when reading back canvas data into WebGL as a texture if the canvas is not already in a linear color space. Since WebGL canvasses and textures are always linear, this cannot (by definition) interfere with (3). But it may result in sRGB to linear conversions when reading back other kinds of canvas images...unless the application disables that. 5) Final color space conversion of a WebGL canvas to the *device* color space is a clearly specified *non-optional* requirement. 
This processing happens in a manner that never interferes with (3) or (4). Gamma correction happens in the compositor - or if we're printing the page. The gamma will probably be nailed at 2.2, but it could be something that the end user might want to adjust. For printing, this color space conversion might even be into CMYK - but the point is that the application is oblivious of this. 6) Steve shall endeavor not to get so outraged about such things in the future. ...and especially, to avoid upsetting Chris...sorry! I think that covers all the bases. -- Steve Chris Marrin wrote: > > On Sep 3, 2010, at 9:54 PM, Cedric Vivier wrote: > >> On Fri, Sep 3, 2010 at 23:51, Chris Marrin > > wrote: >> >> AFAIK, gamma correction is done to make images look right on the >> selected display. It has nothing to do with data in the source >> image. I believe some images might have color correction >> information in them, but that's different from gamma correction. >> >> >> I think this contradicts the related paragraph in the canvas 2D >> context spec : >> http://www.whatwg.org/specs/web-apps/current-work/multipage/the-canvas-element.html#color-spaces-and-color-correction >> >> Canvas 2D is clearly supposed to perform gamma correction only on >> images that have their own color correction information, I assume >> WebGL should only do color/gamma correction when unpacking textures >> under the same rule. >> This would actually render the UNPACK_* parameter almost useless as >> it could (and probably should) be the default. If developers do not >> want gamma correction they just have to use images without color >> correction information in them (which would already be the case for >> any non-diffuse texture anyways). > > I'd really like to avoid the term "gamma correction" because I don't > think it's correct. It's a term used to describe a color space > conversion used to adapt to the nonlinearities of displays. 
That > correction will happen whether we want it to or not, after we place > pixels into the WebGL canvas. I think Ollie's picture is correct, and > is the concept used by the 2D canvas. > > You get a chance to do color space conversion of incoming images, and > again as the canvas is composited. I hope we are only talking about > the former. I don't think we should be giving the option of changing > how color space conversion is done in the compositor. We should simply > define what the color space of the WebGL canvas is. I believe we have > two reasonable choices for the format in the canvas: sRGB, which is > what the 2D Canvas uses, and linear. With sRGB, we match what the 2D > canvas does. But it seems like using that would cause issues when > combining pixels with alpha blending etc. So maybe a linear color > space is better. > > Converting between linear and sRGB is easy. If the compositor expects > sRGB and our canvas is linear, we just need to do a gamma function to > convert it (apply a gamma of 2.2 according to one website). > > I believe the default image format should match the canvas format. If > we choose a linear canvas then images should be linear. If the > incoming image is sRGB, we need to convert it. Again, going from sRGB > to linear is a simple conversion. > > One final issue is what color space pixels are in when they are read > back, either with toDataURL() or readPixels(). This issue also appears > indirectly when using HTMLCanvasElement with WebGL content as the > source for a 2D Canvas drawImage() call. > > It would be really nice to match what 2D does just to make all these > issues simpler. If the WebGL canvas is sRGB, then it composites the > same as 2D Canvas, toDataURL() works the same, and readPixels() > returns sRGB, which is what the 2D Canvas getImage() call returns. > Does doing that complicate the rendering? 
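[Editorial note: the alpha-blending concern quoted above can be made concrete with a little arithmetic. A sketch using the standard sRGB transfer function; nothing here is proposed API.]

```javascript
// Blend 50% white over black. Doing the arithmetic directly on
// sRGB-encoded values and doing it in linear light give visibly
// different results.
function srgbToLinear(c) {
  return c <= 0.04045 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
}
function linearToSrgb(c) {
  return c <= 0.0031308 ? c * 12.92 : 1.055 * Math.pow(c, 1 / 2.4) - 0.055;
}

const alpha = 0.5, white = 1.0, black = 0.0;

// Naive: blend the sRGB-encoded values themselves.
const naive = alpha * white + (1 - alpha) * black; // 0.5 in sRGB code

// Correct: decode, blend in linear light, re-encode.
const mixed = alpha * srgbToLinear(white) + (1 - alpha) * srgbToLinear(black);
const correct = linearToSrgb(mixed); // ~0.735 in sRGB code
```

The same arithmetic is what makes texture filtering and image downscaling colorspace-sensitive, which is the effect behind the scaling article cited earlier in the thread.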
> > ----- > ~Chris > cmarrin...@ > > > > ----------------------------------------------------------- You are currently subscribed to public_webgl...@ To unsubscribe, send an email to majordomo...@ with the following command in the body of your email: From oli...@ Sat Sep 4 12:02:37 2010 From: oli...@ (Oliver Hunt) Date: Sat, 4 Sep 2010 12:02:37 -0700 Subject: [Public WebGL] Gamma correction and texImage2D/texSubImage2D In-Reply-To: <4C828B6A.3040501@sjbaker.org> References: <6C663C21-FEE2-4CD6-852A-3EE4C28457B1@apple.com> <4C828B6A.3040501@sjbaker.org> Message-ID: <544A72DE-A0C6-41CC-90F2-75D57B75ED46@apple.com> On Sep 4, 2010, at 11:09 AM, Steve Baker wrote: > 2) Textures that are loaded into WebGL have an *optional* conversion > from the color space of the image file into linear color space and where > the color space of the file is ill-defined, it shall be assumed to be > sRGB with a gamma of 2.2. > This implies a need to reverse-gamma correct formats like JPEG and some > careful reading of the PNG and GIF specifications to see how the color > spaces of those files are described. But no matter what, we allow the > application to disable this conversion on a per-file basis. Gamma is not a separate property -- it is part of the colour profile, when we say sRGB that colour profile is the entirety of the information necessary -- if gamma were not a feature of the colour profile it would be possible to request something like "Linear RGB with a gamma of 2.2" which is clearly not sensible :D I recommend that we drop references to "gamma" as it only confuses things. I also believe colour matching of textures should be the default, with an optional override. > 3) It is essential for WebGL applications to be able to render an image > in linear color space and to subsequently use that image as a linear > color space texture with no additional processing steps. > There has to be a high-efficiency, zero-messing-around-with-my-data path > for render-to-texture. 
Since we're going from linear to linear color > spaces, that's not a tough proposition. This is purely an implementation detail and doesn't warrant being in the spec. Any implementation that does not do this efficiently is going to appear slower than other implementations and hopefully it's apparent that UA vendors don't like appearing to be slower than other vendors. > > 4) There is an *optional* color space conversion step when reading back > canvas data into WebGL as a texture if the canvas is not already in a > linear color space. > Since WebGL canvasses and textures are always linear, this cannot (by > definition) interfere with (3). But it may result in sRGB to linear > conversions when reading back other kinds of canvas images...unless the > application disables that. I'm not sure what you mean here. > > 5) Final color space conversion of a WebGL canvas to the *device* color > space is a clearly specified *non-optional* requirement. This > processing happens in a manner that never interferes with (3) or (4). > Gamma correction happens in the compositor - or if we're printing the > page. The gamma will probably be nailed at 2.2, but it could be > something that the end user might want to adjust. For printing, this > color space conversion might even be into CMYK - but the point is that > the application is oblivious of this. This also doesn't need to be in the spec. 
--Oliver ----------------------------------------------------------- You are currently subscribed to public_webgl...@ To unsubscribe, send an email to majordomo...@ with the following command in the body of your email: From tu...@ Sat Sep 4 15:38:17 2010 From: tu...@ (Thatcher Ulrich) Date: Sat, 4 Sep 2010 18:38:17 -0400 Subject: [Public WebGL] Gamma correction and texImage2D/texSubImage2D In-Reply-To: <4C828B6A.3040501@sjbaker.org> References: <6C663C21-FEE2-4CD6-852A-3EE4C28457B1@apple.com> <4C828B6A.3040501@sjbaker.org> Message-ID: On Sat, Sep 4, 2010 at 2:09 PM, Steve Baker wrote: > So can we agree on this? > > 1) The WebGL color space shall be clearly defined to be a linear color > space. > This is essential for things like cross-platform shader code > compatibility - and it's what all GPU's do anyway so it's no extra > imposition. Hm. I'm uneasy about this. AFAIK OpenGL doesn't mandate a color space. By default it defines its filtering operations *as if* everything is linear, but in practice, for framebuffer formats with 8-bit color components, the output is generally treated as sRGB. That means the typical colorspace-ignorant app is taking sRGB input data, filtering it as if it is linear, and putting the results in an sRGB framebuffer to be displayed on an sRGB monitor. Status quo. If you care about doing your lighting and filtering in true linear space, then you need to take some special measures (i.e. figure out how to get linear data out of textures with acceptable fidelity, and do a final linear-to-sRGB step when outputting to a framebuffer with 8-bit components). WebGL should not diverge from OpenGL here -- the same special measures should apply to both WebGL and OpenGL. WebGL definitely should not do any automatic or default color space conversion that OpenGL doesn't do. In particular, converting from 8-bit-sRGB to 8-bit-linearRGB loses fidelity, so it's not something that should ever be done without somebody asking for it specifically. 
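[Editorial note: the fidelity loss described above is easy to quantify. An illustrative sketch using the standard sRGB decode; the function names are ours.]

```javascript
// Re-encode every 8-bit sRGB code as an 8-bit *linear* value and count
// how many distinct values survive. The dark end collapses badly,
// which is why the conversion should never happen silently.
function srgbToLinear(c) {
  return c <= 0.04045 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
}

function srgb8ToLinear8(v) {
  return Math.round(255 * srgbToLinear(v / 255));
}

const survivors = new Set();
for (let v = 0; v <= 255; v++) survivors.add(srgb8ToLinear8(v));
// survivors.size < 256: many dark sRGB codes map to the same linear
// code; e.g. sRGB codes 0 through 6 all become linear 0.
```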
I believe EXT_texture_sRGB is designed to make it easier to be "linear correct" under these circumstances. It basically allows you to tell OpenGL that your 8-bit texture data is in sRGB format, and that you want OpenGL to do an 8-bit-sRGB to higher-precision-linear-RGB conversion when sampling the data from a shader. Your shader will operate on linear data (in floating point format), and the frame buffer is still your shader's problem. There are two important things for WebGL to do: A) Clearly specify how the output is being treated, and 2d-canvas leads the way here. The output is sRGB. Everything after that is the browser's problem. B) Implement EXT_texture_sRGB sometime soon. (IMO doesn't need to be in 1.0) I disagree with some of the other things Steve has said or implied, but I'll wait to elaborate until I hear what I got wrong above :) -T > 2) Textures that are loaded into WebGL have an *optional* conversion > from the color space of the image file into linear color space and where > the color space of the file is ill-defined, it shall be assumed to be > sRGB with a gamma of 2.2. > This implies a need to reverse-gamma correct formats like JPEG and some > careful reading of the PNG and GIF specifications to see how the color > spaces of those files are described. But no matter what, we allow the > application to disable this conversion on a per-file basis. > > 3) It is essential for WebGL applications to be able to render an image > in linear color space and to subsequently use that image as a linear > color space texture with no additional processing steps. > There has to be a high-efficiency, zero-messing-around-with-my-data path > for render-to-texture. Since we're going from linear to linear color > spaces, that's not a tough proposition. > > 4) There is an *optional* color space conversion step when reading back > canvas data into WebGL as a texture if the canvas is not already in a > linear color space.
> Since WebGL canvasses and textures are always linear, this cannot (by > definition) interfere with (3). But it may result in sRGB to linear > conversions when reading back other kinds of canvas images...unless the > application disables that. > > 5) Final color space conversion of a WebGL canvas to the *device* color > space is a clearly specified *non-optional* requirement. This > processing happens in a manner that never interferes with (3) or (4). > Gamma correction happens in the compositor - or if we're printing the > page. The gamma will probably be nailed at 2.2, but it could be > something that the end user might want to adjust. For printing, this > color space conversion might even be into CMYK - but the point is that > the application is oblivious of this. > > 6) Steve shall endeavor not to get so outraged about such things in the > future. > ...and especially, to avoid upsetting Chris...sorry! > > I think that covers all the bases. > > -- Steve > > > Chris Marrin wrote: >> >> On Sep 3, 2010, at 9:54 PM, Cedric Vivier wrote: >> >>> On Fri, Sep 3, 2010 at 23:51, Chris Marrin >> > wrote: >>> >>> AFAIK, gamma correction is done to make images look right on the >>> selected display. It has nothing to do with data in the source >>> image. I believe some images might have color correction >>> information in them, but that's different from gamma correction. >>> >>> >>> I think this contradicts the related paragraph in the canvas 2D >>> context spec : >>> http://www.whatwg.org/specs/web-apps/current-work/multipage/the-canvas-element.html#color-spaces-and-color-correction >>> >>> Canvas 2D is clearly supposed to perform gamma correction only on >>> images that have their own color correction information, I assume >>> WebGL should only do color/gamma correction when unpacking textures >>> under the same rule. >>> This would actually render the UNPACK_* parameter almost useless as >>> it could (and probably should) be the default.
If developers do not >>> want gamma correction they just have to use images without color >>> correction information in them (which would already be the case for >>> any non-diffuse texture anyways). >> >> I'd really like to avoid the term "gamma correction" because I don't >> think it's correct. It's a term used to describe a color space >> conversion used to adapt to the nonlinearities of displays. That >> correction will happen whether we want it to or not, after we place >> pixels into the WebGL canvas. I think Ollie's picture is correct, and >> is the concept used by the 2D canvas. >> >> You get a chance to do color space conversion of incoming images, and >> again as the canvas is composited. I hope we are only talking about >> the former. I don't think we should be giving the option of changing >> how color space conversion is done in the compositor. We should simply >> define what the color space of the WebGL canvas is. I believe we have >> two reasonable choices for the format in the canvas: sRGB, which is >> what the 2D Canvas uses, and linear. With sRGB, we match what the 2D >> canvas does. But it seems like using that would cause issues when >> combining pixels with alpha blending etc. So maybe a linear color >> space is better. >> >> Converting between linear and sRGB is easy. If the compositor expects >> sRGB and our canvas is linear, we just need to do a gamma function to >> convert it (apply a gamma of 2.2 according to one website). >> >> I believe the default image format should match the canvas format. If >> we choose a linear canvas then images should be linear. If the >> incoming image is sRGB, we need to convert it. Again, going from sRGB >> to linear is a simple conversion. >> >> One final issue is what color space pixels are in when they are read >> back, either with toDataURL() or readPixels(). This issue also appears >> indirectly when using HTMLCanvasElement with WebGL content as the >> source for a 2D Canvas drawImage() call. 
>> >> It would be really nice to match what 2D does just to make all these >> issues simpler. If the WebGL canvas is sRGB, then it composites the >> same as 2D Canvas, toDataURL() works the same, and readPixels() >> returns sRGB, which is what the 2D Canvas getImageData() call returns. >> Does doing that complicate the rendering? >> >> ----- >> ~Chris >> cmarrin...@ From ste...@ Sat Sep 4 18:45:12 2010 From: ste...@ (Steve Baker) Date: Sat, 04 Sep 2010 20:45:12 -0500 Subject: [Public WebGL] Gamma correction and texImage2D/texSubImage2D In-Reply-To: References: <6C663C21-FEE2-4CD6-852A-3EE4C28457B1@apple.com> <4C828B6A.3040501@sjbaker.org> Message-ID: <4C82F628.1030906@sjbaker.org> Thatcher Ulrich wrote: > On Sat, Sep 4, 2010 at 2:09 PM, Steve Baker wrote: > >> So can we agree on this? >> >> 1) The WebGL color space shall be clearly defined to be a linear color >> space. >> This is essential for things like cross-platform shader code >> compatibility - and it's what all GPU's do anyway so it's no extra >> imposition. >> > > Hm. I'm uneasy about this. AFAIK OpenGL doesn't mandate a color > space. By default it defines its filtering operations *as if* > everything is linear, but in practice, for framebuffer formats with > 8-bit color components, the output is generally treated as sRGB. That > means the typical colorspace-ignorant app is taking sRGB input data, > filtering it as if it is linear, and putting the results in an sRGB > framebuffer to be displayed on an sRGB monitor. Status quo. 
> > If you care about doing your lighting and filtering in true linear > space, then you need to take some special measures (i.e. figure out > how to get linear data out of textures with acceptable fidelity, and > do a final linear-to-sRGB step when outputting to a framebuffer with > 8-bit components). WebGL should not diverge from OpenGL here -- the > same special measures should apply to both WebGL and OpenGL. > OpenGL can afford not to explicitly state the color space because it steadfastly does not involve itself with how files are loaded or how the windowing system treats the output. It does such-and-such arithmetic on such-and-such data without ever saying what the data is. (Although some parts of the spec make a very strong implication that it's linear - and most of the examples in the RedBook and OrangeBook rely on that being true). /*HOWEVER:*/ WebGL includes file loading operations and ties itself very firmly to one particular output mechanism (the canvas system) which in turn ties down the nature of output operations. By adding that into the OpenGL spec, we've opened this particular can of worms...hence this entire issue. What I'm advocating doesn't prevent people from being blissfully ignorant of all of the gamma/color-space issues - they can turn off all of the file loader color space conversion and the consequences will be very much what they are if you don't take care to do it right in OpenGL on the desktop. You get kinda crappy graphics that you can sorta get away with unless you're trying to make a really accurate simulation or a gorgeous-looking AAA games title. It's the difference between winding up with something that looks like Mario64 - or something that looks like Red Dead Redemption. I agree that there are undoubtedly going to be issues with roundoff error and precision. 
In the short term, I'll do what I've always done - instructed my artists to paint textures in 'linear' color space and to have their monitors gamma color-calibrated every six months to ensure that they're seeing those textures optimally. That's what professional graphics organizations do. One day, we'll finally see the end of 16 bit colors and less use of 24 bit colors - and start to use 36, 48 or even floating point color for high dynamic range rendering (I do this already in some applications). Since WebGL will still be around then, we need to get the specification right now. HDR lighting shows up these kinds of problems in sharp relief - and almost all 'realistic' 3D games use it - it's the way of the future for sure. When colors lie outside the 0..1 range, you really can't do things with sRGB. So this is important and we can't just dismiss the matter by saying - "Well, OpenGL doesn't do it". OpenGL doesn't have to do it...we do. > WebGL definitely should not do any automatic or default color space > conversion that OpenGL doesn't do. In particular, converting from > 8-bit-sRGB to 8-bit-linearRGB loses fidelity, so it's not something > that should ever be done without somebody asking for it specifically. > OpenGL doesn't do it because it doesn't load files. WebGL must do it because it provides file loaders to the application. I agree that we mustn't force people to accept conversions - I've been very careful to say that they must be optional. I would actually prefer that they were disabled by default - but I can live with having them enabled by default if that's the consensus here. Perhaps the best thing is not to have a default at all - to require a non-optional token in the file loader command that says either "I want color-space conversions" or "I don't want conversions". If there is no default, everyone has to think carefully about what they want. But I don't particularly care. So long as I can choose to turn them off, I'm happy. 
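To make the "no default at all" idea concrete, here is a rough JavaScript sketch of the shape such an API could take. Every name in it is hypothetical - nothing here is proposed spec text, it just models "the caller must choose, or the call fails":

```javascript
// Hypothetical sketch only: a texture-upload option with NO default.
// Neither constant below exists in any WebGL draft; they are placeholders.
const COLORSPACE_CONVERT = "convert-to-linear";
const COLORSPACE_RAW = "raw";

function planTextureUpload(options) {
  // Refuse to guess: an explicit choice is required, so every
  // application author has to think about color space at least once.
  if (options.colorSpace !== COLORSPACE_CONVERT &&
      options.colorSpace !== COLORSPACE_RAW) {
    throw new Error(
      "colorSpace must be explicitly 'convert-to-linear' or 'raw'");
  }
  // The loader would run (or skip) its file-to-linear conversion
  // pass based on this flag.
  return { convert: options.colorSpace === COLORSPACE_CONVERT };
}
```

An implementation would then perform its color-space conversion only when `convert` is true; image data uploaded with `raw` would reach texture RAM verbatim.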
> I believe EXT_texture_sRGB is designed to make it easier to be "linear > correct" under these circumstances. It basically allows you to tell > OpenGL that your 8-bit texture data is in sRGB format, and that you > want OpenGL to do an 8-bit-sRGB to higher-precision-linear-RGB conversion > when sampling the data from a shader. Your shader will operate on > linear data (in floating point format), and the frame buffer is still > your shader's problem. > Yes - but it's an extension that we can't rely on always being implemented (I doubt ANGLE could emulate it either). Also the extension is poorly written - it gives the implementation the choice to do interpolation linearly instead of 'correctly' for sRGB - and provides no way for the application to tell whether it's done right or not. -- Steve From tu...@ Sun Sep 5 03:56:09 2010 From: tu...@ (Thatcher Ulrich) Date: Sun, 5 Sep 2010 06:56:09 -0400 Subject: [Public WebGL] Gamma correction and texImage2D/texSubImage2D In-Reply-To: <4C82F628.1030906@sjbaker.org> References: <6C663C21-FEE2-4CD6-852A-3EE4C28457B1@apple.com> <4C828B6A.3040501@sjbaker.org> <4C82F628.1030906@sjbaker.org> Message-ID: On Sat, Sep 4, 2010 at 9:45 PM, Steve Baker wrote: > > I agree that there are undoubtedly going to be issues with roundoff > error and precision. In the short term, I'll do what I've always done - > instructed my artists to paint textures in 'linear' color space and to > have their monitors gamma color-calibrated every six months to ensure > that they're seeing those textures optimally. What format & depth are your artists saving their work in? And then what texture format(s) do you convert to? How are you planning to get this data into WebGL? 
There are some relevant constraints on WebGL: * browsers (currently) work with 8-bit sRGB_Alpha color buffers, so that's the format that WebGL output ends up in. I don't think WebGL 1.0 can realistically spec anything else for output. (In the future, perhaps browsers will be able to handle linear color spaces at higher bit depths, but I don't know if anybody is seriously considering that yet, so I doubt there's any point in spec'ing anything in WebGL 1.0.) * WebGL has texImage2D calls that take raw buffers of data. Currently (default) behavior passes this data straight to the texture formats, so any automatic gamma treatment would be a change. * browsers can load image formats in PNG and JPEG, which are most typically 8-bit sRGB. WebGL behavior when using these formats is definitely worth spec'ing. * PNG has the ability to specify a per image gamma value (the gAMA thing referenced earlier). Browsers appear to handle this differently. see this reference page, at "Images with gamma chunks": http://www.libpng.org/pub/png/pngsuite.html On the Mac I'm using right now, Chrome and Safari do not do gamma correction, while Firefox does. You can also clearly see the quantization errors in the Firefox images with the lower gamma values. The Chrome and Safari behavior is (arguably) a bug. * PNG has the ability to store 16-bit color depth. However, my understanding is that current browsers take all input images (including PNG images with 16-bit color depth) and convert them to sRGB_Alpha internally, before WebGL has a chance to see the data. Also, the WebGL spec does not appear to have any texture formats that have more than 8 bits per color component. This would be a great thing to improve, post WebGL 1.0, since hi-fi WebGL apps could make good use of it. It seems to me there are two unresolved questions for WebGL 1.0 1) Should WebGL attempt to nail down how browsers are supposed to handle PNG's with a non-default gAMA value? 
Viable options here are: a) leave it up to the browser (status quo, but behavior may differ among browsers; in practice apps will have to supply sRGB data and do any additional conversions themselves). b) demand conversion to sRGB_Alpha based on PNG metadata (i.e. converge on current Firefox behavior, however non-sRGB behavior will be a corner case for browsers and smart WebGL developers may opt to always supply sRGB data and do any conversions themselves) c) demand passing raw data straight through. 16-bit components would be rounded or truncated to 8-bit. (i.e. converge on current WebKit behavior, similar caveats as option b) 2) Should WebGL add a PixelStore option that does some kind of gamma conversion? (Where the thread started.) IMO the status quo (do nothing) is pretty much fine. Apps that want a specific interpretation of their image data can either pre-process it so the raw data matches what they want in texture RAM, or else do custom processing via GPU with the existing facilities. >> I believe EXT_texture_sRGB is designed to make it easier to be "linear >> correct" under these circumstances. It basically allows you to tell >> OpenGL that your 8-bit texture data is in sRGB format, and that you >> want OpenGL to do an 8-bit-sRGB to higher-precision-linear-RGB conversion >> when sampling the data from a shader. Your shader will operate on >> linear data (in floating point format), and the frame buffer is still >> your shader's problem. >> > Yes - but it's an extension that we can't rely on always being > implemented (I doubt ANGLE could emulate it either). It can be trivially implemented on any hardware that has an internal texture format with at least 12 bits per component; just convert the data to linear and store in the higher-depth format. The practical problem is that it wastes texture RAM, hence the preference to have lookup tables in the GPU. 
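To sanity-check the precision claim, here is a small JavaScript sketch of that emulation path (plain math, nothing WebGL-specific; function names are mine), using the standard sRGB transfer function. At 8 bits of linear storage, distinct dark sRGB codes collapse together; at 12 bits they stay distinct, which is the reason for the "at least 12 bits per component" requirement:

```javascript
// Sketch (not spec text): CPU-side emulation of an sRGB texture decode -
// convert each 8-bit sRGB code to linear, then re-quantize at a chosen depth.

// Standard sRGB transfer function: 8-bit code -> linear value in [0, 1].
function srgbToLinear(s8) {
  const s = s8 / 255;
  return s <= 0.04045 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
}

// Quantize a linear value to an n-bit unsigned integer code.
function quantize(linear, bits) {
  const maxCode = (1 << bits) - 1;
  return Math.round(linear * maxCode);
}

// At 8 bits of linear storage, sRGB codes 1 and 2 both land on 0 -
// the dark end of the ramp is destroyed. At 12 bits they survive
// as the distinct codes 1 and 2.
```

This is why a straight 8-bit-sRGB to 8-bit-linear conversion loses fidelity, while storing the decoded result in a wider format does not.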
-T From ste...@ Sun Sep 5 04:51:17 2010 From: ste...@ (stephen white) Date: Sun, 5 Sep 2010 21:21:17 +0930 Subject: [Public WebGL] Gamma correction and texImage2D/texSubImage2D In-Reply-To: References: <6C663C21-FEE2-4CD6-852A-3EE4C28457B1@apple.com> <4C828B6A.3040501@sjbaker.org> <4C82F628.1030906@sjbaker.org> Message-ID: <899F97BE-66B9-4B7B-9142-8B0F26A802EA@adam.com.au> On 05/09/2010, at 8:26 PM, Thatcher Ulrich wrote: > It seems to me there are two unresolved questions for WebGL 1.0 I'm not sure that my point was understood, so I'll try again... When a browser composites an image onto the page, that image is the entire block of pixels. The browser can have simple rules because of that. When WebGL draws to its canvas, it is using a number of images within the block of pixels. Therefore these images may each have different colour spaces and/or gammas. The additional complexity comes from multiple images used together, which is not a problem that simply displaying an image had to worry about. As far as I can work this out, Steve Baker's suggestion of an option to reverse-map back to a linear colour space is the only mathematically valid option. In practical terms, this would work out to a "WEBGL_MAP_TO_LINEAR" loading time option that doesn't do any work for linear PNGs but does fix up colour space JPGs. If the option isn't set, then the JPGs are not touched and it's up to the programmer to do what they think is best. 
-- steve...@ From cma...@ Sun Sep 5 07:31:25 2010 From: cma...@ (Chris Marrin) Date: Sun, 05 Sep 2010 07:31:25 -0700 Subject: [Public WebGL] Gamma correction and texImage2D/texSubImage2D In-Reply-To: <899F97BE-66B9-4B7B-9142-8B0F26A802EA@adam.com.au> References: <6C663C21-FEE2-4CD6-852A-3EE4C28457B1@apple.com> <4C828B6A.3040501@sjbaker.org> <4C82F628.1030906@sjbaker.org> <899F97BE-66B9-4B7B-9142-8B0F26A802EA@adam.com.au> Message-ID: <00FC31AB-675C-4705-8780-AD7D210532A8@apple.com> On Sep 5, 2010, at 4:51 AM, stephen white wrote: > On 05/09/2010, at 8:26 PM, Thatcher Ulrich wrote: >> It seems to me there are two unresolved questions for WebGL 1.0 > > > I'm not sure that my point was understood, so I'll try again... > > When a browser composites an image onto the page, that image is the entire block of pixels. The browser can have simple rules because of that. > > When WebGL draws to its canvas, it is using a number of images within the block of pixels. Therefore these images may each have different colour spaces and/or gammas. > > The additional complexity comes from multiple images used together, which is not a problem that simply displaying an image had to worry about. > > As far as I can work this out, Steve Baker's suggestion of an option to reverse-map back to a linear colour space is the only mathematically valid option. > > In practical terms, this would work out to a "WEBGL_MAP_TO_LINEAR" loading time option that doesn't do any work for linear PNGs but does fix up colour space JPGs. > > If the option isn't set, then the JPGs are not touched and it's up to the programmer to do what they think is best. 
It's true that native OpenGL apps can make any decisions they want about the color space of incoming images, the represented color space of the drawing buffer, and the conversions done to the drawing buffer on display. But we have to be well defined on all aspects of color space management. That doesn't mean we have to be "correct" (if there is such a thing). We just have to be consistent. Given the text in http://www.opengl.org/registry/specs/EXT/texture_sRGB.txt, it seems as though (at least most recently) OpenGL is assuming today's images are coming in linear. I think that's a bad assumption since, as Ollie points out, WebKit assumes JPEG images without a color profile to be sRGB. But I'm ready to accept that we should break with the 2D Canvas' tradition of representing the canvas as sRGB and maintain the WebGL canvas as linear. I'd like to get the discussion more focused. We have to decide several things: 1) What color space is the WebGL canvas? As I said, I agree that the drawing buffer should be considered to be linear. This implies that the drawing buffer will have to be color corrected when composited with the rest of the HTML page. This also implies that, when using a WebGL canvas as the source image for a 2D Canvas drawImage() operation, it has to be color corrected into sRGB space, since that's what 2D Canvas expects. 2) What is the default incoming color space of images as textures, and what are the options to change that? I believe the choices are: unchanged (I will call that raw), linear and sRGB. I think we need to support raw and linear. If and when we support EXT_texture_sRGB, we will get the ability to support incoming sRGB images. This decision really concerns me. WebKit considers images without a color profile to be sRGB, but OpenGL (by statements in the EXT_texture_sRGB spec) assumes images without a profile to be linear. Which assumption do we make? 
Do we color correct all incoming images into linear space (assuming they have all been color corrected into sRGB already)? Or do we bring in images without a color profile unchanged, and only correct images that have a defined color profile? If we do the former, we will have different visual results than the corresponding native OpenGL program. If we do the latter, we will get results that are inconsistent with the rest of the HTML page. 3) What is the color space of pixels coming out of the drawing buffer via toDataURL() and readPixels()? For consistency, I think toDataURL() should be color corrected into sRGB. But it seems more reasonable to leave the results of readPixels() in the same linear color space of the drawing buffer. This would have to be clearly spelled out. ----- ~Chris cmarrin...@ From ste...@ Sun Sep 5 07:46:16 2010 From: ste...@ (Steve Baker) Date: Sun, 05 Sep 2010 09:46:16 -0500 Subject: [Public WebGL] Gamma correction and texImage2D/texSubImage2D In-Reply-To: References: <6C663C21-FEE2-4CD6-852A-3EE4C28457B1@apple.com> <4C828B6A.3040501@sjbaker.org> <4C82F628.1030906@sjbaker.org> Message-ID: <4C83AD38.5050404@sjbaker.org> Thatcher Ulrich wrote: > On Sat, Sep 4, 2010 at 9:45 PM, Steve Baker wrote: > >> I agree that there are undoubtedly going to be issues with roundoff >> error and precision. In the short term, I'll do what I've always done - >> instructed my artists to paint textures in 'linear' color space and to >> have their monitors gamma color-calibrated every six months to ensure >> that they're seeing those textures optimally. >> > > What format & depth are your artists saving their work in? And then > what texture format(s) do you convert to? How are you planning to get > this data into WebGL? 
> Well - I have to answer this two ways - because I'm doing two jobs. * In my (paying) day job - I am the graphics lead for Total Immersion (www.totimm.com) - but I saw the same practices when I worked for Midway Games and in many games & simulation companies before that. I'm writing D3D (yuk!) graphics engines for 'serious games' for things like training firefighters and the pilots of unmanned drone aircraft and such. The artists use 8/8/8 or 8/8/8/8 PNG or TIFF (some GIS terrain tools use it) with a gamma of 1.0 (ie, linear color space) and we convert to DDS for loading into the game. DDS supports DXT1/3/5 compression and no-compression which gives us the choice to use lossy compression where space is critical. We also have the advantage of stating 'minimum hardware specs' which means we never have to deal with cellphones or hardware that doesn't support full floating point textures and shaders. One day we'd LOVE to be able to use WebGL to support this stuff - but we're far from there yet. Part of the reason I'm following this mailing list is that I want to be sure that WebGL could ultimately support serious simulation engines and run AAA quality video games. * In my spare time (hahahah!) I'm starting with a small, dedicated (and as yet unpaid) team to put together an experimental set of WebGL-based games to see whether we can make money using adverts and T-shirt sales and in-game revenue (pay us a dollar and get that neat weapon you always wanted!). We currently use 8/8/8 and 8/8/8/8 PNG with a gamma of 1.0 (ie linear color space) and do no further compression. Having looked carefully at ETC1, and kicked around some ideas with the guy at Ericsson who invented it, it's clear that it may prove useful for some kinds of data - but I'm still a little skeptical about it in general. There may be some super-devious shaderly tricks to make it do things it was never designed to do...but that's still a matter of investigation for me. 
In the latter case, my biggest concern is with platforms that may not support 8/8/8 or 8/8/8/8 formats internally and which may crunch them down to 5/6/5 or 5/5/5/1 or 4/4/4/4 formats internally. Since I use texture in 'non-traditional' ways - I'm having to get creative about only using the high order 4 bits in some situations. It's painful. So my best guess right now is that I'll be using the built-in file loaders with linear color space PNG. > There are some relevant constraints on WebGL: > > * browsers (currently) work with 8-bit sRGB_Alpha color buffers, so > that's the format that WebGL output ends up in. I don't think WebGL > 1.0 can realistically spec anything else for output. (In the future, > perhaps browsers will be able to handle linear color spaces at higher > bit depths, but I don't know if anybody is seriously considering that > yet, so I doubt there's any point in spec'ing anything in WebGL 1.0.) > That's great. > * WebGL has texImage2D calls that take raw buffers of data. Currently > (default) behavior passes this data straight to the texture formats, > so any automatic gamma treatment would be a change. > Yes - that is true. I think we need the specification to say that you CAN automatically convert color spaces when you do this - but not that you MUST. That's necessary, not least because with low end cellphone hardware, repeatedly converting back and forth between color spaces will introduce so much rounding error that it'll be unusable - so applications will want to disable that (even if it's "mathematically required") in order to get the lesser of two evils. 
However, IMHO, the specification needs to support (at least in theory) mathematical correctness and must never impose incorrectness because by the time this specification becomes obsolete, we'll probably have 12 or 16 bit integer and 16 or 32 bit floating point per component (I already use 32/32/32/32 for HDR lighting in some places in my "day job" graphics engine) and doing this conversion will ultimately be entirely painless. > * browsers can load image formats in PNG and JPEG, which are most > typically 8-bit sRGB. WebGL behavior when using these formats is > definitely worth spec'ing. > Yes. > * PNG has the ability to specify a per image gamma value (the gAMA > thing referenced earlier). Browsers appear to handle this > differently. See this reference page, at "Images with gamma chunks": > http://www.libpng.org/pub/png/pngsuite.html On the Mac I'm using > right now, Chrome and Safari do not do gamma correction, while Firefox > does. You can also clearly see the quantization errors in the Firefox > images with the lower gamma values. The Chrome and Safari behavior is > (arguably) a bug. > Yes, I agree. > * PNG has the ability to store 16-bit color depth. However, my > understanding is that current browsers take all input images > (including PNG images with 16-bit color depth) and convert them to > sRGB_Alpha internally, before WebGL has a chance to see the data. > Also, the WebGL spec does not appear to have any texture formats that > have more than 8 bits per color component. This would be a great > thing to improve, post WebGL 1.0, since hi-fi WebGL apps could make > good use of it. > Absolutely. Supporting (in particular) floating point textures would be a big win...but there are many desktop/laptop chipsets that can't do that - and I fear it'll be a good few years before cellphones can do that. But this is doable as an extension. 
The issues of color space correctness are the underpinnings of the specification and should be handled rigorously from the get-go because inserting them later would be tortuous and disruptive. > It seems to me there are two unresolved questions for WebGL 1.0 > > 1) Should WebGL attempt to nail down how browsers are supposed to > handle PNG's with a non-default gAMA value? Viable options here are: > > a) leave it up to the browser (status quo, but behavior may differ > among browser; in practice apps will have to supply sRGB data and do > any additional conversions themselves). > Not tolerable. Because textures are very often used for storing things other than images and writing our own texture loaders in JavaScript is ridiculous - doing any kind of automatic color space conversion without a way to turn it off would make WebGL useless for all but the simplest kinds of 3D graphics. Worse still, because JavaScript is slow, we have to push more work into the GPU than on a traditional OpenGL or D3D platform - that INCREASES the number of situations where we use texture "non-traditionally" in order to get good performance. If this were the choice - I'd stop work on my WebGL games. > b) demand conversion to sRGB_Alpha based on PNG metadata (i.e. > converge on current Firefox behavior, however non-sRGB behavior will > be a corner case for browsers and smart WebGL developers may opt to > always supply sRGB data and do any conversions themselves) > > c) demand passing raw data straight through. 16-bit components > would be rounded or truncated to 8-bit. (i.e. converge on current > WebKit behavior, similar caveats as option b) > d) Have the WebGL texture file loader convert whatever color space the file is in - to a uniform linear color space. 
With the option to turn that conversion off for files that contain non-traditional (non-image) data - and in cases where the roundoff error inherent in the conversion is not acceptable - or when the application knows that the source data is in linear color space regardless of what the file header happens to say. i.e. Support both and let the application decide. > 2) Should WebGL add a PixelStore option that does some kind of gamma > conversion? (Where the thread started.) IMO the status quo (do > nothing) is pretty much fine. Apps that want a specific > interpretation of their image data can either pre-process it so the > raw data matches what they want in texture RAM, or else do custom > processing via GPU with the existing facilities. > Since we've now established (I hope!) that the canvas spec requires that the linear color space WebGL canvas must be converted into 'device color space' at some point before it hits the screen (which for us probably means "it's gamma corrected in the compositor") - there is absolutely no reason to ever want to do this. Automatically converting linear-space textures into gamma space then converting the rendering results into gamma space would guarantee an ugly mess on the output. That's a waste of CPU time, implementation effort and it's mathematically indefensible. >>> I believe EXT_texture_sRGB is designed to make it easier to be "linear >>> correct" under these circumstances. It basically allows you to tell >>> OpenGL that your 8-bit texture data is in sRGB format, and that you >>> want OpenGL to do an 8-bit-sRGB to higher-precision-linear-RGB conversion >>> when sampling the data from a shader. Your shader will operate on >>> linear data (in floating point format), and the frame buffer is still >>> your shader's problem. >>> Yes - but it's an extension that we can't rely on always being >>> implemented (I doubt ANGLE could emulate it either). 
>>> > > It can be trivially implemented on any hardware that has an internal > texture format with at least 12 bits per component; just convert the > data to linear and store in the higher-depth format. The practical > problem is that it wastes texture RAM, hence the preference to have > lookup tables in the GPU. > But we're potentially operating with hardware that may not even support 8 bits per component let alone 12! So it certainly cannot be "trivially implemented" on all WebGL clients. Hence we certainly can't rely on it - and most certainly we can't write our specification based upon it! Worse still, implementing it simply with "lookup tables in the GPU" (which, I'll grant, the extension spec allows) means that the sRGB texels are linearly interpolated for GL_LINEAR and GL_LINEAR_MIPMAP_LINEAR textures - and that's wrong! When you minify a texture, the hardware is (in effect) calculating the contributions of four texels from each of two MIP levels. In linear color space, that's a simple lerp operation that's super-cheap to do in hardware - but in sRGB, doing that gives too much weight to the bright texels and not enough to the dark ones. The consequences are that the texture will alias along bright-to-dark transitions. I can't imagine many GPU manufacturers building proper sRGB interpolators into the highest bandwidth part of their engines - so I'll be surprised if many of them support this extension "the right way"...which makes it all but useless to people who want nice-looking graphics. 
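The size of that interpolation error is easy to demonstrate with a few lines of JavaScript (plain math, assuming the standard sRGB transfer function; function names are mine). Averaging a black and a white texel in sRGB code space and decoding afterwards gives roughly 0.214 in linear terms instead of the correct 0.5 - the dark texel is badly under-weighted:

```javascript
// Sketch: why filtering raw sRGB codes as if they were linear is wrong.

// Standard sRGB transfer function: 8-bit code -> linear value in [0, 1].
function srgbToLinear(s8) {
  const s = s8 / 255;
  return s <= 0.04045 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
}

const black = 0, white = 255;

// What naive GL_LINEAR does on raw sRGB codes: average first, decode later.
const filteredWrong = srgbToLinear((black + white) / 2); // ~0.214

// The mathematically correct filter: decode first, then average.
const filteredRight = (srgbToLinear(black) + srgbToLinear(white)) / 2; // 0.5
```

The two results differ by more than a factor of two at this edge, which is exactly the bright-to-dark aliasing described above.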
-- Steve From ste...@ Sun Sep 5 08:31:55 2010 From: ste...@ (Steve Baker) Date: Sun, 05 Sep 2010 10:31:55 -0500 Subject: [Public WebGL] Gamma correction and texImage2D/texSubImage2D In-Reply-To: <00FC31AB-675C-4705-8780-AD7D210532A8@apple.com> References: <6C663C21-FEE2-4CD6-852A-3EE4C28457B1@apple.com> <4C828B6A.3040501@sjbaker.org> <4C82F628.1030906@sjbaker.org> <899F97BE-66B9-4B7B-9142-8B0F26A802EA@adam.com.au> <00FC31AB-675C-4705-8780-AD7D210532A8@apple.com> Message-ID: <4C83B7EB.7000301@sjbaker.org> Chris Marrin wrote: > On Sep 5, 2010, at 4:51 AM, stephen white wrote: > >> On 05/09/2010, at 8:26 PM, Thatcher Ulrich wrote: >> >>> It seems to me there are two unresolved questions for WebGL 1.0 >>> >> I'm not sure that my point was understood, so I'll try again... >> >> When a browser composites an image onto the page, that image is the entire block of pixels. The browser can have simple rules because of that. >> >> When WebGL draws to its canvas, it is using a number of images within the block of pixels. Therefore these images may each have different colour spaces and/or gammas. >> >> The additional complexity comes from multiple images used together, which is not a problem that simply displaying an image had to worry about. >> >> As far as I can work this out, Steve Baker's suggestion of an option to reverse-map back to a linear colour space is the only mathematically valid option. >> >> In practical terms, this would work out to a "WEBGL_MAP_TO_LINEAR" loading time option that doesn't do any work for linear PNGs but does fix up colour space JPGs. >> >> If the option isn't set, then the JPGs are not touched and it's up to the programmer to do what they think is best. 
>> > > It's true that native OpenGL apps can make any decisions they want about the color space of incoming images, the represented color space of the drawing buffer, and the conversions done to the drawing buffer on display. But we have to be well defined on all aspects of color space management. That doesn't mean we have to be "correct" (if there is such a thing). We just have to be consistent. > We have to be both consistent AND correct if we want pretty graphics...which (forgive me if I'm wrong!) is the ultimate goal here! 3D graphics is unforgiving of mathematical laxity. Most of the modern advances in graphics technology are in the removal of mathematical kludges made necessary by the primitive hardware that was around when OpenGL and D3D were first specified. (Consider, for example, doing per-pixel Phong lighting rather than per-vertex Gouraud shading - or doing HDR.) Sadly, we can't mandate correctness because some of WebGL's low-end client hardware is still pretty primitive - but we can (and I would argue "must") write a specification that doesn't mandate incorrectness. We hope that this specification will still be in active use 10 or 20 years from now...the core principles have to be mathematically sound or we'll be patching and kludging stuff well past my retirement date! > Given the text in http://www.opengl.org/registry/specs/EXT/texture_sRGB.txt, it seems as though (at least most recently) OpenGL is assuming today's images are coming in linear. I think that's a bad assumption since, as Ollie points out, WebKit assumes JPEG images without a color profile to be sRGB. But I'm ready to accept that we should break with the 2D Canvas' tradition of representing the canvas as sRGB and maintain the WebGL canvas as linear. > > I'd like to get the discussion more focused. We have to decide several things: > > 1) What color space is the WebGL canvas? > > As I said, I agree that the drawing buffer should be considered to be linear. 
This implies that the drawing buffer will have to be color corrected when composited with the rest of the HTML page. This also implies that, when using a WebGL canvas as the source image for a 2D Canvas drawImage() operation, it has to be color corrected into sRGB space, since that's what 2D Canvas expects. > I don't see where the canvas spec says that canvasses are sRGB. I see that it says that you have to convert your canvas to "device color space" (which is likely to be sRGB) on output - but that's exactly what we're proposing here. So as far as I can tell, doing what you (and I) want here - isn't "breaking with the 2D canvas' tradition" - it's "following the canvas spec to the letter". Despite the sRGB extension - the GPU is still a linear-color-space engine. I boldly predict that it'll never be otherwise. GLSL has operations like 'lerp' and would need some god-awful piece of mathematics to replace it in order to do the equivalent thing in sRGB. There is no arguing that point - there are not, nor have there ever been, sRGB color-space GPUs. The best you could argue would be "The errors in kludging sRGB through a linear hardware engine are acceptable" - and that's something that I'd fight tooth and nail because it's demonstrably not true. > 2) What is the default incoming color space of images as textures, and what are the options to change that? > > I believe the choices are: unchanged (I will call that raw), linear and sRGB. I think we need to support raw and linear. If and when we support EXT_texture_sRGB, we will get the ability to support incoming sRGB images. > Yes. > This decision really concerns me. WebKit considers images without a color profile to be sRGB, but OpenGL (by statements in the EXT_texture_sRGB spec) assumes images without a profile to be linear. Which assumption do we make? Do we color correct all incoming images into linear space (assuming they have all been color corrected into sRGB already)? 
Or do we bring in images without a color profile unchanged, and only correct images that have a defined color profile? If we do the former, we will have different visual results than the corresponding native OpenGL program. If we do the latter, we will get results that are inconsistent with the rest of the HTML page. > Why don't we just let the application writer choose? When you load an image into a texture, you say "RAW" (don't mess with my texels!) or "AUTOMATIC" (please convert my texels to linear-color-space (if necessary) according to whatever the file header says)...and in some future release "sRGB" (please convert my texels to sRGB color space (if necessary) according to whatever the file header says because I'm going to use the sRGB extension). I believe that all of the image file formats that we support have some kind of gamma value. We have all of the bases covered that way. > 3) What is the color space of pixels coming out of the drawing buffer via toDataURL() and readPixels()? > > For consistency, I think toDataURL() should be color corrected into sRGB. But it seems more reasonable to leave the results of readPixels() in the same linear color space as the drawing buffer. This would have to be clearly spelled out. > I think we should follow the same rules as for file loading: "RAW" - don't convert please. "AUTOMATIC" - please convert as necessary...let the application decide. My reasoning is that the specification should allow applications to be mathematically correct - which implies a color space conversion. However, if you were to do something like this on a cellphone with 5/6/5 textures and frame buffer - then the results would be horrible...also, there are situations where this operation might be in a performance-critical pathway. So the pragmatic option is to say "don't mess with my pixels". * Since I strongly disagree with forcing mathematical incorrectness into the specification, we need to be able to convert between color spaces. 
* Since I need to be able to kludge around ugly roundoff problems, to fix performance issues, and to do "non-traditional" rendering (like doing physics and collision detection in the GPU!), I need to be able to turn that conversion off in my application. A good specification 'allows' elegant, automatic correctness - but doesn't mandate it where an individual application has performance or quality needs that might suffer as a result. -- Steve ----------------------------------------------------------- From tu...@ Sun Sep 5 08:46:43 2010 From: tu...@ (Thatcher Ulrich) Date: Sun, 5 Sep 2010 11:46:43 -0400 Subject: [Public WebGL] Gamma correction and texImage2D/texSubImage2D In-Reply-To: <899F97BE-66B9-4B7B-9142-8B0F26A802EA@adam.com.au> References: <6C663C21-FEE2-4CD6-852A-3EE4C28457B1@apple.com> <4C828B6A.3040501@sjbaker.org> <4C82F628.1030906@sjbaker.org> <899F97BE-66B9-4B7B-9142-8B0F26A802EA@adam.com.au> Message-ID: On Sun, Sep 5, 2010 at 7:51 AM, stephen white wrote: > On 05/09/2010, at 8:26 PM, Thatcher Ulrich wrote: >> It seems to me there are two unresolved questions for WebGL 1.0 > > I'm not sure that my point was understood, so I'll try again... > > When a browser composites an image onto the page, that image is the entire block of pixels. The browser can have simple rules because of that. > > When WebGL draws to its canvas, it is using a number of images within the block of pixels. Therefore the number of different images may have different colour spaces and/or gammas. > > The additional complexity is coming from multiple images used together, which is not a problem that simply displaying an image had to worry about. > > As far as I can work this out, Steve Baker's suggestion of an option to reverse-map back to a linear colour space is the only mathematically valid option. 
> > In practical terms, this would work out to a "WEBGL_MAP_TO_LINEAR" loading time option that doesn't do any work for linear PNGs but does fix up colour space JPGs. Just a point of clarification/correction: PNG has the ability to specify linear, but the vast majority of PNGs in the known universe (i.e. nearly every PNG on the web) are sRGB or equivalent. Linear with 8-bit color depth is not a good interchange format; that's why virtually nobody uses it. -T > > If the option isn't set, then the JPGs are not touched and it's up to the programmer to do what they think is best. > > -- > steve...@ > > > ----------------------------------------------------------- From ste...@ Sun Sep 5 10:58:12 2010 From: ste...@ (Steve Baker) Date: Sun, 05 Sep 2010 12:58:12 -0500 Subject: [Public WebGL] Gamma correction and texImage2D/texSubImage2D In-Reply-To: References: <6C663C21-FEE2-4CD6-852A-3EE4C28457B1@apple.com> <4C828B6A.3040501@sjbaker.org> <4C82F628.1030906@sjbaker.org> <899F97BE-66B9-4B7B-9142-8B0F26A802EA@adam.com.au> Message-ID: <4C83DA34.10208@sjbaker.org> Thatcher Ulrich wrote: > Just a point of clarification/correction: PNG has the ability to > specify linear, but the vast majority of PNGs in the known universe > (i.e. nearly every PNG on the web) are sRGB or equivalent. > Yes - that's absolutely true - but you have to realize that this is because they aren't being used for 3D rendering. That's about to change. > Linear with 8-bit color depth is not a good interchange format; that's > why virtually nobody uses it. > Nobody ON THE WEB uses it. 
In computer games and simulation it's what virtually everybody uses. So why is that? * When you need to take a photo and paste it onto a screen - you store it in sRGB space so you can splat it onto the screen with no intervening math whatever. It's an efficient use of limited network bandwidth and CPU horsepower. Everyone wins. That's also why JPEG is so popular. * When you take an image and push it through a 3D renderer that's doing a TON of hideously expensive math - you store it in the color space best suited to doing heavy duty math using cheap hardware (which is linear) and you pay the price to convert the results into sRGB (to "gamma correct" it) at the very last possible moment. Hardly any game or simulation people use JPEG for that exact same reason. We're about to bring those two worlds smashing together...and that's why there is an issue here. -- Steve ----------------------------------------------------------- From tu...@ Sun Sep 5 13:05:49 2010 From: tu...@ (Thatcher Ulrich) Date: Sun, 5 Sep 2010 16:05:49 -0400 Subject: [Public WebGL] Gamma correction and texImage2D/texSubImage2D In-Reply-To: <4C83AD38.5050404@sjbaker.org> References: <6C663C21-FEE2-4CD6-852A-3EE4C28457B1@apple.com> <4C828B6A.3040501@sjbaker.org> <4C82F628.1030906@sjbaker.org> <4C83AD38.5050404@sjbaker.org> Message-ID: On Sun, Sep 5, 2010 at 10:46 AM, Steve Baker wrote: > Thatcher Ulrich wrote: >> On Sat, Sep 4, 2010 at 9:45 PM, Steve Baker wrote: >> >>> I agree that there are undoubtedly going to be issues with roundoff >>> error and precision. In the short term, I'll do what I've always done - >>> instructed my artists to paint textures in 'linear' color space and to >>> have their monitors gamma color-calibrated every six months to ensure >>> that they're seeing those textures optimally. 
>>> >> >> What format & depth are your artists saving their work in? And then >> what texture format(s) do you convert to? How are you planning to get >> this data into WebGL? >> > Well - I have to answer this two ways - because I'm doing two jobs. > > * In my (paying) day job - I am the graphics lead for Total Immersion > (www.totimm.com) - but I saw the same practices when I worked for Midway > Games and in many games & simulation companies before that. I'm writing > D3D (yuk!) graphics engines for 'serious games' for things like training > firefighters and the pilots of unmanned drone aircraft and such. The > artists use 8/8/8 or 8/8/8/8 PNG or TIFF (some GIS terrain tools use it) > with a gamma of 1.0 (ie, linear color space) and we convert to DDS for > loading into the game. You really store with gamma=1.0 8/8/8/8? For diffuse textures, right? I'm surprised by that. Don't you find that you get objectionable banding in darker areas of images? I sanity checked with some industry friends, and the consensus so far is sRGB for diffuse maps. The equivalents of EXT_texture_sRGB and EXT_framebuffer_sRGB are used to get proper sampling. There are varying opinions on how to encode normal maps and specular maps, but some advocate linear. Environment and light maps may benefit from higher dynamic range so there are a variety of things used: RGBE (exponential), RGBM (linear with a separate coefficient in A), two maps combined, other stuff I've never heard of, etc. This is corroborated by the GDC slides from John Hable (Naughty Dog, EA) concerning the Uncharted 2 lighting. 
http://www.slideshare.net/ozlael/hable-john-uncharted2-hdr-lighting There are a lot of slides there, but note especially: * slide 26, showing egregious banding artifacts of using 8-bit linear color * slide 30, showing the benefit of linear lighting * slide 36, showing that you can explicitly linearize sRGB in your shader * slide 37, saying that it's better if you get the hardware to do it with D3DSAMP_SRGBTEXTURE and D3DRS_SRGBWRITEENABLE (the equivalents of EXT_texture_sRGB and EXT_framebuffer_sRGB) * slides 46-49 where he talks about which gamma to use for which kinds of texture maps. (sRGB for Diffuse, linear for Normal Maps, linear or sRGB for Specular and Ambient Occlusion) Note that none of this implies that any manipulation of the color components should be done by the graphics API in the PixelStore pipeline. It is assumed that devs will feed the game engine the raw data that the renderer is expecting to consume. So I stand by my position that WebGL should, by default, not mess with anybody's color components. If you want to add some explicit parameters to do some manipulation, then feel free, but please don't call it "correction" and please don't make it a default. I think it's an unnecessary distraction for this group, and likely to lead some people astray, when what would be much more helpful is getting support for EXT_texture_sRGB and EXT_framebuffer_sRGB sometime soon after WebGL 1.0 ships. Also, I think it would be a disaster for WebGL to spec any framebuffer format other than sRGB. Linear 8/8/8/8 in particular is IMO harmful as a final output buffer, because it will cause banding in dark areas. Anybody who tries to implement a photo gallery app in WebGL will pull their hair out. In the future when browsers are fancier it would be nice to have more options, but sRGB is the right way to go for now. This course has the benefit of least surprise for existing OpenGL programmers, and least deviation for WebGL standardizers and implementors. 
If I'm not mistaken, this would allow Steve to keep his existing pipeline, though he would need a linear-to-sRGB operation at the very end of his pipeline. (Which he needs now anyway on non-WebGL platforms, so it's not an extra hardship.) As less important issues, but perhaps useful for standardization, it would be nice to get the browsers on the same page w/r/t what to do with PNGs that specify gamma. Also, it would be nice to provide DOM accessors to things like the gamma (if any) that a PNG image specifies, the bit depth and channels that are provided by a PNG, etc. (Similar issues came up during the discussion that led to adding an explicit internalformat parameter to texImage2D.) And someday it would be nice to have a higher-precision texture pipeline -- ingesting 16-bit PNGs and putting them in floating point or 16/16/16/16 textures without losing information. -T > DDS supports DXT1/3/5 compression and > no-compression which gives us the choice to use lossy compression where > space is critical. We also have the advantage of stating 'minimum > hardware specs' which means we never have to deal with cellphones or > hardware that doesn't support full floating point textures and shaders. > One day we'd LOVE to be able to use WebGL to support this stuff - but > we're far from there yet. Part of the reason I'm following this > mailing list is that I want to be sure that WebGL could ultimately > support serious simulation engines and run AAA quality video games. > > * In my spare time (hahahah!) I'm starting with a small, dedicated (and > as yet unpaid) team to put together an experimental set of WebGL-based > games to see whether we can make money using adverts and T-shirt sales > and in-game revenue (pay us a dollar and get that neat weapon you always > wanted!). We currently use 8/8/8 and 8/8/8/8 PNG with a gamma of 1.0 > (ie linear color space) and do no further compression. 
Having looked carefully at ETC1, and kicked around some ideas with the guy at Ericsson who invented it, it's clear that it may prove useful for some kinds of data - but I'm still a little skeptical about it in general. There may > be some super-devious shaderly tricks to make it do things it was never > designed to do...but that's still a matter of investigation for me. > > In the latter case, my biggest concern is with platforms that may not > support 8/8/8 or 8/8/8/8 formats internally and which may crunch them > down to 5/6/5 or 5/5/5/1 or 4/4/4/4 formats internally. Since I use > texture in 'non-traditional' ways - I'm having to get creative about > only using the high order 4 bits in some situations. It's painful. So > my best guess right now is that I'll be using the built-in file loaders > with linear color space PNG. >> There are some relevant constraints on WebGL: >> >> * browsers (currently) work with 8-bit sRGB_Alpha color buffers, so >> that's the format that WebGL output ends up in. I don't think WebGL >> 1.0 can realistically spec anything else for output. (In the future, >> perhaps browsers will be able to handle linear color spaces at higher >> bit depths, but I don't know if anybody is seriously considering that >> yet, so I doubt there's any point in spec'ing anything in WebGL 1.0.) >> > That's great. >> * WebGL has texImage2D calls that take raw buffers of data. Currently >> (default) behavior passes this data straight to the texture formats, >> so any automatic gamma treatment would be a change. >> > Yes - that is true. I think we need the specification to say that you > CAN automatically convert color spaces when you do this - but not that > you MUST. 
That's necessary, not least because with low end cellphone > hardware, repeatedly converting back and forth between color spaces will > introduce so much rounding error that it'll be unusable - so > applications will want to disable that (even if it's "mathematically > required") in order to get the lesser of two evils. > > However, IMHO, the specification needs to support (at least in theory) > mathematical correctness and must never impose incorrectness because > by the time this specification becomes obsolete, we'll probably have 12 > or 16 bit integer and 16 or 32 bit floating point per component (I > already use 32/32/32/32 for HDR lighting in some places in my "day job" > graphics engine) and doing this conversion will ultimately be entirely > painless. >> * browsers can load image formats in PNG and JPEG, which are most >> typically 8-bit sRGB. WebGL behavior when using these formats is >> definitely worth spec'ing. >> > Yes. >> * PNG has the ability to specify a per image gamma value (the gAMA >> thing referenced earlier). Browsers appear to handle this >> differently. See this reference page, at "Images with gamma chunks": >> http://www.libpng.org/pub/png/pngsuite.html On the Mac I'm using >> right now, Chrome and Safari do not do gamma correction, while Firefox >> does. You can also clearly see the quantization errors in the Firefox >> images with the lower gamma values. The Chrome and Safari behavior is >> (arguably) a bug. >> > Yes, I agree. >> * PNG has the ability to store 16-bit color depth. However, my >> understanding is that current browsers take all input images >> (including PNG images with 16-bit color depth) and convert them to >> sRGB_Alpha internally, before WebGL has a chance to see the data. >> Also, the WebGL spec does not appear to have any texture formats that >> have more than 8 bits per color component. This would be a great >> thing to improve, post WebGL 1.0, since hi-fi WebGL apps could make >> good use of it. 
>> > Absolutely. Supporting (in particular) floating point textures would be > a big win...but there are many desktop/laptop chipsets that can't do > that - and I fear it'll be a good few years before cellphones can do > that. But this is doable as an extension. The issues of color space > correctness are the underpinnings of the specification and should be > handled rigorously from the get-go because inserting them later would be > tortuous and disruptive. >> It seems to me there are two unresolved questions for WebGL 1.0 >> >> 1) Should WebGL attempt to nail down how browsers are supposed to >> handle PNGs with a non-default gAMA value? Viable options here are: >> >> a) leave it up to the browser (status quo, but behavior may differ >> among browsers; in practice apps will have to supply sRGB data and do >> any additional conversions themselves). >> > Not tolerable. Because textures are very often used for storing things > other than images and writing our own texture loaders in JavaScript is > ridiculous - doing any kind of automatic color space conversion without > a way to turn it off would make WebGL useless for all but the simplest > kinds of 3D graphics. Worse still, because JavaScript is slow, we have > to push more work into the GPU than on a traditional OpenGL or D3D > platform - that INCREASES the number of situations where we use texture > "non-traditionally" in order to get good performance. If this were the > choice - I'd stop work on my WebGL games. >> b) demand conversion to sRGB_Alpha based on PNG metadata (i.e. >> converge on current Firefox behavior, however non-sRGB behavior will >> be a corner case for browsers and smart WebGL developers may opt to >> always supply sRGB data and do any conversions themselves) >> >> c) demand passing raw data straight through. 16-bit components >> would be rounded or truncated to 8-bit. (i.e. 
converge on current >> WebKit behavior, similar caveats as option b) >> > d) Have the WebGL texture file loader convert whatever color space the > file is in - to a uniform linear color space. With the option to turn > that conversion off for files that contain non-traditional (non-image) > data - and in cases where the roundoff error inherent in the conversion > is not acceptable - or when the application knows that the source data > is in linear color space regardless of what the file header happens to say. > > i.e. Support both and let the application decide. >> 2) Should WebGL add a PixelStore option that does some kind of gamma >> conversion? (Where the thread started.) IMO the status quo (do >> nothing) is pretty much fine. Apps that want a specific >> interpretation of their image data can either pre-process it so the >> raw data matches what they want in texture RAM, or else do custom >> processing via GPU with the existing facilities. >> > Since we've now established (I hope!) that the canvas spec requires that > the linear color space WebGL canvas must be converted into 'device color > space' at some point before it hits the screen (which for us probably > means "it's gamma corrected in the compositor") - there is absolutely no > reason to ever want to do this. Automatically converting linear-space > textures into gamma space then converting the rendering results into > gamma space would guarantee an ugly mess on the output. That's a waste > of CPU time, implementation effort and it's mathematically indefensible. >>>> I believe EXT_texture_sRGB is designed to make it easier to be "linear >>>> correct" under these circumstances. It basically allows you to tell >>>> OpenGL that your 8-bit texture data is in sRGB format, and that you >>>> want OpenGL to do an 8-bit-sRGB to higher-precision-linear-RGB conversion >>>> when sampling the data from a shader. 
Your shader will operate on >>>> linear data (in floating point format), and the frame buffer is still >>>> your shader's problem >>>> Yes - but it's an extension that we can't rely on always being >>>> implemented (I doubt ANGLE could emulate it either). >>>> >> >> It can be trivially implemented on any hardware that has an internal >> texture format with at least 12 bits per component; just convert the >> data to linear and store in the higher-depth format. The practical >> problem is that it wastes texture RAM, hence the preference to have >> lookup tables in the GPU. >> > But we're potentially operating with hardware that may not even support > 8 bits per component, let alone 12! So it certainly cannot be "trivially > implemented" on all WebGL clients. Hence we certainly can't rely on it > - and most certainly we can't write our specification based upon it! > > Worse still, implementing it simply with "lookup tables in the GPU" > (which, I'll grant, the extension spec allows) means that the sRGB > texels are linearly interpolated for GL_LINEAR and > GL_LINEAR_MIPMAP_LINEAR textures - and that's wrong! When you minify a > texture, the hardware is (in effect) calculating the contributions of > four texels from each of two MIP levels. In linear color space, that's > a simple lerp operation that's super-cheap to do in hardware - but in > sRGB, doing that gives too much weight to the bright texels and not > enough to the dark ones. The consequence is that the texture will > alias along bright-to-dark transitions. > > I can't imagine many GPU manufacturers building proper sRGB > interpolators into the highest-bandwidth part of their engines - so I'll > be surprised if many of them support this extension "the right > way"...which makes it all but useless to people who want nice-looking > graphics. 
> > -- Steve > > ----------------------------------------------------------- From bck...@ Sun Sep 5 13:39:13 2010 From: bck...@ (Brendan Kenny) Date: Sun, 5 Sep 2010 15:39:13 -0500 Subject: [Public WebGL] Gamma correction and texImage2D/texSubImage2D In-Reply-To: References: <6C663C21-FEE2-4CD6-852A-3EE4C28457B1@apple.com> <4C828B6A.3040501@sjbaker.org> <4C82F628.1030906@sjbaker.org> <4C83AD38.5050404@sjbaker.org> Message-ID: On Sun, Sep 5, 2010 at 3:05 PM, Thatcher Ulrich wrote: > This is corroborated by the GDC slides from John Hable (Naughty Dog, > EA) concerning the Uncharted 2 lighting. > http://www.slideshare.net/ozlael/hable-john-uncharted2-hdr-lighting > > There are a lot of slides there, but note especially: > > * slide 26, showing egregious banding artifacts of using 8-bit linear color > > * slide 30, showing the benefit of linear lighting > > * slide 36, showing that you can explicitly linearize sRGB in your shader > > * slide 37, saying that it's better if you get the hardware to do it > with D3DSAMP_SRGBTEXTURE and D3DRS_SRGBWRITEENABLE (the equivalents of > EXT_texture_sRGB and EXT_framebuffer_sRGB) > > * slides 46-49 where he talks about which gamma to use for which kinds > of texture maps. (sRGB for Diffuse, linear for Normal Maps, linear or > sRGB for Specular and Ambient Occlusion) > > Note that none of this implies that any manipulation of the color > components should be done by the graphics API in the PixelStore > pipeline. It is assumed that devs will feed the game engine the raw > data that the renderer is expecting to consume. 
Maybe I'm missing something, but looking at those slides it seems that he is suggesting that diffuse and (maybe) specular and ambient occlusion maps should be stored with gamma correction so that there is more data in the dark end (and so they're more easily edited by artists), but that when images are stored in "gamma-space" that they should *always* be sampled non-linearly (see slide 46). That could be done manually in the shader, but is also the equivalent, since all the images are being loaded by the browser, of converting them on import. ----------------------------------------------------------- From bck...@ Sun Sep 5 13:48:11 2010 From: bck...@ (Brendan Kenny) Date: Sun, 5 Sep 2010 15:48:11 -0500 Subject: [Public WebGL] Gamma correction and texImage2D/texSubImage2D In-Reply-To: References: <6C663C21-FEE2-4CD6-852A-3EE4C28457B1@apple.com> <4C828B6A.3040501@sjbaker.org> <4C82F628.1030906@sjbaker.org> <4C83AD38.5050404@sjbaker.org> Message-ID: On Sun, Sep 5, 2010 at 3:39 PM, Brendan Kenny wrote: > On Sun, Sep 5, 2010 at 3:05 PM, Thatcher Ulrich wrote: >> This is corroborated by the GDC slides from John Hable (Naughty Dog, >> EA) concerning the Uncharted 2 lighting. 
>> http://www.slideshare.net/ozlael/hable-john-uncharted2-hdr-lighting >> >> There are a lot of slides there, but note especially: >> >> * slide 26, showing egregious banding artifacts of using 8-bit linear color >> >> * slide 30, showing the benefit of linear lighting >> >> * slide 36, showing that you can explicitly linearize sRGB in your shader >> >> * slide 37, saying that it's better if you get the hardware to do it >> with D3DSAMP_SRGBTEXTURE and D3DRS_SRGBWRITEENABLE (the equivalents of >> EXT_texture_sRGB and EXT_framebuffer_sRGB) >> >> * slides 46-49 where he talks about which gamma to use for which kinds >> of texture maps. (sRGB for Diffuse, linear for Normal Maps, linear or >> sRGB for Specular and Ambient Occlusion) >> >> Note that none of this implies that any manipulation of the color >> components should be done by the graphics API in the PixelStore >> pipeline. It is assumed that devs will feed the game engine the raw >> data that the renderer is expecting to consume. > > Maybe I'm missing something, but looking at those slides it seems that > he is suggesting that diffuse and (maybe) specular and ambient > occlusion maps should be stored with gamma correction so that there is > more data in the dark end (and so they're more easily edited by > artists), but that when images are stored in "gamma-space" that they > should *always* be sampled non-linearly (see slide 46). > > That could be done manually in the shader, but is also the equivalent, > since all the images are being loaded by the browser, of converting > them on import. Sorry, it is not equivalent, in fact. The problem is that sampling does always (always) need to be done non-linearly, but when done in a shader the result is (usually, not sure about the full breadth of hardware here) at least 16 bit floating point. If the image is just converted to a linear 8-bit buffer, precision loss and banding on the dark end will of course occur, as Thatcher noted. 
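[Editor's note: the precision loss described above can be quantified. This sketch, not part of the original thread, counts how many 8-bit codes an sRGB-encoded buffer devotes to the darkest quarter of the tonal range versus an 8-bit linear buffer, assuming the standard IEC 61966-2-1 sRGB transfer function.]

```javascript
// sRGB decode (IEC 61966-2-1): sRGB code in [0,1] -> linear light.
function srgbToLinear(c) {
  return c <= 0.04045 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
}

// Tones at or below sRGB code 0.25 (the darkest quarter of the
// encoded range) span linear-light values from 0 to srgbToLinear(0.25).
const linearCutoff = srgbToLinear(0.25); // about 0.051

// 8-bit sRGB storage gives these tones codes 0..63 -> 64 distinct levels.
const srgbLevels = Math.floor(0.25 * 255) + 1;

// 8-bit *linear* storage squeezes them into codes 0..floor(cutoff*255).
const linearLevels = Math.floor(linearCutoff * 255) + 1;

console.log(srgbLevels, linearLevels); // 64 13
```

Roughly 64 codes versus 13: that five-fold loss of resolution in the shadows is the dark-end banding both Thatcher and Brendan refer to.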
From cal...@ Mon Sep 6 00:50:19 2010 From: cal...@ (Mark Callow) Date: Mon, 06 Sep 2010 16:50:19 +0900 Subject: [Public WebGL] Gamma correction and texImage2D/texSubImage2D In-Reply-To: <75999fb813488db76c8bb94325e4c07f.squirrel@webmail.sjbaker.org> References: <4C8070CC.9030702@sjbaker.org> <4C808F24.6070805@hicorp.co.jp> <75999fb813488db76c8bb94325e4c07f.squirrel@webmail.sjbaker.org> Message-ID: <4C849D3B.8020607@hicorp.co.jp> I'm still very focused on native GL applications. Since GL does not have blend (compositing mode) shaders, an extra pass is needed to "fix gamma using a shader in the compositing stage." This will cause a performance drop compared to rendering directly to a window surface. But as you point out, WebGL is already performing this extra pass to composite the canvas with the other page contents and it makes a great deal of sense to perform color space conversion during that pass. Regards -Mark On 03/09/2010 22:15, steve...@ wrote: >> On 03/09/2010 12:51, Steve Baker wrote: >>> ... >>> 3) You are doing no lighting/blending/mipmapping/fog/etc and (for some >>> reason) you have also chosen not to do gamma correction at the end. In >>> that case and ONLY in that case, you should gamma-correct your textures >>> on input. >>> >>> I maintain that very few WebGL applications will do (3). >> I think that the number of mobile devices which have "gamma correctors" >> is approaching 0 and, with the exception of doing the "correction" in a >> shader (which will screw up blending), control of any such "correctors" >> is outside OpenGL. So I suspect the number of applications doing 3 is >> quite large. > I strongly disagree with every single thing you just said!
> > 1) EVERY mobile device that can support WebGL is capable of rendering the > final image to the screen (the "compositing" stage) by drawing a textured > quadrilateral using a gamma-correcting shader. There is no need for > custom gamma-correcting CLUT hardware anymore...that's what we have > shaders for. > > 2) If you decided to put the gamma correction at the end of the shader(s) > that you use for rendering your 3D scene (which I most certainly don't > advocate!), it would indeed "screw up blending" - but less so than > applying the gamma to the textures before the shader runs. Gamma is a > non-linear effect and as such has to come after all of the linear effects > in the rendering pipeline. > > 3) You say that control of external gamma correctors is outside of OpenGL > - that's true but I didn't suggest that we have to use an external gamma > corrector. I specifically said that we can fix gamma using a shader in > the compositing stage. > > 4) You can't say how many applications are "doing 3" because there are (by > definition) no finished WebGL applications yet (because the specification > isn't 100% finished). The only applications that might fall into class > (3) are the ones that don't do ANY > lighting/anti-aliasing/MIPmapping/texture-magnification/fogging/alpha-blending > or translucent-canvas compositing. Basically, every single 3D application > is class (1) or (2) - and preferably class (2) because (1) is an ugly > kludge. True class (3) applications should probably be using > directly. > > If the specification were to say that the compositor does gamma correction > by default (possibly with the option to turn that off for people who don't > want it for some very specific reason) then everyone should be happy and > we do things correctly without any nasty kludges hardwired into the > system. > > -- Steve
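Steve's point (1) — that compositing-stage gamma correction needs no special hardware — can be sketched as an ordinary fragment shader applied while drawing the canvas texture as a full-screen quad. This is a hypothetical illustration, not code from any browser: the uniform and varying names are made up, and a fixed 1/2.2 power stands in for the exact sRGB encoding curve.

```javascript
// Hypothetical compositing-pass fragment shader (GLSL ES, as a JS string):
// sample the linear-space canvas texture and gamma-encode on the way out.
const compositeFragmentShader = `
  precision mediump float;
  uniform sampler2D u_canvasTex;  // linear-space rendering of the WebGL canvas
  varying vec2 v_texCoord;
  void main() {
    vec4 c = texture2D(u_canvasTex, v_texCoord);
    // Gamma-encode RGB only; alpha stays linear for page compositing.
    gl_FragColor = vec4(pow(c.rgb, vec3(1.0 / 2.2)), c.a);
  }
`;
```

Because this runs once, after all blending and filtering, the in-scene math stays linear, which is Steve's class (2) pipeline.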
From cal...@ Mon Sep 6 01:42:11 2010 From: cal...@ (Mark Callow) Date: Mon, 06 Sep 2010 17:42:11 +0900 Subject: [Public WebGL] Gamma correction and texImage2D/texSubImage2D In-Reply-To: <6C663C21-FEE2-4CD6-852A-3EE4C28457B1@apple.com> References: <6C663C21-FEE2-4CD6-852A-3EE4C28457B1@apple.com> Message-ID: <4C84A963.5090908@hicorp.co.jp> I have several points which I am grouping in this single message rather than sending a flood of new messages. Chris Marrin wrote: > > AFAIK, gamma correction is done to make images look right on the > selected display. It has nothing to do with data in the source image. > I believe some images might have color correction information in them, > but that's different from gamma correction. > The necessary correction most definitely has something to do with the data in the source image. It is dependent on that, the display and the viewing conditions. Judging from your later posts, I think you have already realized this. Ken Russell wrote: > I see nothing in the JPEGImageDecoder related to gamma. Is anything > needed for this file format? I suspect people will not use JPEGs for > anything they expect to be passed through verbatim to WebGL, such as > encoding non-color information in the color channels of a texture. There is no such thing as the JPEG file format. There are two file formats in common use that store JPEG compressed images: JFIF and EXIF. The JFIF spec. does not include color space information. However it does provide application tags that can be used to store this information. One example of using an application tag is EXIF. EXIF, the output format of the majority of digital still cameras, uses an application tag and in it writes a lot of metadata about the image including the color space information.
Virtually all cameras include the color space information when writing this tag. If JPEGImageDecoder is not doing anything related to gamma, it is incorrect. Just like the PNG case, it should be reading the EXIF tags if present; otherwise assuming a gamma of 2.2 is reasonable. Chris Marrin wrote: > All that is a pretty clear indication that the pixels in the canvas are expected to be in the sRGB color space and when they are composited they are transformed into the display's color space. An author who really cares, can render textures into the WebGL canvas knowing the image is in the sRGB space and that the final image in the canvas should be in the sRGB space, and apply the appropriate factors to make that so. Since we don't have blend shaders the only way to do this correctly is to create another renderbuffer and do another pass over the data. But since WebGL is already using a renderbuffer to composite the canvas with the page, the only approach that makes sense performance-wise is for the browser to do the conversion while compositing the page. So the canvas needs to be in a physically linear space like the ICC profile connection space. Steve Baker wrote: > * The PNG file format stores things in linear color space. If you plan > to display them on a NON gamma corrected medium - then you need to apply > gamma to it...which (I presume) is what that snippet of code that you > presented actually does. and > - no need to convert PNGs because they are already linear. This is incorrect. PNG provides gAMA, cHRM, sRGB and iCCP metadata chunks to allow the encoder to include information about the color space of the image samples. In the absence of any of these chunks in the file, the spec says When the incoming image has unknown gamma (gAMA, sRGB, and iCCP all absent), choose a likely default gamma value, but allow the user to select a new one if the result proves too dark or too light.
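For reference, the gAMA chunk Mark mentions stores the encoding gamma as a 32-bit big-endian integer equal to gamma × 100000, so an sRGB-like file carries 45455 (i.e. 1/2.2). A minimal sketch of decoding that value from the chunk's 4 data bytes (chunk walking and CRC checking are omitted):

```javascript
// Decode a PNG gAMA chunk's 4-byte payload into the encoding gamma.
// Per the PNG spec, the stored value is gamma * 100000, big-endian.
function decodeGama(bytes) {
  const raw = (bytes[0] << 24) | (bytes[1] << 16) | (bytes[2] << 8) | bytes[3];
  return (raw >>> 0) / 100000; // >>> 0 forces an unsigned interpretation
}

// 45455 = 0x0000B18F, the value typical sRGB-ish PNG encoders write.
console.log(decodeGama([0x00, 0x00, 0xB1, 0x8F])); // → 0.45455
```

A decoder that finds no gAMA, sRGB, or iCCP chunk is in exactly the "choose a likely default" situation the spec text above describes.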
The default gamma can depend on other knowledge about the image, like whether it came from the Internet or from the local system. Nowhere does it suggest that a likely default value is 1.0 (linear). If any of the above chunks do exist, the decoder is supposed to use them to display the image correctly. Steve Baker wrote: > I think if you reverse-gamma JPEG files and leave everything else alone, > you'll be OK. No. See above. And some final notes... The OpenGL sRGB extensions are rather misnamed. They only really pay attention to the transfer function (a.k.a. gamma) and ignore the other parts of sRGB such as chromaticities and white & black points. Since OpenGL does not specify a color space, they don't have much choice. When using sRGB textures, GL converts the incoming texture data to a physically linear space. When using sRGB renderbuffers, GL converts the blended & multisampled output to the perceptually-linear space of sRGB. I believe the correct thing to do in WebGL is specify that the canvas color space is the ICC profile connection space. The transfer function of this space is physically linear. All other aspects of the color space are also specified. For the purposes of the computations specified by OpenGL, these don't matter. But for correct conversion from the input space of the images to the output space of the display they are very important. Using the PCS enables the browser to use the relevant ICC profiles for conversion. Regards -Mark
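The "transfer function (a.k.a. gamma)" that the sRGB extensions implement is not a pure power curve: sRGB decoding is linear below a small threshold and a 2.4 power above it. A self-contained sketch of the decode direction, for values in [0, 1]:

```javascript
// sRGB decode (electro-optical transfer function): sRGB-encoded -> linear.
// Piecewise per the sRGB standard: linear segment near black, 2.4 power above.
function srgbToLinear(s) {
  return s <= 0.04045
    ? s / 12.92
    : Math.pow((s + 0.055) / 1.055, 2.4);
}

console.log(srgbToLinear(0.5)); // ≈ 0.214 — mid-gray encodes far above its linear value
```

The linear toe near black is exactly why "sRGB" and "gamma 2.2" are close but not identical, even though the overall curve approximates a 2.2 power.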
From tu...@ Mon Sep 6 02:54:55 2010 From: tu...@ (Thatcher Ulrich) Date: Mon, 6 Sep 2010 05:54:55 -0400 Subject: [Public WebGL] Gamma correction and texImage2D/texSubImage2D In-Reply-To: <4C84A963.5090908@hicorp.co.jp> References: <6C663C21-FEE2-4CD6-852A-3EE4C28457B1@apple.com> <4C84A963.5090908@hicorp.co.jp> Message-ID: I agree with virtually everything you said, except for two important points, see comments below: On Mon, Sep 6, 2010 at 4:42 AM, Mark Callow wrote: > I have several points which I am grouping in this single message rather than > sending a flood of new messages. > > Chris Marrin wrote: > > AFAIK, gamma correction is done to make images look right on the selected > display. It has nothing to do with data in the source image. I believe some > images might have color correction information in them, but that's different > from gamma correction. > > The necessary correction most definitely has something to do with the data in > the source image. It is dependent on that, the display and the viewing > conditions. Judging from your later posts, I think you have already realized > this. > > Ken Russell wrote: > > I see nothing in the JPEGImageDecoder related to gamma. Is anything > needed for this file format? I suspect people will not use JPEGs for > anything they expect to be passed through verbatim to WebGL, such as > encoding non-color information in the color channels of a texture. > > There is no such thing as the JPEG file format. There are two file formats > in common use that store JPEG compressed images: JFIF and EXIF. The JFIF > spec. does not include color space information. However it does provide > application tags that can be used to store this information. One example of > using an application tag is EXIF.
EXIF, the output format of the majority of > digital still cameras, uses an application tag and in it writes a lot of > metadata about the image including the color space information. Virtually all > cameras include the color space information when writing this tag. > > If JPEGImageDecoder is not doing anything related to gamma, it is incorrect. > Just like the PNG case it should be reading the EXIF tags if present > otherwise assuming a gamma of 2.2 is reasonable. > > Chris Marrin wrote: > > All that is a pretty clear indication that the pixels in the canvas are > expected to be in the sRGB color space and when they are composited they are > transformed into the display's color space. An author who really cares, can > render textures into the WebGL canvas knowing the image is in the sRGB space > and that the final image in the canvas should be in the sRGB space, and > apply the appropriate factors to make that so. > > Since we don't have blend shaders the only way to do this correctly is to > create another renderbuffer and do another pass over the data. Actually, an app author can do this directly in their fragment shader. Just do "gl_FragColor = pow(linear_rgb, 1.0 / 2.2);" at the end of the shader. When/if WebGL exposes EXT_framebuffer_sRGB, the hardware can do this more cheaply (and perhaps more exactly) using a lookup table. > But since > WebGL is already using a renderbuffer to composite the canvas with the page, > the only approach that makes sense performance-wise is for the browser to > do the conversion while compositing the page. So the canvas needs to be in a > physically linear space like the ICC profile connection space. I admit I had to look up "ICC profile connection space". I didn't get much clarity out of the color.org site, but Wikipedia says it's based on either CIELAB or CIEXYZ. I'm not sure I've got it right -- are you saying the canvas should be in CIELAB (or CIEXYZ) coordinates? I.e.
our fragment shaders need to write L, a, and b coords (or X, Y, Z coords) instead of r, g, and b? -T > > Steve Baker wrote: > > * The PNG file format stores things in linear color space. If you plan > to display them on a NON gamma corrected medium - then you need to apply > gamma to it...which (I presume) is what that snippet of code that you > presented actually does. > > and > > - no need to convert PNGs because they are already linear. > > This is incorrect. PNG provides gAMA, cHRM, sRGB and iCCP metadata chunks to > allow the encoder to include information about the color space of the image > samples. In the absence of any of these chunks in the file, the spec says > > When the incoming image has unknown gamma (gAMA, sRGB, and iCCP all absent), > choose a likely default gamma value, but allow the user to select a new one > if the result proves too dark or too light. The default gamma can depend on > other knowledge about the image, like whether it came from the Internet or > from the local system. > > Nowhere does it suggest that a likely default value is 1.0 (linear). If any > of the above chunks do exist, the decoder is supposed to use them to display > the image correctly. > > Steve Baker wrote: > > I think if you reverse-gamma JPEG files and leave everything else alone, > you'll be OK. > > No. See above. > > And some final notes... > > The OpenGL sRGB extensions are rather misnamed. They only really pay > attention to the transfer function (a.k.a. gamma) and ignore the other parts > of sRGB such as chromaticities and white & black points. Since OpenGL does > not specify a color space, they don't have much choice. > > When using sRGB textures, GL converts the incoming texture data to a > physically linear space. When using sRGB renderbuffers, GL converts the > blended & multisampled output to the perceptually-linear space of sRGB. > > I believe the correct thing to do in WebGL is specify that the canvas color > space is the ICC profile connection space.
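To make Thatcher's question concrete: linear sRGB relates to CIE XYZ (the basis of the profile connection space) by a fixed 3×3 matrix for the D65 white point, so fragment shaders would not necessarily have to output XYZ themselves — a compositor could apply the matrix. A sketch of the standard linear-sRGB → XYZ conversion, coefficients rounded to four digits:

```javascript
// Convert linear sRGB (D65) to CIE XYZ using the standard sRGB matrix.
function linearSrgbToXyz([r, g, b]) {
  return [
    0.4124 * r + 0.3576 * g + 0.1805 * b, // X
    0.2126 * r + 0.7152 * g + 0.0722 * b, // Y (relative luminance)
    0.0193 * r + 0.1192 * g + 0.9505 * b, // Z
  ];
}

// White (1,1,1) should land on the D65 white point, with Y = 1.
console.log(linearSrgbToXyz([1, 1, 1]));
```

Note the middle row is the familiar luminance weighting (0.2126, 0.7152, 0.0722), which is why Y alone is often used as "linear brightness".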
The transfer function of this > space is physically linear. All other aspects of the color space are also > specified. For the purposes of the computations specified by OpenGL, these > don't matter. But for correct conversion from the input space of the images > to the output space of the display they are very important. Using the PCS > enables the browser to use the relevant ICC profiles for conversion. > > Regards > > -Mark From cma...@ Mon Sep 6 05:34:23 2010 From: cma...@ (Chris Marrin) Date: Mon, 06 Sep 2010 05:34:23 -0700 Subject: [Public WebGL] Gamma correction and texImage2D/texSubImage2D In-Reply-To: <4C84A963.5090908@hicorp.co.jp> References: <6C663C21-FEE2-4CD6-852A-3EE4C28457B1@apple.com> <4C84A963.5090908@hicorp.co.jp> Message-ID: <1D5ED179-ECA1-4AE3-B54F-15F85553A3B5@apple.com> Thanks for all the great comments Mark. More below... On Sep 6, 2010, at 1:42 AM, Mark Callow wrote: >> ...- no need to convert PNGs because they are already linear. > > This is incorrect. PNG provides gAMA, cHRM, sRGB and iCCP metadata chunks to allow the encoder to include information about the color space of the image samples. In the absence of any of these chunks in the file, the spec says > When the incoming image has unknown gamma (gAMA, sRGB, and iCCP all absent), choose a likely default gamma value, but allow the user to select a new one if the result proves too dark or too light. The default gamma can depend on other knowledge about the image, like whether it came from the Internet or from the local system. > Nowhere does it suggest that a likely default value is 1.0 (linear). If any of the above chunks do exist, the decoder is supposed to use them to display the image correctly.
But we need to make an assumption if the incoming PNG image omits any color space information. I think it is out of scope to add the ability to let the author choose the color space for these images as they are read in with texImage2D. So do we choose sRGB or linear? > > Steve Baker wrote: >> I think if you reverse-gamma JPEG files and leave everything else alone, >> you'll be OK. > No. See above. > > And some final notes... > > The OpenGL sRGB extensions are rather misnamed. They only really pay attention to the transfer function (a.k.a. gamma) and ignore the other parts of sRGB such as chromaticities and white & black points. Since OpenGL does not specify a color space, they don't have much choice. > > When using sRGB textures, GL converts the incoming texture data to a physically linear space. When using sRGB renderbuffers, GL converts the blended & multisampled output to the perceptually-linear space of sRGB. From my reading, incoming textures are not necessarily converted. They may be converted when accessed to preserve more of the color data. In fact, I think the spec recommends doing that to preserve one of the advantages of using sRGB images, to increase the resolution of the dark parts of an image. But I may be misreading. > > I believe the correct thing to do in WebGL is specify that the canvas color space is the ICC profile connection space. The transfer function of this space is physically linear. All other aspects of the color space are also specified. For the purposes of the computations specified by OpenGL, these don't matter. But for correct conversion from the input space of the images to the output space of the display they are very important. Using the PCS enables the browser to use the relevant ICC profiles for conversion. Yes, I believe the consensus is to use a linear color space for the drawing buffer representation. I know in the WebKit implementation we need to add the appropriate color space conversions in the HTML compositor.
Now the only question is what to do with the incoming texture data. ----- ~Chris cmarrin...@ From cma...@ Mon Sep 6 05:45:49 2010 From: cma...@ (Chris Marrin) Date: Mon, 06 Sep 2010 05:45:49 -0700 Subject: [Public WebGL] Proposed change to WebGL Event definition In-Reply-To: References: <33F15A1B-FF87-44F1-ABA3-632494FA649A@apple.com> <817D1DD8-125C-44B0-95FF-C288EECB06E0@apple.com> <10C92111-C1E5-429F-AF48-D4737C07BCEE@apple.com> <044681C5-76B1-4D4B-AF5C-4BD34203350B@apple.com> Message-ID: <3AAA5F6C-9EB1-4701-B32D-9FC9F0E7ABB9@apple.com> On Sep 6, 2010, at 3:44 AM, Dean Jackson wrote: > > On 04/09/2010, at 11:57 PM, Chris Marrin wrote: > >> I think the right way to handle that is to add a media query for WebGL. We've made a proposal to add media queries for CSS animations, transitions, and transforms. These are all supported in WebKit today. It would be easy to add one for 'webgl'. You can run the query from JS and know you don't have WebGL without having to call getContext(). But the really nice thing about media queries is that you can use them in CSS style sheets. If WebGL is missing you can style the page differently to, for instance, not take up the space for the WebGL canvas. >> >> So the proposal is to get rid of statusCode and add a media query for 'webgl'. I will talk to dino, who is pushing the media query extensions spec, about this as well. > > For the feature detection part, yes, CSS media queries are fairly nice. They have the benefit of running inside a style sheet, as well as a JS interface for on-the-fly queries (implemented in WebKit at least).
They have a slight downside in that they are not evaluated in browsers that don't understand the feature - so you have to use them in a way that adds functionality when the feature is there, rather than provides a fallback when it isn't (if that makes sense :) Yes, it sounds like in order to make this feature really effective, we would have to require that all browsers supporting WebGL would have to support media queries, at least those related to WebGL. We'd need to do this: 1) Do a media query for WebGL support. If no, tell the user WebGL is not supported in the browser. done. 2) addEventListener for 'webglcontexterror'. 3) Call canvas.getContext("webgl"). If a valid context is returned, use it. done. 4) When the event comes in, show the user a dialog telling them why they can't use WebGL. done. In order for all that to work, we need to make the assumption that if media queries for WebGL don't exist, WebGL doesn't exist. > > You should probably talk to the W3C CSS Working Group if you plan to add a new media query. Chris mentioned CSS animations where we've included the media query as part of the spec. In the WebGL case, where the specification isn't at W3C, I think you should make the effort to contact the group - or at least start with a post on www-style. > > It's a shame the HTML5 canvas API doesn't have feature detection built in. Well, I guess it does - if you get a null result the context isn't supported - it's just that here you want more information than that. Are you sure you really need to tell the developer exactly why WebGL is not available? If it was because WebGL is not turned on do you really explain to the user how to enable it? What if they can't? Does it matter if their hardware isn't powerful enough ("please go buy a new computer")? > > My gut feeling is that you don't need a status message. If there is no WebGL, there's no WebGL. That's all that really matters, and that's the way it works for other Web technologies like SVG, CSS3, HTML5,
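Chris's detection flow can be sketched against any object with a getContext method. Note the 'webglcontexterror' event name is the proposal under discussion in this thread, not a shipped API, and the mock canvas below is purely illustrative:

```javascript
// Hedged sketch of WebGL feature detection: try the candidate context
// names and report whether a usable context came back. (In a real page,
// the error event proposed in this thread would carry the failure reason.)
function getWebGLContext(canvas) {
  for (const name of ["webgl", "experimental-webgl"]) {
    try {
      const ctx = canvas.getContext(name);
      if (ctx) return ctx;
    } catch (e) {
      // Some implementations throw from getContext; treat as unsupported.
    }
  }
  return null;
}

// Illustrative mock: a "canvas" whose getContext always fails.
const noWebGLCanvas = { getContext: () => null };
console.log(getWebGLContext(noWebGLCanvas)); // → null
```

This is the "null result" detection Dean describes; the media query and error event in Chris's steps 1, 2, and 4 would layer richer diagnostics on top of it.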