Should WebGL shader output be adjusted for gamma?
Should a WebGL fragment shader write gl_FragColor RGB values that are linear, or values raised to some 1/γ power in order to correct for display gamma? If the latter, is there a specific value to use, or must a complete application make it configurable?

The WebGL Specification does not currently contain the terms “gamma” or “γ”, or any relevant use of “linear”, and the GL_ARB_framebuffer_sRGB extension is not available in WebGL. Is there some other applicable specification? If this is underspecified, what do current implementations do? A well-sourced answer would be appreciated.

(Assume we have successfully loaded or procedurally generated linear color values; that is, gamma of texture images is not at issue.)

Hodden answered 1/6, 2012 at 1:26 Comment(0)

This is a tough one, but from what I've been able to dig up (primarily from this email thread) it seems that the current behavior is to gamma-correct linear color space images (such as PNGs) as they are loaded, while formats like JPEG are loaded without any transformation because they are already gamma corrected. (Source: https://www.khronos.org/webgl/public-mailing-list/archives/1009/msg00013.html) This would indicate that textures may be passed to WebGL in a non-linear space, which would be problematic. I'm not sure whether that has changed since late 2010.

Elsewhere in that thread it's made very clear that the desired behavior is for everything input to and output from WebGL to be in a linear color space. What happens beyond that is outside the scope of the WebGL spec (which is why it's silent on the issue).

Sorry if that doesn't authoritatively answer your question; I'm just digging up what I can on the matter. As for whether or not you should be doing correction in a shader, I would say that the answer appears to be "no": since the WebGL output is going to be assumed to be linear, attempting to self-correct may lead to a double transformation of the color space.

Seidule answered 2/7, 2012 at 20:47 Comment(5)
You say “…output from WebGL should be in a linear color space”, but also “the WebGL output is going to be assumed to be sRGB”, which seem to contradict each other. Could you clarify? – Hodden
Yes, the explanation was that I was typing faster than I was thinking. :) Corrected for consistency. Sorry. – Seidule
Thanks. FYI, later messages in that archive (e.g.) suggest that it's actually still an unresolved issue (possibly with a context option in the future), but linearity probably wins because of language like “OpenGL is conventionally linear”. Also, if you wanted to really polish your answer it'd be nice to have the first paragraph last, as per my question I'm worried about output rather than texture loading. – Hodden
FWIW, I get much better results when I explicitly add gamma correction both to loading and output. Unfortunately :/ – Linder
For the time being the answer appears to be "yes", because current browsers don't perform gamma correction on the output. You can test it with some shaders, e.g. shadertoy.com/view/Xdl3DM. This means that you should implement gamma correction in your shaders if you perform lighting and you want a perceptually accurate output. – Tufthunter
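To make the last comment concrete, here is a minimal sketch of shader-side output correction. Everything here is illustrative: the `u_linearColor` uniform name is invented, and the 1.0/2.2 exponent is only a common approximation of the sRGB curve, not a value mandated by any WebGL spec.

```javascript
// Hypothetical fragment shader that applies output gamma correction.
// The 1.0/2.2 exponent approximates the sRGB transfer curve; nothing
// in the WebGL spec mandates this value.
const fragmentShaderSource = `
  precision mediump float;
  uniform vec3 u_linearColor; // lighting computed in linear space

  void main() {
    vec3 corrected = pow(u_linearColor, vec3(1.0 / 2.2));
    gl_FragColor = vec4(corrected, 1.0);
  }
`;

// The same transform in JavaScript, for reference: maps a linear
// component in [0, 1] to its gamma-encoded value.
function gammaEncode(linear, gamma = 2.2) {
  return Math.pow(linear, 1 / gamma);
}
```

Note that this lifts mid-tones: for example, linear 0.5 encodes to roughly 0.73, which is why skipping the correction makes lit scenes look too dark.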

When I mentioned this question on Freenode #webgl (June 29, 2012), Florian Boesch vigorously expressed the opinion that nearly all users' systems are hopelessly misconfigured with regard to gamma. Therefore, the only way to get good results is to provide a gamma option within the WebGL application itself: even if WebGL specified a color space (whether linear or non-linear) for framebuffer data, it would not be correctly converted for the monitor.
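A sketch of what such a user-facing option might look like: expose the exponent as a shader uniform so the application can offer a calibration setting. All names here (`u_gamma`, `u_linearColor`, `applyGamma`) are invented for illustration, not part of any API.

```javascript
// Hypothetical shader with a user-configurable gamma exponent,
// set from an application preference (e.g. a calibration slider).
const configurableShaderSource = `
  precision mediump float;
  uniform vec3 u_linearColor;
  uniform float u_gamma; // e.g. 2.2, chosen by the user

  void main() {
    gl_FragColor = vec4(pow(u_linearColor, vec3(1.0 / u_gamma)), 1.0);
  }
`;

// JavaScript-side helper mirroring the shader math, so a calibration
// UI could preview the effect of a chosen gamma value on an RGB triple.
function applyGamma(rgb, gamma) {
  return rgb.map((c) => Math.pow(c, 1 / gamma));
}
```

The uniform would then be updated each frame (or whenever the preference changes) with `gl.uniform1f`.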

Hodden answered 2/7, 2012 at 20:57 Comment(0)

© 2022 - 2024 — McMap. All rights reserved.