When to call glEnable(GL_FRAMEBUFFER_SRGB)?

I have a rendering system where I draw to an FBO with a multisampled renderbuffer, then blit it to a second FBO backed by a texture in order to resolve the samples. I then read from that texture in a post-processing shader while drawing to the backbuffer (FBO index 0).
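Roughly, each frame looks like this (the handle and helper names, msFbo, resolveFbo, resolveTex, drawScene, drawFullscreenPostPass, are just placeholders, not my actual code):

```
// Placeholder handles/sizes for illustration:
// GLuint msFbo, resolveFbo, resolveTex; int width, height;

// 1. Render the scene into the multisampled FBO.
glBindFramebuffer(GL_FRAMEBUFFER, msFbo);
drawScene();                                   // scene drawing (placeholder)

// 2. Blit to the single-sampled FBO to resolve the samples into resolveTex.
glBindFramebuffer(GL_READ_FRAMEBUFFER, msFbo);
glBindFramebuffer(GL_DRAW_FRAMEBUFFER, resolveFbo);
glBlitFramebuffer(0, 0, width, height, 0, 0, width, height,
                  GL_COLOR_BUFFER_BIT, GL_NEAREST);

// 3. Post-process: sample resolveTex while drawing to the backbuffer (FBO 0).
glBindFramebuffer(GL_FRAMEBUFFER, 0);
glBindTexture(GL_TEXTURE_2D, resolveTex);
drawFullscreenPostPass();                      // fullscreen quad + post shader (placeholder)
```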

Now I'd like to get correct sRGB output. The problem is that the program's behavior is inconsistent between OS X and Windows, and it also varies by machine: on Windows with the Intel HD 3000 the sRGB nonlinearity is not applied, but on my other machine with an Nvidia GTX 670 it is. On the Intel HD 3000 under OS X it is also applied.

So this probably means that I'm not setting my GL_FRAMEBUFFER_SRGB enable state at the right points in the program. However, I can't find any tutorials that actually say when it ought to be enabled; they only ever mention that it's dead easy and comes at no performance cost.

I am currently not loading in any textures so I haven't had a need to deal with linearizing their colors yet.

To force the program to stop simply spitting back out the linear color values, what I have tried is commenting out my glDisable(GL_FRAMEBUFFER_SRGB) line, which effectively leaves the setting enabled for the entire pipeline (I even redundantly force it back on every frame).

I don't know if this is correct. It certainly does apply a nonlinearization to the colors, but I can't tell whether it's getting applied twice (which would be bad): it could apply the gamma as I render to my first FBO, and it could just as well do it again when I blit the first FBO to the second.

I've gone so far as to take screen shots of my final frame and compare raw pixel color values to the colors I set them to in the program:

I set the input color to RGB(1,2,3) and the output is RGB(13,22,28).

That seems like quite a lot of color compression at the low end and leads me to question if the gamma is getting applied multiple times.

I have just now gone through the sRGB equation and I can verify that the conversion seems to be applied only once: linear 1/255, 2/255, and 3/255 do indeed map to sRGB 13/255, 22/255, and 28/255 using the encoding 1.055*C^(1/2.4) - 0.055 (for C above the small linear-segment cutoff). Given how large the expansion is for these low color values, it really should be obvious if the sRGB transform were getting applied more than once.
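As a sanity check, here's a tiny standalone program confirming that mapping (just the standard sRGB encoding, nothing from my renderer):

```
#include <cmath>
#include <cstdio>

// Standard linear -> sRGB encoding (piecewise form from the sRGB spec).
static float linearToSrgb(float c)
{
    return (c <= 0.0031308f) ? 12.92f * c
                             : 1.055f * std::pow(c, 1.0f / 2.4f) - 0.055f;
}

int main()
{
    for (int v : {1, 2, 3})
        std::printf("linear %d/255 -> sRGB %.1f/255\n",
                    v, 255.0f * linearToSrgb(v / 255.0f));
    // Prints roughly 12.7, 21.7, and 28.2, which round to the RGB(13,22,28) I read back.
}
```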

So I still haven't determined what the right thing to do is. Does glEnable(GL_FRAMEBUFFER_SRGB) only apply to the final framebuffer values, in which case I can just set it during my GL init routine and forget about it thereafter?

Maki answered 8/7, 2012 at 19:59 Comment(0)

When GL_FRAMEBUFFER_SRGB is enabled, all writes to an image with an sRGB image format will assume that the input colors (the colors being written) are in a linear colorspace. Therefore, it will convert them to the sRGB colorspace.

Any writes to images that are not in the sRGB format should not be affected. So if you're writing to a floating-point image, nothing should happen. Thus, you should be able to just turn it on and leave it that way; OpenGL will know when you're rendering to an sRGB framebuffer.

In general, you want to work in a linear colorspace for as long as possible. Only your final render, after post-processing, should involve the sRGB colorspace. So your multisampled framebuffer should probably remain linear (though you should give it higher color precision to preserve accuracy: GL_RGB10_A2 or GL_R11F_G11F_B10F, with GL_RGBA16F as a last resort).
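As a rough sketch of what that could look like (handle names, the sample count, and the exact formats here are examples, not a prescription):

```
// Multisampled color buffer: linear, with more precision than 8 bits per channel.
glBindRenderbuffer(GL_RENDERBUFFER, msColorRb);
glRenderbufferStorageMultisample(GL_RENDERBUFFER, 4, GL_RGBA16F, width, height);

// Resolve texture: also linear, so the post-processing pass reads linear values.
glBindTexture(GL_TEXTURE_2D, resolveTex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA16F, width, height, 0,
             GL_RGBA, GL_FLOAT, nullptr);

// The linear -> sRGB conversion then only happens when writing to a buffer that
// actually has an sRGB format (e.g. an sRGB-capable default framebuffer), so this
// can be enabled once at init and left on:
glEnable(GL_FRAMEBUFFER_SRGB);
```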

As for this:

On Windows with the Intel HD 3000 it will not apply the sRGB nonlinearity

That is almost certainly due to Intel sucking at writing OpenGL drivers. If it's not doing the right thing when you enable GL_FRAMEBUFFER_SRGB, that's because of Intel, not your code.

Of course, it may also be that Intel's drivers didn't give you an sRGB image to begin with (if you're rendering to the default framebuffer).
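You can check what the driver actually gave you by querying the color encoding of the default framebuffer; this is a sketch assuming a GL 3.0+ context:

```
GLint encoding = GL_LINEAR;
glBindFramebuffer(GL_FRAMEBUFFER, 0);
glGetFramebufferAttachmentParameteriv(GL_FRAMEBUFFER, GL_BACK_LEFT,
                                      GL_FRAMEBUFFER_ATTACHMENT_COLOR_ENCODING,
                                      &encoding);
if (encoding != GL_SRGB)
{
    // The default framebuffer isn't sRGB-capable, so GL_FRAMEBUFFER_SRGB has
    // nothing to convert into; you'd need to request an sRGB-capable pixel
    // format/visual from the windowing layer (WGL/GLX/CGL) instead.
}
```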

Slaton answered 8/7, 2012 at 21:21 Comment(2)
Well, here's the thing. When I enable GL_FRAMEBUFFER_SRGB at initialization and don't touch it forever after, even on the Intel HD 3000 in Windows it will do the correct transformation. It's just that it seems to require it being enabled for more of the code path than on Nvidia. – Maki
What you said makes a lot of sense: the sRGB format of the buffers should determine whether sRGB values are written to them. I will test whether this is implemented correctly, and it will definitely let me get a lot more mileage out of 8 bits per channel. I hope the multisample resolve step handles this correctly as well: I had encountered incorrect multisample resolve results before, but I may have been working in the wrong mode at the time. (see this topic opengl.org/discussion_boards/showthread.php/…) – Maki
