OpenGL, how to use depthbuffer from framebuffer as usual depth buffer
Asked Answered

3

13

I have a framebuffer with a depth attachment and 4 color attachments backed by 4 textures.

I draw some stuff into it, unbind the framebuffer, and then use the 4 textures in a fragment shader (deferred lighting). Later I want to draw some more stuff to the screen, using the depth buffer from my framebuffer. Is that possible?

I tried binding the framebuffer again and specifying glDrawBuffer(GL_FRONT), but it does not work.

Linares answered 28/3, 2012 at 19:5 Comment(0)
15

Like Nicol already said, you cannot use an FBO's depth buffer as the default framebuffer's depth buffer directly.

But you can copy the FBO's depth buffer over to the default framebuffer using the EXT_framebuffer_blit extension (which has been core functionality since OpenGL 3.0):

glBindFramebuffer(GL_READ_FRAMEBUFFER, fbo);
glBindFramebuffer(GL_DRAW_FRAMEBUFFER, 0);
glBlitFramebuffer(0, 0, width, height, 0, 0, width, height, 
                  GL_DEPTH_BUFFER_BIT, GL_NEAREST);

If this extension is not supported (which I doubt, since you already have FBOs), you can use a depth texture as the FBO's depth attachment and render it to the default framebuffer with a textured fullscreen quad and a simple pass-through fragment shader that writes to gl_FragDepth. This might be slower than just blitting it over, though.
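For reference, a minimal sketch of creating a depth texture and attaching it to the FBO so it can be sampled later; `fbo`, `width`, and `height` are placeholders for your own objects and dimensions:

```cpp
// Create a depth texture that can be rendered into and sampled later.
GLuint depthTex;
glGenTextures(1, &depthTex);
glBindTexture(GL_TEXTURE_2D, depthTex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT24, width, height, 0,
             GL_DEPTH_COMPONENT, GL_FLOAT, NULL);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

// Attach it as the FBO's depth buffer instead of a renderbuffer.
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                       GL_TEXTURE_2D, depthTex, 0);
```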

Samala answered 28/3, 2012 at 19:53 Comment(0)
12

I just experienced that copying a depth buffer from a renderbuffer to the main (context-provided) depth buffer is highly unreliable when using glBlitFramebuffer, simply because you cannot guarantee that the formats match. Using GL_DEPTH_COMPONENT24 as my internal depth-texture format just didn't work on my AMD Radeon 6950 (latest driver), because Windows (or the driver) decided to use the equivalent of GL_DEPTH24_STENCIL8 as the depth format for my front/back buffer, although I did not request any stencil precision (stencil bits set to 0 in the pixel format descriptor).

When using GL_DEPTH24_STENCIL8 for my framebuffer's depth texture, the blitting worked as expected, but I had other issues with that format. The first attempt worked fine on NVIDIA cards, so I'm pretty sure I did not mess things up.
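To see what depth/stencil format the driver actually gave the default framebuffer, you can query the attachment bit sizes (OpenGL 3.0+; note that for the default framebuffer the attachment points are named GL_DEPTH and GL_STENCIL, not GL_DEPTH_ATTACHMENT):

```cpp
GLint depthBits = 0, stencilBits = 0;
glBindFramebuffer(GL_FRAMEBUFFER, 0);  // query the default framebuffer
glGetFramebufferAttachmentParameteriv(GL_FRAMEBUFFER, GL_DEPTH,
    GL_FRAMEBUFFER_ATTACHMENT_DEPTH_SIZE, &depthBits);
glGetFramebufferAttachmentParameteriv(GL_FRAMEBUFFER, GL_STENCIL,
    GL_FRAMEBUFFER_ATTACHMENT_STENCIL_SIZE, &stencilBits);
// depthBits == 24 and stencilBits == 8 would indicate a D24S8-style buffer,
// in which case GL_DEPTH24_STENCIL8 is the matching FBO format for blitting.
```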

What works best (in my experience) is copying via shader:

The Fragment-Program (aka Pixel-Shader) [GLSL]

#version 150

uniform sampler2D depthTexture;
in vec2 texCoords; //texture coordinates from vertex-shader

void main( void )
{
    gl_FragDepth = texture(depthTexture, texCoords).r;
}

The C++ code for copying looks like this:

glDepthMask(GL_TRUE);                                // allow depth writes
glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE); // write only depth, no color
glEnable(GL_DEPTH_TEST);    // required: with depth testing disabled, the depth buffer is never written
glBindFramebuffer(GL_FRAMEBUFFER, 0);                // target the default framebuffer
depthCopyShader->Enable();
DrawFullscreenQuad(depthTextureIndex);
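The fragment shader above assumes a matching vertex shader that supplies texCoords. A minimal sketch (the `position` attribute name is a placeholder; DrawFullscreenQuad is the poster's own helper that submits the quad and binds the texture):

```glsl
#version 150

in vec2 position;   // fullscreen quad corners in NDC, e.g. (-1,-1)..(1,1)
out vec2 texCoords; // consumed by the fragment shader above

void main(void)
{
    texCoords = position * 0.5 + 0.5; // map NDC [-1,1] to texture space [0,1]
    gl_Position = vec4(position, 0.0, 1.0);
}
```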

I know the thread is old, but it was one of my first results when googling my issue, so I want to keep it as complete as possible.

Dela answered 23/6, 2013 at 20:11 Comment(4)
Thanks, solved the problem with my integrated intel graphics.Christ
Is there a way to get the default framebuffer's format after it's created and use it to create the FBO?Luben
I also wonder the same thing as @LukeB., as it would allow blitting between two framebuffers knowing that the formats match. That would also work for the case where we want to blit from the default framebuffer, which we can't do with a shader, since its depth buffer is not accessible.Glasgo
Had the same problem.Unshackle
2

You cannot attach images (color or depth) to the default framebuffer. Similarly, you can't take images from the default framebuffer and attach them to an FBO.

Toughminded answered 28/3, 2012 at 19:18 Comment(0)
