What I want to do is a combination of "conventional rendering" (say, rendering something into a render target) and an overlaid ray-traced image.
I have a ray-tracing kernel that outputs into a PBO, which I can later draw as a texture. I can also compute the depth of any point in my ray-traced image, so I can produce a depth buffer with floating-point depth values. I then want to draw, say, a sphere or a box over my ray-traced image, but I want the sphere or box to account for the depth that the ray-traced image has. So the question is: how can I put my floating-point depth values into the depth buffer (a GL_DEPTH_COMPONENT renderbuffer)?

Am I right in assuming that I can do something like this:
glDrawBuffer(GL_DEPTH_ATTACHMENT);

The color outputs from fragment shaders are exactly that: color outputs. That's why it's called gl_FragColor. You write to the depth buffer using gl_FragDepth, so do that. (glDrawBuffer only selects color buffers; GL_DEPTH_ATTACHMENT is not a valid argument to it.)

You attach the depth buffer to GL_DEPTH_ATTACHMENT, set the draw buffer to GL_NONE (because you're not writing any colors), and then render as normal, writing only gl_FragDepth.
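As a sketch of that depth-only pass, the fragment shader can be as small as this (the sampler name `u_depthTex` is hypothetical; it would hold your ray-traced depth values, drawn over a full-screen quad):

```glsl
#version 120
uniform sampler2D u_depthTex; // ray-traced depth, one float per texel

void main()
{
    // The draw buffer is GL_NONE, so no color is written;
    // only the depth value ends up in the framebuffer.
    gl_FragDepth = texture2D(u_depthTex, gl_TexCoord[0].xy).r;
}
```

On the C side you would attach a depth texture with glFramebufferTexture2D(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, ...), call glDrawBuffer(GL_NONE), bind this shader, and draw a full-screen quad.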

Though really, it would probably be quicker to upload your values directly to the depth buffer with glTexSubImage2D (which requires the depth attachment to be a texture rather than a renderbuffer). It would help to put your depth values into 24-bit unsigned integer form first, so that OpenGL doesn't have to do the float-to-integer conversion. Alternatively, you could simply use GL_DEPTH_COMPONENT32F as the format, so you can upload floating-point values directly without any conversion.

11-07-2012, 11:54 PM

alariq

Yeah, it is actually gl_FragDepth :-)
I agree, converting directly to 24-bit int values could be faster.
Thanks very much for the help.