I've used glClipPlane for rendering reflections in the past. The usual deal: flipping the scene, setting a clip plane, rendering to an FBO, etc. -- it worked fine.

Now, I'm refactoring how I do this to make it more generic, and have encountered a baffling problem. The clip plane "slides" with the camera! It's as if I'm putting it into the projection matrix ( or something to that effect, I'm totally confused ).

It can be best described by this video I recorded:
ClippingPlaneWoes.mov (http://shamyl.zakariya.net/etc/ClippingPlaneWoes.mov)

Finally, I have a macro 'glError' sprinkled around to dump GL errors at runtime (it's only enabled for debug builds), and it usually catches mistakes like matrix stack underflow from too much popping and other problems. But I'm not seeing any errors...

Observing the clip plane sliding around, you'll notice it is always directly in line with the camera -- i.e., you're looking straight down the clipping plane. This implies to me that the plane isn't being multiplied by the current modelview matrix. That's the best I can come up with.

Unfortunately, without that line I get the behavior which brought me here in the first place. Damned if I do, damned if I don't.

zed

01-07-2008, 02:41 PM

Surely you can do the per-fragment test yourself in the fragment shader and trigger a discard if the fragment is on the other side of the plane.

-NiCo-

01-07-2008, 02:52 PM

Surely you can do the per-fragment test yourself in the fragment shader and trigger a discard if the fragment is on the other side of the plane.

IMO, performing the clipping at the fragment level will most likely mean a performance hit compared to vertex-level clipping...

zed

01-07-2008, 04:15 PM

Most likely, but it's certainly going to be much faster than if the card falls back to a software driver.

The test isn't slow (well, I suppose it does have a conditional :( )

if ( dot( vert, plane_normal ) < plane_dist )
    discard;
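Fleshed out, that test is only a few lines of GLSL. A sketch, not drop-in code -- the uniform and varying names are my own, and it assumes the application uploads the plane in eye space:

```glsl
// Eye-space clip plane (a, b, c, d), uploaded by the application.
uniform vec4 clipPlane;

// Eye-space position, written by the vertex shader as:
//   eyePos = (gl_ModelViewMatrix * gl_Vertex).xyz;
varying vec3 eyePos;

void main()
{
    // Signed-distance test: kill fragments behind the plane.
    if (dot(eyePos, clipPlane.xyz) + clipPlane.w < 0.0)
        discard;

    gl_FragColor = vec4(1.0); // ...normal shading goes here
}
```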

Relic

01-09-2008, 04:54 AM

Note that many Radeons do not support gl_ClipVertex and fall back to software mode when you use it. On Radeons, you don't have to write gl_ClipVertex to make clip planes work.

The official OpenGL 2.1 spec wording on this is in section 2.12, page 53:

"When a vertex shader is active, the vector ( xe ye ze we )T is no longer computed. Instead, the value of the gl_ClipVertex built-in variable is used in its place. If gl_ClipVertex is not written by the vertex shader, its value is undefined, which implies that the results of clipping to any client-defined clip planes are also undefined.
The user must ensure that the clip vertex and client-defined clip planes are defined in the same coordinate space."
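In practice that means the vertex shader has to write gl_ClipVertex in the same space the planes were specified in -- eye space, if the planes were set with the modelview current, as in fixed function. A minimal sketch:

```glsl
void main()
{
    gl_Position   = ftransform();
    // The clip planes were specified in eye space (modelview current at
    // glClipPlane time), so feed the clipper the eye-space position:
    gl_ClipVertex = gl_ModelViewMatrix * gl_Vertex;
}
```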

Komat

01-09-2008, 10:00 AM

The official OpenGL 2.1 spec wording on this is in section 2.12, page 53:

"If gl_ClipVertex is not written by the vertex shader, its value is undefined, which implies that the results of clipping to any client-defined clip planes are also undefined."

This is the official behavior. ATI defines (in its programming guide) that when ftransform is used on ATI hardware, this undefined behavior is equivalent to fixed-function clipping.

TomorrowPlusX

01-11-2008, 04:05 PM

As a followup, I got it working using oblique frustum clipping (modifying the projection matrix so its near plane becomes the clip plane). So, glClipPlane is out for me.