Double buffering is a rendering technique where the frame currently being rendered is held in an off-screen buffer while you are shown the previous, completed frame. Once rendering finishes, the buffers swap: you see the new frame, and the next one is drawn into the buffer that just left the screen. Basically, this stops you from seeing a frame while it's still being rendered piece by piece.
Triple buffering is the same thing, but with an extra buffer. It improves performance by letting the card start drawing the next frame into the spare buffer instead of sitting idle until the swap completes; the swap has to wait for the display, and that wait is where the lost time comes from.
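Here's a toy model of the rotation, as a sketch in C (the buffer numbers just stand in for GPU framebuffers; the real bookkeeping happens inside the driver, so treat this as illustration only):

#include <stdio.h>

int main(void) {
    /* Double buffering: render into 'back', swap on vsync. */
    int front = 0, back = 1;
    for (int frame = 1; frame <= 3; frame++) {
        printf("frame %d drawn into buffer %d\n", frame, back);
        /* ...the GPU waits for vsync here if the frame finished early... */
        int t = front; front = back; back = t;        /* swap */
        printf("  screen now shows buffer %d\n", front);
    }

    /* Triple buffering: a third buffer lets the GPU start on the next
       frame right away instead of idling until the swap completes. */
    int shown = 0, queued = 1, drawing = 2;
    for (int frame = 1; frame <= 3; frame++) {
        printf("frame %d drawn into buffer %d\n", frame, drawing);
        int t = shown; shown = queued; queued = drawing; drawing = t;  /* rotate */
        printf("  screen now shows buffer %d\n", shown);
    }
    return 0;
}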

I assume you're talking about CCC (Catalyst Control Center); if so, note that its triple buffering setting only affects programs that use OpenGL.

Some games have an in-game setting to set triple buffering. It's best to enable it from there, when possible.
Also, note that triple buffering increases the input lag a little. When playing games that don't require extreme responsiveness (e.g.: MMOs, RPGs, among others), it's completely unnoticeable, thus having it on is usually a Good Thing™ there. On the other hand, when playing games that do require extreme responsiveness (competitive shooters, mostly), it is best to avoid it, as the input lag becomes noticeable and really annoying at times.

Screen data always comes from a buffer. GCs (graphics cards) render to a buffer, never directly to the screen; that would be a disaster from an efficiency standpoint.

The front buffer is what is sent to the screen when a refresh event occurs (how often that happens depends on your monitor's refresh rate).

The back buffers (2nd and 3rd) are independent of the refresh timing, so the GC can fill them as fast as it possibly can; a back buffer is then promoted to the front for display when needed.
There is no theoretical limit to how many back buffers a game (or whatever) can use, but more than three is rarely seen. This buffering eliminates or reduces flicker, tearing, and other artifacts.

Quad buffering is sometimes used for stereoscopic 3D (double buffering for each eye's image), but I've never seen a non-3D game use more than three.
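For the curious, this is roughly what quad-buffered stereo looks like in OpenGL. A hedged sketch assuming GLFW and a driver that actually exposes a stereo pixel format (the GLFW_STEREO hint); most consumer cards don't, so window creation will simply fail:

#include <GLFW/glfw3.h>

int main(void) {
    if (!glfwInit()) return 1;
    glfwWindowHint(GLFW_STEREO, GLFW_TRUE);  /* ask for left+right front/back pairs */
    GLFWwindow *win = glfwCreateWindow(640, 480, "stereo", NULL, NULL);
    if (!win) { glfwTerminate(); return 1; } /* no stereo support on this GPU/driver */
    glfwMakeContextCurrent(win);

    while (!glfwWindowShouldClose(win)) {
        glDrawBuffer(GL_BACK_LEFT);          /* render the left eye's image */
        glClear(GL_COLOR_BUFFER_BIT);
        glDrawBuffer(GL_BACK_RIGHT);         /* render the right eye's image */
        glClear(GL_COLOR_BUFFER_BIT);
        glfwSwapBuffers(win);                /* swaps both eye pairs at once */
        glfwPollEvents();
    }
    glfwTerminate();
    return 0;
}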

To say that an FPS game benefits from a single buffer is nonsense. Any modern GC can fill two buffers much faster than even the fastest monitors can display them, and buffer transfers within GC memory are in the sub-millisecond range. If the game uses good algorithms for its buffer fills, the GC could easily refresh the back buffer before it's needed if it had to.

As was stated, though, triple buffering may not be of much benefit in some cases.

The basic issue with Vsync is that your FPS is tied to your display's refresh frequency. Most displays these days run at 60 Hz, so your frame rate is capped at 60 fps.
The trouble starts when your GPU cannot sustain 60 fps. With double buffering, a frame that misses a refresh has to wait a whole extra refresh period, so the frame rate gets cut in half and drops all the way to 30 fps (60/2 = 30), which is pretty bad.

Triple buffering lets you use the hardware more efficiently. If your GPU can render, say, 50 fps, you'll see the benefit of it (the rate won't drop straight to 30 fps).
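The arithmetic behind that, as a quick sketch (the numbers are assumed for illustration: a 60 Hz display and a GPU that needs 20 ms per frame, i.e. 50 fps):

#include <stdio.h>
#include <math.h>

int main(void) {
    double period_ms = 1000.0 / 60.0;   /* refresh period: ~16.7 ms at 60 Hz */
    double render_ms = 1000.0 / 50.0;   /* assumed GPU frame time: 20 ms */

    /* Double-buffered vsync: a finished frame waits for the next refresh,
       so the effective frame time rounds UP to a multiple of the period. */
    double vsync_ms = ceil(render_ms / period_ms) * period_ms;  /* 33.3 ms */
    printf("double buffering + vsync: %.0f fps\n", 1000.0 / vsync_ms); /* 30 */

    /* Triple buffering: the GPU keeps rendering at its own pace and the
       display picks the newest completed frame at each refresh. */
    printf("triple buffering:         %.0f fps\n", 1000.0 / render_ms); /* 50 */
    return 0;
}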

That comes at the cost of increased input lag (which is quite noticeable, and rules out serious competitive shooter play).

When you've got a good GPU, a very good idea is to get a "3D" monitor (or projector, in my case).
These displays support 120 Hz, which makes Vsync run at a beautifully fluid 120 fps, and when your GPU can't sustain that, it drops to 60 fps, which is still great.

It's a little more tricky than that. When your fps drops below the refresh rate, you'll sometimes be shown a frame that is two frames old; at 60 Hz that amounts to about 33 ms of lag (two 16.7 ms refresh periods), which is quite a lot for competitive play. For older games like CSS running at 300 fps on modern cards that's not a problem, but for, say, Battlefield 3, it can be very noticeable.

I understand that, but if your GC can't keep even a single back buffer filled at 60 Hz, then you either need to drop the eye candy or get a better card.
I know that some games are GC killers, for whatever reason, and that everyone wants to run them at max, but that just isn't realistic in many cases.

I'm still confused: will triple buffering hinder me in competitive online FPS or not?