Wrong framebuffer rendered to screen

Hey guys,
I have a teeny tiny problem with marrying my OpenGL application to the Oculus Rift. Since it's an OpenGL problem, I thought I might as well ask here.
For those who might not be familiar: the Oculus Rift is (among other things) a head-mounted display. Since the Rift's screen is viewed through convex lenses, the rendered image has to be pre-distorted ("barrel distortion") to compensate for those lenses.
I have successfully implemented the necessary shaders, I am rendering both eye views into one framebuffer (two viewports, one per eye, to simulate the view through two separate eyes), and I'm trying to get that framebuffer to the screen, but distorted.
My results can be seen in the images:
As you can see in the first pic, I am getting both viewports in the undistorted framebuffer (I disabled the distortion pass to take this picture, but that doesn't matter afaik).
In the distorted image, however, there is only a single, centered view. How can that be?
As for code, here is what I have:

I took a lot of the code from this example; the shaders, for instance, start at line 536.
If any "guru" has the slightest idea why only one of my viewports shows up after distortion, please let me know. Thanks!