I'm having some trouble rendering a bunch of values to a rendertarget: the values never end up in the exact range I want them to. Basically, I use a fullscreen quad and a pixel shader to render to my rendertarget texture and then intend to use the texture coordinates as the basis for some calculations in the shader. The texture coordinates of the quad range from (0,0) in the top left to (1,1) in the bottom right corner... the problem is that, after interpolation, these values don't arrive at the pixel shader unchanged.

An example:
I render to a 4x4 texture and the shader in this case simply outputs the texture coordinates (u,v) in the red and green channels:

return float4(texCoord.rg, 0, 1);

What I want to get out of it is a texture where the pixel in the top left is RGB (0,0,0) and thus black, the pixel in the top right is RGB (255,0,0) and thus a bright red and the pixel in the bottom left is RGB (0,255,0) - a bright green.

However, instead I get this here on the right:

(straight quad rendering, no correction)

The top left pixel is black, but I only get a relatively dark red and dark green in the other corners. Their RGB values are (191,0,0) and (0,191,0). I strongly suspect it has to do with the sampling locations of the quad: the top left pixel correctly samples the top left corner of the quad and gets (0,0) as UV coordinates, but the other corner pixels don't sample from the other corners of the quad. I have illustrated this in the left image with the blue box representing the quad and the white dots the upper sampling coordinates.
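The observed values can be reproduced with a quick back-of-the-envelope check (a Python sketch of the interpolation math, under the assumption, which matches the numbers, that without any offset pixel i of an N-pixel-wide target samples uv = i/N):

```python
N = 4  # the 4x4 rendertarget from the example

# Assumption consistent with the observed values: without any offset,
# pixel i samples uv = i / N (pixel centers at integer coordinates,
# quad edges at 0 and N).
uvs = [i / N for i in range(N)]           # 0.0, 0.25, 0.5, 0.75
channel = [round(uv * 255) for uv in uvs]
print(channel)  # the corner pixel only reaches round(0.75 * 255) = 191
```

That 191 is exactly the "relatively dark red/green" in the corner pixels.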

Now I know about the half pixel offset you should apply to your coordinates when rendering screen-aligned quads in Direct3D9. Let's see what kind of result I get from this:

(quad rendered with DX9's half pixel offset)

Red and green have gotten brighter but still aren't correct: 223 is the maximum I get on the red or green color channel. But now, I don't even have pure black anymore, but instead a dark, yellowish gray with RGB(32,32,0)!
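These values, too, fall out of the same sketch once the half-pixel offset is applied, so that pixel i samples uv = (i + 0.5)/N (again my assumption, matching the numbers):

```python
N = 4  # the 4x4 rendertarget from the example

# With the DX9 half-pixel offset the quad shifts by half a pixel, so pixel i
# samples uv = (i + 0.5) / N -- pixel centers rather than pixel corners.
first = round((0 + 0.5) / N * 255)      # top-left pixel
last = round((N - 1 + 0.5) / N * 255)   # bottom-right pixel
print(first, last)  # 32 and 223, the values observed above
```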

What I actually need would be this kind of rendering:

(target render, reduced quad size)

It looks like I have to move the right and the bottom border of my quad exactly one pixel up and to the left, compared to the first figure. Then the right column and the bottom row of pixels all should correctly get the UV coordinates right from the border of the quad.

However this didn't quite work out and caused the bottom and right pixels to not render at all. I suppose these pixels' centers aren't covered by the quad anymore and thus won't get processed by the shader. If I change the pixelSize calculation by some tiny amount to make the quad a tiny amount bigger it kinda works... at least on a 4x4 texture. It doesn't work on smaller textures and I fear it subtly distorts the even distribution of uv values on bigger textures as well:

(I modified the pixelSize calculation by 0.001f - for smaller textures, e.g. 1D lookup tables, this doesn't work and I have to increase it to 0.01f or something bigger)
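The numbers support this interpretation (a sketch assuming D3D9 pixel centers at integer coordinates, as the earlier values suggest): shrinking the quad by one pixel gives every pixel exactly the wanted uv = i/(N-1), but the last pixel's center lands exactly on the quad's edge, and the fill rule excludes right and bottom edges:

```python
N = 4

# With the right border moved to x = N - 1, pixel i interpolates uv = i / (N - 1):
uvs = [round(i / (N - 1) * 255) for i in range(N)]
print(uvs)  # [0, 85, 170, 255] -- exactly the wanted distribution...

# ...but rasterization only shades pixels whose center lies strictly inside
# the right/bottom edges, so pixel N-1 (center exactly on the edge) is dropped:
shaded = [i for i in range(N) if i < N - 1]
print(shaded)  # [0, 1, 2]
```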

Of course this is a trivial example and I could do this calculation much more easily on the CPU without having to worry about mapping UVs to pixel centers... still, there has to be a way to actually render a full, complete [0,1] range to pixels on a rendertarget!?
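In numbers, what I'm after is a remap from the half-pixel-corrected uv = (i + 0.5)/N to i/(N - 1). A sketch of that algebra (my own derivation, assuming the sampling pattern from the figures above; it could equally be applied in the pixel shader or baked into the quad's UVs):

```python
N = 4

def remap(uv, n):
    # Map uv = (i + 0.5) / n (pixel-center UVs after the half-pixel offset)
    # onto i / (n - 1), so the first and last pixels get exactly 0 and 1.
    return (uv * n - 0.5) / (n - 1)

out = [round(remap((i + 0.5) / N, N) * 255) for i in range(N)]
print(out)  # [0, 85, 170, 255]
```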

It doesn't look like the coordinate system you're using matches the screen's pixels; if it did, then it looks like you're only drawing 2×2 pixels here. The first issue I'd solve is getting the projection and modelview matrices scaled such that +1 in coordinate space is a shift of 1 pixel on screen.
–
Slipp D. Thompson Apr 6 '13 at 1:56

3 Answers

Your problem is that UVs are designed to be texture coordinates. A coordinate of 0,0 is the top left corner of the top left pixel in the texture, which is not where you normally want to read the texture. For 2D you want to read the texture in the middle of that pixel.
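In other words (a small sketch of the convention the answer describes; the helper name is mine):

```python
def texel_center_uv(i, n):
    # UV of the middle of texel i in an n-texel-wide texture: the location
    # you normally want to read from, rather than the texel's corner at i / n.
    return (i + 0.5) / n

print([texel_center_uv(i, 4) for i in range(4)])  # [0.125, 0.375, 0.625, 0.875]
```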

To elaborate on the last sentence: you could apply this transformation to the UVs in the vertex buffer for the four corners of the quad, then in the pixel shader just do return float4(texCoord.rg, 0, 1);.
–
Nathan Reed May 9 '12 at 22:00

I tried the suggested formula above and it didn't quite work: in my 4x4 sample rendertarget from above I get 0, 42, 127 and 212 in the R and G channels, from left to right or top to bottom. However, I'd want values from 0 to 255 in evenly spaced steps (for a 4x4 texture that'd be 0, 85, 170 and 255). I also tried to alter the UV coordinates, but didn't quite find the right offset yet.
–
Mario May 14 '12 at 17:15

This is certainly being caused by linear sampling. If you look at the texels to the right of and below the full-black top-left pixel, you'll see that they have 63 in the R and/or G channels, and two of them have 2 in B. Now look at the muddy dark yellow you get: it's 31 in R and G and 1 in B. That's definitely the result of averaging texels over a 2x2 group, so the solution is to set your texture filter to point.
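The arithmetic checks out (a sketch of the 2x2 average this answer describes; equal weights and integer truncation are assumptions that match the reported values):

```python
def bilinear_at_corner(a, b, c, d):
    # Sampling exactly on a texel corner with linear filtering averages the
    # four surrounding texels with equal weights (truncated to an integer).
    return (a + b + c + d) // 4

print(bilinear_at_corner(0, 63, 0, 63))  # R (or G): gives the observed 31
print(bilinear_at_corner(0, 0, 2, 2))    # B: gives the observed 1
```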