I have a scene that I render to a texture (works fine). I keep a copy of that texture, which I use to create a blur in the X and Y directions, so now I have:

a) The original texture of the scene

b) A blurred copy of the scene on the device's default rendertarget

I now want to draw the original texture on top, giving it the "blur" around the original scene data. When I do this, however, it just completely replaces everything; my blurred scene is wiped out and the original replaces it entirely, as if StretchRect cannot do any combining or blending.

I turn on alpha blending and set the blending mode to Additive (I also tried AlphaAdditive), but it makes no difference.

Render targets have two modes of operation: preserve and discard. They are usually created in discard mode for performance reasons. In discard mode, the content of the render target is no longer available after you resolve it (i.e. use it as a texture). You could switch to render targets in preserve mode, but as I said, there is a performance overhead, and it is even worse here, since you would actually be reading back the preserved render target to blend with it. I believe it also consumes more VRAM.

What you are describing looks like a bloom-like effect. You could create a shader and draw a full-screen quad to achieve your desired result. This shader would take two sampler inputs, the original scene and the blurred scene, and you would do the blending yourself in the pixel shader. Basically your entire pixel shader would be just (in pseudo-code): return blend(texture1, texture2);.
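A minimal sketch of such a combine pass in HLSL might look like the following. Note that blend() above is pseudo-code, not an HLSL intrinsic; a clamped add (or lerp) is the concrete choice. The sampler registers and the intensity factor here are illustrative assumptions, not fixed names:

```hlsl
// Combine pass sketch (D3D9-style HLSL). Sampler bindings and the
// BloomIntensity factor are assumptions; wire them up to match your setup.
sampler SceneSampler : register(s0);  // original scene texture
sampler BlurSampler  : register(s1);  // blurred copy of the scene

float BloomIntensity = 1.0;           // scale factor for the blurred layer

float4 CombinePS(float2 uv : TEXCOORD0) : COLOR0
{
    float4 scene = tex2D(SceneSampler, uv);
    float4 blur  = tex2D(BlurSampler, uv);
    // Additive combine, clamped to [0, 1]. lerp(scene, blur, t) is an
    // alternative if you want a weighted mix instead of a pure add.
    return saturate(scene + blur * BloomIntensity);
}
```

You would render a full-screen quad with this pixel shader bound and both textures set on their samplers; since the blending happens inside the shader, no device blend states are involved.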

Thanks, that's what I've been working on since I posted (a 'Combine' shader that takes the two blur stages and combines their outputs to the screen). I'm not familiar with the blend() function, so thanks! I had just been adding them with a scale factor.