Problem accessing MRT samples in deferred shading

Hello,

I've been playing around with deferred shading recently, and during several optimization attempts I've tried to pack additional values such as specular or ambient factors into already existing buffers.
So I came up with the idea of using the fourth (alpha) channel of my normal buffer for that, as suggested by several papers. The problem, however, is that whenever I try to access the alpha channel of the normal buffer to light up some pixels, the whole scene becomes partially transparent. It seems that the deferred fragment shader mixes alpha values from different buffers (normal and albedo).

Is your shader compiling correctly? Double-check it. Everything goes semi-transparent or randomly colored if a shader fails to compile. And you should post the full shaders along with the code where you supply the uniforms.

also,

lo_fragcolor2 = vec4(normalize(gs_normal) * 0.5 + 0.5, gs_ambient.r);

You are using a float render target, which supports negative values, so there is no need for the "* 0.5 + 0.5" bias.
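For example, if the normal target is something like GL_RGBA16F (an assumption on my part; adjust to your actual format), the write can just be:

```glsl
// Float render target (e.g. GL_RGBA16F) stores signed values directly,
// so the normal needs no [0,1] bias; alpha still carries the ambient factor.
lo_fragcolor2 = vec4(normalize(gs_normal), gs_ambient.r);
```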

Hi,
well, I assume the shader compiles correctly. glGetShaderiv() always returns a good status and there are no warnings whatsoever. And there is no random "jittering" in the scene; the issue stays constant from frame to frame. I also checked the shader on two different platforms (i965 and NVIDIA 8600); both show the same output (screenshot attached).
OK, I'll go ahead and post some more details here:

I've played around with deferred shading... came up with the Idea of using the fourth alpha-channel in my normal buffer ... whenever I try to access the alpha channel of the normal buffer to light up some pixels, the whole scene gets partially transparent.

When rasterizing your G-buffer, make sure that not only ALPHA_TEST but also BLEND and SAMPLE_ALPHA_TO_COVERAGE are disabled, since you are "overloading" what the pipeline would otherwise do with alpha (translucency processing).
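In host code that boils down to a few state calls before the geometry pass (a sketch for a compatibility-profile context; GL_ALPHA_TEST does not exist in core GL):

```c
/* Geometry pass setup: nothing in the pipeline may consume alpha,
 * since we repurpose it as a plain data channel. */
glDisable(GL_BLEND);                    /* no framebuffer blending      */
glDisable(GL_ALPHA_TEST);               /* compatibility profile only   */
glDisable(GL_SAMPLE_ALPHA_TO_COVERAGE); /* no alpha-to-coverage on MSAA */
```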

Then add a debug mode where you just read the G-buffer and blast the specified channel to the screen, so you can verify that what's in there is correct before you try to debug the lighting output.
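A minimal debug fragment shader might look like this (u_normalBuffer and vs_texcoord are names I made up; substitute your own):

```glsl
// Debug pass: splat one G-buffer channel to the screen as grayscale.
uniform sampler2D u_normalBuffer; // the normal/ambient MRT attachment
varying vec2 vs_texcoord;

void main()
{
    float a = texture2D(u_normalBuffer, vs_texcoord).a; // packed ambient
    gl_FragColor = vec4(a, a, a, 1.0);                  // opaque grayscale
}
```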

You have blending enabled, and you are writing values to the alpha channel. See the problem? Can you imagine what is happening? Yes: your normal buffer, generated by the geometry pass, is being blended using that value as a factor.

You don't really need blending for the G-buffer pass; you will handle semi-transparent objects a different way. But you can still use the alpha channel for binary transparency with a conditional discard (it will work for foliage, for example, or ragged cloth).
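A sketch of that in the geometry-pass fragment shader (u_diffuseMap and the varying names are assumptions, and the 0.5 cutoff is arbitrary):

```glsl
uniform sampler2D u_diffuseMap;
varying vec2 vs_texcoord;
varying vec3 gs_normal;

void main()
{
    vec4 albedo = texture2D(u_diffuseMap, vs_texcoord);
    if (albedo.a < 0.5)
        discard; // binary cutout: foliage holes, torn cloth, etc.

    gl_FragData[0] = albedo;                          // albedo target
    gl_FragData[1] = vec4(normalize(gs_normal), 1.0); // float normal target
}
```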