Depth testing in a sprite-based game with transparency

I'm trying to make a renderer for a simple 2D game, and I'm having trouble understanding the depth buffer. The issue I'm getting is that transparent fragments write to the depth buffer, even though there's nothing visible there.

My understanding is that discarding fragments so they don't write to the depth buffer doesn't help me, because that effectively disables early depth testing and makes it equivalent to just sorting the polygons from furthest to nearest. The source seems to imply that this problem persists even across separate draw calls; maybe fragments aren't processed at all until just before the buffer flushes, I don't know.

However, I don't really understand why. It seems to me that discarding would just move the depth-buffer write to after the fragment shader, but the depth test itself should still work. So I expected sorting from closest to furthest and discarding to be faster than not depth testing at all, but slower than never discarding, with the exact time not perfectly correlated with either. That doesn't seem to be the case; instead, discarding seems to lead to more work in total.
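To be concrete, here's a conceptual sketch of the per-fragment ordering as I understand it (simplified, not taken from any particular spec; actual hardware varies):

```
// Conceptual per-fragment ordering (simplified):
if the shader can discard (or writes its own depth):
    run fragment shader            // early depth test is disabled
    then depth test
    then depth write (if the fragment survived)
else:
    depth test first               // early-Z: failed fragments are never shaded
    run fragment shader
    depth write
```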

I'm curious whether it doesn't work the way I thought because of some quirk in the standard, or whether it's a worse idea than I realized, or whether the hardware just doesn't work that way, full stop.

Anyway, after I render everything with 1-bit alpha, I need to render the rest of the scene, and that requires a functioning depth buffer to have been created. I considered writing to the depth buffer directly and then using the == depth test, but I think that probably just pushes the problem back a step instead of fixing anything. Is there a way to do the above where the depth test basically still functions and gives a performance boost, or is taking a hit just how it works in sprite-based games?

> I'm trying to make a renderer for a simple 2D game, and I'm having trouble understanding the depth buffer, the issue I'm getting is that transparent fragments write to the depth buffer, even though there's nothing there.

Often, you would render your opaque objects first, with depth writes and depth tests enabled.

Then, when you render your translucent objects, you'd sort them back-to-front and draw them in that order with depth test enabled but depth writes disabled. That avoids your problem: your opaque objects will occlude and clip the bounds of your translucent objects, but your translucent objects won't be required to 100% occlude anything (though they can, depending on the alpha value). The back-to-front order ensures that the translucents blend reasonably with each other (provided polys don't interpenetrate and your depth sort is fully correct).
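The sorting half of that translucent pass can be sketched as follows (a minimal C++ example; the `Sprite` struct and function names are illustrative, and the depth-writes-off state is indicated in comments using OpenGL names):

```cpp
#include <algorithm>
#include <cassert>
#include <vector>

// Hypothetical sprite record; "depth" increases away from the camera.
struct Sprite {
    float depth;
    int   id;
};

// Sort translucent sprites back-to-front (largest depth first) so that
// alpha blending composites them in the correct order. In the render
// loop you would then draw them with depth test enabled but depth
// writes disabled (in OpenGL terms: glEnable(GL_DEPTH_TEST) plus
// glDepthMask(GL_FALSE) for this pass).
void sortBackToFront(std::vector<Sprite>& sprites) {
    std::sort(sprites.begin(), sprites.end(),
              [](const Sprite& a, const Sprite& b) {
                  return a.depth > b.depth;  // furthest drawn first
              });
}
```

The opaque pass runs first with depth writes on, so the translucent pass inherits a fully populated depth buffer to test against.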

Now, if there are some odd cases where you want a translucent object to occlude what's behind it as if it were 100% opaque (e.g. full depth sorting of all of your triangles is just not reasonable), in that limited case you might want to let that translucent object not only depth test but depth write as well. In the case where your sprite has textured or procedural alpha where some parts are "opaque" and some parts are not, it's only this case where you might want to use discard for transparent or near-transparent fragments. However, beware edge artifacts if you do (particularly when not rendering with supersampling). An alternate approach to consider in this case is alpha-to-coverage.
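As a sketch, a cutout fragment shader for that 1-bit-alpha case might look like the following (GLSL; `spriteTex`, `uv`, and the 0.5 threshold are all illustrative assumptions, not part of any fixed API):

```glsl
#version 330 core

// Cutout ("1-bit alpha") fragment shader sketch. Discarding prevents
// transparent texels from writing color or depth, at the cost of
// disabling early-Z optimizations for this draw call on most GPUs.
uniform sampler2D spriteTex;  // illustrative name

in vec2 uv;
out vec4 fragColor;

void main() {
    vec4 texel = texture(spriteTex, uv);
    if (texel.a < 0.5)  // threshold is an assumption; tune per asset
        discard;        // neither color nor depth is written
    fragColor = texel;
}
```

For the alpha-to-coverage alternative, you would instead enable `GL_SAMPLE_ALPHA_TO_COVERAGE` on a multisampled target and skip the `discard`, letting the alpha value drive per-sample coverage.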

If you want your sprites to fade-out as if they represented 3D geometry, then take a look at soft particles.