You should do all calculations in linear space and only convert on the way in and out: decode when reading from a texture and encode when writing to the framebuffer. Current hardware can handle sRGB textures automatically (no need to do the conversion yourself); further information can be found here.
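The decode/encode the hardware performs is the standard sRGB transfer function (IEC 61966-2-1). As a sketch, with function names of my own choosing:

```python
def srgb_to_linear(c):
    """Decode an sRGB-encoded value in [0, 1] to linear light (IEC 61966-2-1)."""
    if c <= 0.04045:
        return c / 12.92                      # linear segment near black
    return ((c + 0.055) / 1.055) ** 2.4       # power segment

def linear_to_srgb(c):
    """Encode a linear-light value in [0, 1] to sRGB."""
    if c <= 0.0031308:
        return c * 12.92
    return 1.055 * c ** (1 / 2.4) - 0.055
```

Sampling an sRGB texture applies `srgb_to_linear` per channel for you, and writing to an sRGB framebuffer applies `linear_to_srgb`; your shader only ever sees linear values.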

For testing purposes I would render three bars (greyscale 0...100%):

1. bar: linear texture (e.g. TGA)

2. bar: sRGB texture (use sRGB texture format)

3. bar: shader that generates a linear ramp

All bars should look the same, and they should match a corresponding color-ramp image viewed in a browser or image-processing tool.
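The point of bars 1 and 2 is that the same linear ramp is stored two different ways but samples to identical values. A minimal sketch of the pixel values each texture would hold (using the standard IEC 61966-2-1 formulas; variable names are my own):

```python
def srgb_to_linear(c):
    """sRGB decode, as the hardware does when sampling an sRGB texture."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c):
    """sRGB encode, applied once when authoring the sRGB test texture."""
    return c * 12.92 if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

ramp = [i / 10 for i in range(11)]              # linear ramp, 0%..100%

linear_texture = ramp                           # bar 1: stored as-is
srgb_texture = [linear_to_srgb(v) for v in ramp]  # bar 2: stored sRGB-encoded

# Sampling bar 2's sRGB texture decodes automatically, so both bars
# deliver the same linear ramp to the shader:
decoded = [srgb_to_linear(v) for v in srgb_texture]
```

If bar 2 looks darker than the others, the texture was not declared with an sRGB format (so it is being read without decoding); if it looks washed out, it is being decoded twice.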

So when using a higher-precision format like R16G16B16A16, it doesn't matter? (Meaning I don't have to do anything manually during reads/writes?)
I ask because I've only seen the LDR formats offered as separate sRGB variants.

16-bit formats holding linear data have enough precision that sRGB encoding of the data isn't required, which is why you don't see sRGB versions of those formats.

So if you are writing from your shader to a 16-bit float target, you don't need to encode to sRGB and thus don't need to decode; the only time you'll want to go to sRGB is when writing to an 8-bit-per-channel format.
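To see why 8-bit linear needs the sRGB encoding while 16-bit float does not, compare the smallest nonzero value each 8-bit encoding can represent near black (a sketch using the standard IEC 61966-2-1 decode formula):

```python
def srgb_to_linear(c):
    """sRGB decode (IEC 61966-2-1)."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

# Smallest nonzero linear value each 8-bit encoding can hold:
step_linear8 = 1 / 255                 # ~0.0039 linear: coarse in the shadows,
                                       # visible banding in dark gradients
step_srgb8 = srgb_to_linear(1 / 255)   # ~0.0003 linear: ~13x finer near black

# A 16-bit float channel, by contrast, has ~11 bits of mantissa precision
# plus an exponent, so its steps near black are far smaller than either,
# and linear data can be stored directly without sRGB encoding.
```

In other words, sRGB encoding exists to spend the 8 available bits where the eye is most sensitive; once you have 16-bit float precision, that trick is unnecessary.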