Happy to report Oolite ran fine, no frame drops or anything. I don't think my 460GTS video card could quite manage enough bits, as there was some banding on the planets, but the game still looks amazing anyway!

Can confirm that the 460 couldn't set bit depth to 16 bit. The red, green and blue components are all set to 8 bits according to the log. But that's still alright, since the game managed to start up without issues anyway, even though the requested pixel format was not supported. Thanks for testing, Griff.
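
For reference, this fallback is exactly how pixel format negotiation works: we ask for wider color channels when the OpenGL context is created, and the driver is free to hand back something narrower, which is what the log shows happened on the 460. Here is a minimal sketch of the idea, using SDL2 naming purely for illustration (it's not Oolite's actual startup code):

```c
#include <SDL.h>
#include <stdio.h>

int main(void)
{
    SDL_Init(SDL_INIT_VIDEO);

    /* Ask for 16 bits per color component; the driver may not honor it. */
    SDL_GL_SetAttribute(SDL_GL_RED_SIZE,   16);
    SDL_GL_SetAttribute(SDL_GL_GREEN_SIZE, 16);
    SDL_GL_SetAttribute(SDL_GL_BLUE_SIZE,  16);

    SDL_Window *win = SDL_CreateWindow("bpcc test",
                                       SDL_WINDOWPOS_CENTERED,
                                       SDL_WINDOWPOS_CENTERED,
                                       640, 480, SDL_WINDOW_OPENGL);
    SDL_GLContext ctx = SDL_GL_CreateContext(win);

    /* Read back what we actually got - the equivalent of the values
       reported in the log (8/8/8 on the 460). */
    int r, g, b;
    SDL_GL_GetAttribute(SDL_GL_RED_SIZE,   &r);
    SDL_GL_GetAttribute(SDL_GL_GREEN_SIZE, &g);
    SDL_GL_GetAttribute(SDL_GL_BLUE_SIZE,  &b);
    printf("Got R=%d G=%d B=%d bits per component\n", r, g, b);

    SDL_GL_DeleteContext(ctx);
    SDL_DestroyWindow(win);
    SDL_Quit();
    return 0;
}
```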

Managed to test on an NVidia 960. I noticed that the gamma for the 16 bpcc version was a bit raised compared to the standard 8 bpcc exe. I had to turn the 16 bpcc version's gamma down to 0.4 to achieve the same lighting I was getting with gamma set at 1.0 on the 8 bpcc exe. Other than this minor inconvenience (probably more driver related, IMO), it ran really well, with GPU usage never getting above 40% even with sky settings pushed high and with Zygo's Cinematic Skies installed.

We need to get some testing done on Linux now. What I think I'm going to do (assuming the Linux test goes without any major setbacks) is leave the 8 bits per color component setting as it is, with the option for the user to select 16 bits per color component if they so wish, using either the command line (-bpcc16 parameter) or the .GNUstepDefaults key bpcc = <whateverBitDepthYouWant>; (the game should be clever enough to fall back to the closest supported setup if someone gives it a crazy value). So the new feature will initially be fully transparent and activated only upon request. Later, if and when it gets confirmed that this is a good way forward, it could become the default option.
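
For anyone who wants to try it once builds are available, requesting the option would look roughly like this, based on the parameter and defaults key described above (the executable name, defaults file location and domain name are illustrative and depend on your platform and GNUstep setup):

```
# Command line:
oolite -bpcc16

# Or in the .GNUstepDefaults property list, inside Oolite's domain:
oolite = {
    bpcc = 16;
};
```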

It would be nice if we could also find someone with an AMD card to test, just to be sure.

No gamma issues, no performance issues.
Looking at Lave, I didn't manage to see color banding in either the trunk or the 16 bpcc binary, though.
I had to hijack a colleague's laptop, so I didn't have much time to play around with the AMD card settings.

Well, it looks like the AMD 530 did not manage to set a 16-bit bpcc buffer. However... AMD cards by default do something that NVidia cards don't: it's called dithering and it's designed to eliminate banding. This would explain why there were no visible banding issues, even though the color component bit depth was stuck at 8. People have been asking NVidia to enable this on their cards for ages, but NVidia doesn't want to.
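
For those curious about what the driver is doing there, the trick is just to add a tiny sub-LSB random offset to each value before it gets quantized to 8 bits, so the hard steps of a shallow gradient turn into noise the eye averages out. A minimal C sketch of the idea (illustrative only, not AMD's actual driver code):

```c
#include <stdio.h>
#include <stdlib.h>

/* Plain 8-bit quantization: a shallow gradient collapses into long runs
   of identical values, which show up on screen as bands. */
static unsigned char quantize(float v)
{
    int q = (int)(v * 255.0f + 0.5f);
    return (unsigned char)(q < 0 ? 0 : q > 255 ? 255 : q);
}

/* Dithered quantization: add noise of less than one output step before
   rounding, so neighboring pixels alternate between the two nearest
   8-bit values instead of forming a hard edge. */
static unsigned char quantize_dithered(float v)
{
    float noise = ((float)rand() / (float)RAND_MAX - 0.5f) / 255.0f;
    return quantize(v + noise);
}

int main(void)
{
    /* A very shallow gradient: note the run of identical "plain" values
       versus the alternating "dithered" ones. */
    for (int i = 0; i < 16; i++)
    {
        float v = 0.5000f + i * 0.0002f;
        printf("%.4f  plain=%3u  dithered=%3u\n",
               v, quantize(v), quantize_dithered(v));
    }
    return 0;
}
```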

So we should be good in any case on both major card vendors. I think I will make a GitHub pull request to enable 16 bits per color component as an option soon.