Cinema5D have put the Panasonic GH5 to the test with a few charts, and they have some harsh words for the new 10bit mirrorless camera.

I just don’t agree with much of their criticism.

Some highlights from the Cinema5D test:

“Skin-tones are a particular problem (in V-LOG)”

“Something wrong with V-LOG on the GH5”

“Very clear low bit-depth artefacts”

What Cinema5D are seeing is not a serious “problem with bit-depth” as they put it; it’s just compression and a very poor grade. This is perfectly normal, and the Sony FS5 initially had macro-blocking in 10bit far worse than anything I see from the Panasonic GH5.

Secondly, without the LOG gamma curve, the Panasonic GH5 shows a clear improvement thanks to 10bit in the normal colour profiles, with superb skintones and gradation. What is strange is that Cinema5D themselves demonstrated exactly that halfway down their own page, yet still concluded “there is actually little to benefit in shooting 10-bit on the Panasonic GH5 whatsoever. A pity!”
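The gradation benefit is simple arithmetic. A minimal sketch (nothing GH5-specific, just bit-depth maths):

```python
# Why 10bit helps gradation in the normal (non-LOG) profiles:
# 8bit gives 256 code values per colour channel, 10bit gives 1024,
# i.e. 4x finer tonal steps across the same range - which is exactly
# what smooths out skintone gradients.

def code_values(bit_depth: int) -> int:
    """Number of distinct levels per colour channel at a given bit depth."""
    return 2 ** bit_depth

for bits in (8, 10):
    levels = code_values(bits)
    print(f"{bits}bit: {levels} levels per channel, {levels ** 3:,} colours")
```

That fourfold increase in tonal steps is precisely what shows up in the smoother skintones Cinema5D’s own normal-profile frames demonstrate.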

No benefit whatsoever? Really?

I think it is time to stand up for the GH5. Cinema5D were using pre-production hardware. Why publish the test this early?

Shortly after the camera’s release come two major firmware updates, one of which increases the 10bit bitrate to 400Mbit/s using ALL-I compression, which should clean up the very problems Cinema5D wrongly put down to 10bit.

An extremely flat image produced by a LOG gamma curve is itself a form of compression. LOG mashes skintones together, with very little separation between distinct shades, so that when the 150Mbit/s data rate is applied to the final H.264 file, the compression becomes the deciding factor (mushing the tonality and blending blocks of colour together as one). It is NOT a bit-depth issue as Cinema5D say it is. 10bit is 10bit, simple as that. It’s capable of over a billion colours, but if the shades in the frame are so incredibly similar that the H.264 compression cannot tell them apart, how is that a problem for 10bit?
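The squeezing effect is easy to demonstrate with a toy log curve. This is NOT Panasonic’s actual V-LOG formula, and the two “skintone” input values are purely illustrative, but any log-style transfer behaves the same way in the upper midtones:

```python
import math

# A toy log transfer curve (assumed for illustration - not real V-LOG):
# maps linear light 0..1 into 0..1, lifting shadows and squeezing brighter tones.
def log_curve(x: float) -> float:
    return math.log2(1 + 15 * x) / 4  # log2(16) = 4 normalises the output to 1.0

def to_10bit(v: float) -> int:
    """Quantise a 0..1 value to a 10bit code value."""
    return round(v * 1023)

# Two nearby skintone shades in linear light (illustrative values).
a, b = 0.55, 0.60
linear_gap = to_10bit(b) - to_10bit(a)
log_gap = to_10bit(log_curve(b)) - to_10bit(log_curve(a))
print(f"linear gap: {linear_gap} code values, log gap: {log_gap} code values")
# -> linear gap: 51 code values, log gap: 29 code values
```

The same pair of shades ends up far closer together in code values after the log transform, and it is those near-identical neighbouring values that a starved H.264 encoder merges into one block. That is a codec problem, not a bit-depth problem.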

Then their grade pushes the codec too far, yet they are surprised to see problems. A 10bit codec with a LOG curve is not the same thing as 14bit RAW; I wonder if Cinema5D expected the same performance.

To my eye, judging from their own test frames, the $2000 GH5 stands up extremely well against the other cameras in the test, which, aside from the Canon 1D C, are 4 to 10 times more expensive! The Canon C700? Give me a break.

Also unclear from the Cinema5D review is where the HDMI output figures in all of this. Recording 10bit to an external recorder over HDMI would eliminate some of the H.264 issues. The LOG curve and 4:2:2 chroma subsampling would still compress the image, but the so-called ‘uncompressed’ HDMI output is at least stored at a very high bitrate once it hits ProRes on the SSD.