
Video signals have red, green, and blue channels with values that range from 0% to 100%. 0% is the darkest the display can produce (black), and 100% is the brightest. So a signal value of (100%, 100%, 100%), for example, is full white, and (100%, 0%, 0%) is the brightest, most saturated red that the display can produce. The most common digital encoding of video signals is the one defined in ITU recommendations BT.601 and BT.709, and some SMPTE standards that preceded them. The digital encoding uses 8 bits per sample and can represent a black as dark as -7.3% and a white as bright as 109%. Of course, any values outside of 0-100% are meaningless to a display; they just get clamped to the 0-100% range. But cameras will often record values outside of the 0-100% range. Out-of-gamut colors will result in values below 0%, and bright colors can result in values above 100%. Those are the so-called "superwhites", though they needn't be white. You could have (-5%, 105%, -6%), for example, which is a very bright and saturated green.

Canon Log is a little different from a standard video signal because it's scene-referred, not display-referred. It describes the brightness of the image in terms of what the sensor saw instead of what the display should display. But Canon Log still comes out of the camera as a video signal; it just uses the range in a non-standard way. Black in Canon Log is at 7.3%, and the whites go up to 109%. Canon didn't need to use up to 109%. They could have stopped at 100%. It was their choice to go up to 109%. Canon Log differs from a standard video signal in other ways, too: it uses a different opto-electronic transfer function ("gamma curve"), and a weird polynomial transformation that they call the Canon Log Color Matrix. You can plug that Canon Log signal straight into a monitor and view it. You'll see an image, but the colors won't be quite right. Canon Log really needs some transformation before it can be viewed correctly. That transformation may read Canon Log values in the 100% to 109% range.
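To illustrate the shape of that curve, here's a sketch of the original Canon Log transfer function using the logarithmic approximation Canon has published in its white papers. Treat the constants as illustrative rather than authoritative, but note how the black level lands at the 7.3% figure mentioned above:

```python
import math

# Sketch of the original Canon Log opto-electronic transfer function,
# per the log-curve approximation from Canon's published white papers.
# Constants are from that published approximation; this is illustrative,
# not a reference implementation.

def canon_log(x):
    """Scene-linear value x (0 = black) to Canon Log level (1.0 = 100%)."""
    return 0.529136 * math.log10(10.1596 * x + 1.0) + 0.0730597

print(canon_log(0.0))  # ~0.073: black sits at 7.3% of the signal range
print(canon_log(8.0))  # ~1.09: the brightest scene values reach ~109%
```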

There's another common digital encoding of video signals: the full range encoding defined in the H.264 video compression standard. In a full range encoding, 0% is the minimum and 100% is the maximum, so a full range encoding has no superwhites. The BT.601/BT.709 digital encoding is universally supported and far more popular. It's used by virtually every codified video standard, including D1, D5, DV, AVCHD, DVD, Blu-ray, ATSC, and DVB, and every video application that can read video supports it.

The full range encoding is much less popular. In H.264 video, there's a flag that indicates whether the video uses a full range encoding, but the flag is in an optional part of the H.264 specification. Some applications are smart enough to read the flag and properly decode the video, but a lot of applications don't read the flag at all, or support it inconsistently. QuickTime, Final Cut Pro, YouTube, and ffmpeg read the flag. Premiere Pro and DaVinci Resolve read it inconsistently, depending on the platform and file format. Vegas Pro, and I'd venture most other apps, don't read the flag at all.

When you read full range video in an application that doesn't read the flag, the video levels are all wrong: what should be 100% comes out as 109%, and what should be 0% comes out as -7.3%. If you encounter this problem, you can compensate by scaling the RGB values in your video application. It's not a perfect compensation, since the real mismatch in levels happens in the Y, Cb, and Cr channel values, not in the RGB values, but it's close enough for most people.

Professional camcorders almost all avoid the full range problem by using the more common BT.601/BT.709 digital encoding. iPhones and Canon DSLRs use a full range encoding. I think it was a rather poor decision on the part of Apple and Canon to use a full range encoding: they sacrifice a lot of software compatibility for a tiny gain in precision.
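The RGB-domain compensation described above amounts to a linear rescale that maps the misread -7.3% back to 0% and the misread 109% back to 100%. A minimal sketch (the function name and percent-based interface are mine):

```python
# Rough RGB-domain fix for full range video decoded as if it were
# limited range: code 0 was misread as -7.3% and code 255 as 109.1%,
# so rescale by 219/255 and add an offset of 1600/255 (~6.27%).
# As noted above, the exact fix belongs in Y/Cb/Cr; this is approximate.

def fix_levels(wrong_percent):
    """Rescale a misdecoded level so -7.3% -> 0% and 109.1% -> 100%."""
    return wrong_percent * 219.0 / 255.0 + 1600.0 / 255.0

print(fix_levels((0 - 16) / 219 * 100))    # misread black comes back to 0%
print(fix_levels((255 - 16) / 219 * 100))  # misread peak comes back to 100%
```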

Anyway, five years after Canon introduced Canon Log, they decided that having color values above 100% was a problem, since some people don't know how to access those color values. So they changed the C300mk2 firmware to set the full range flag in the video files recorded internally by the camera, even though the camera does not use a full range encoding. For the video applications that know how to read the full range flag, this change has a side effect of bringing the levels below 100%, instead of going up to 109% like Canon Log is supposed to. Canon may have thought they were fixing something, but in reality they were causing a whole bunch of problems. The new Canon Log footage recorded internally by the C300mk2 now won't match footage recorded externally, footage recorded by the same camera with earlier firmware versions, or footage recorded by the C100, C100mk2, C300, or C500. And this difference only appears in the applications that actually read the full range flag.

So now your LUTs or your workflow for Canon Log can change depending on which camera you used, whether the footage was recorded internally or externally, which format you recorded in, whether the C300mk2 was updated to the latest firmware, and how your video application treats full range files for the format you chose. That's the mess Canon has created with this latest change. Before it, everything just worked: you'd get the same levels from every camera in every application.

Thank you so much for that explanation. I might not be able to predict the problem, but I think I will have a better understanding of why it is occurring and where to start to fix it. I also hadn't thought about the fact that I monitor and grade in RGB while the encoding happens in Y, Cb, and Cr. It would be nice if Canon also changed their LUTs and LCD display so that the WFM and interpretation were also pushed down to 100%.

Does anybody know if Premiere is pushing the superwhites to 100% on the C300II? Most of my clients are using Premiere and I would like to know what to tell them ahead of time.

All I would say is, look at your waveform. If there is anything above 100, bring that top part down to under 100.

It's really not that simple. If you see levels above 100%, you know the footage was decoded correctly. But you don't always want to pull the levels below 100%. It's a creative decision that depends on the scene and how it was exposed. Clipping isn't always bad.

And if the levels aren't above 100%, it doesn't tell you anything about whether it was decoded correctly or not. Some scenes just don't have bright parts.

The full range encoding is much less popular. In H.264 video, there's a flag that indicates whether the video uses a full range encoding, but the flag is in an optional part of the H.264 specification. Some applications are smart enough to read the flag and properly decode the video, but a lot of applications don't read the flag at all, or support it inconsistently.

The problem is not just the flag.

For instance, Premiere Pro interprets 8-bit full range video from Panasonic cameras as Rec.601 instead of Rec.709 because Panasonic describes the pixel format as yuvj420p, which was originally based on a Rec.601 specification. The effect is that if you load those videos in Premiere, your colors will be off and hard to correct.
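To see why that matrix mismatch shifts colors, here's a sketch using the standard Rec.601 and Rec.709 luma coefficients (Kr/Kb values are the published constants; the function name and sample pixel are mine). The same Y'CbCr triple decodes to different RGB under the two matrices, except for neutral grays:

```python
# Decode full range Y'CbCr (Y in 0..1, Cb/Cr centered at 0) to R'G'B'
# using a given pair of luma coefficients. Rec.601 uses Kr=0.299,
# Kb=0.114; Rec.709 uses Kr=0.2126, Kb=0.0722. Illustrative only.

def ycbcr_to_rgb(y, cb, cr, kr, kb):
    kg = 1.0 - kr - kb
    r = y + 2 * (1 - kr) * cr
    b = y + 2 * (1 - kb) * cb
    g = (y - kr * r - kb * b) / kg  # from Y = Kr*R + Kg*G + Kb*B
    return r, g, b

pixel = (0.5, 0.1, 0.2)
print(ycbcr_to_rgb(*pixel, kr=0.299, kb=0.114))    # decoded as Rec.601
print(ycbcr_to_rgb(*pixel, kr=0.2126, kb=0.0722))  # decoded as Rec.709
```

A pixel with nonzero Cb/Cr comes out visibly different under the wrong matrix, which is why the colors look off and are hard to correct after the fact.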

cpreston, C-log 2 clips at 92 IRE. Other flavors of C-log clip at different points. I'm most familiar with C-log1 which clips, like any other gamma on the original C cameras, at 110. There are two things to consider here: Where does the camera at your given settings clip highlights while recording, and then if you're delivering to a normal rec709 monitor or TV, how to deliver your final video between 0 and 100 IRE. Yes, you can clip values above 100 if you want. But conversely, if you'd like to pull a bit more detail in the highlights, you can pull down levels below 100.

balazer, my tests were made when Premiere v1.0.6.1.00 was the current version. Back then, Premiere was not interpreting the levels correctly for the 444 modes on the C300/II. I believe it is fixed now with newer versions of Premiere.

ETA: Sorry, meant to say this was happening in the first version of Premiere CC 2017. It seems to be working correctly in the current version of Premiere as of May 24.