FCPX color science - LUTs and EV

While working on some codec stuff, I became curious how the color science actually works in FCPX. I know this tool is not suited for grading, but I miss some basic consistency there.

I have noticed that for a MOV file there are two essential settings which I would like to understand. How are these applied to the data, and what is their purpose? The settings are: "RAW to Log conversion" and "Camera LUT".

Practical test: I have both a DNG still with linear raw data and a compressed raw equivalent of it:

- Having this linear DNG dropped on the timeline, it looks "okay" (as it would at base exposure... the image is dark and contrasty, but unexpectedly saturated*).

- The MOV file then contains a metadata tag that tells the decoder to apply a certain gain (EV correction?), so when the LUT settings are None/None, the result is the look of the DNG plus the exposure compensation applied.

I see no control to adjust EV, neither for DNG nor for MOV (or they are hidden somewhere..? I am not an FCPX user), in order to bring the two clips to the same exposure and be able to compare the eventual saturation difference.
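For reference, an EV correction on linear raw data is simply a power-of-two gain, so a metadata-driven exposure compensation could look like the sketch below (a hypothetical helper for illustration, not FCPX's actual code):

```python
def apply_ev(linear_rgb, ev):
    """Exposure compensation on linear-light data: each stop doubles the signal."""
    gain = 2.0 ** ev
    return [c * gain for c in linear_rgb]

# +1 EV doubles every linear component
print(apply_ev([0.18, 0.18, 0.18], 1.0))  # -> [0.36, 0.36, 0.36]
```

Note this only makes sense on scene-linear data; once a log curve or display gamma has been applied, a plain multiply is no longer an exposure change.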

But originally, when the MOV clip is dropped onto the timeline, both LUTs are preset to Panasonic V-Log, and the output seems to be the exposure-compensated picture tone-mapped to a "random" color space. Yes, the dynamic range fits within the screen's capabilities, but otherwise it does not make sense. Setting one of the LUTs to None while keeping the other makes the image look really bad (too contrasty or too washed out).

My question is: why does setting both to V-Log produce a different look than setting them to None/None? And what math is actually happening here? Unfortunately there is no such setting for the DNG clip, so I cannot "V-Logize" it; it always stays in true/natural color rendering (or wherever it was automagically developed).
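On the math question: a RAW-to-Log conversion is just the camera's log transfer curve applied to linear data, and the Camera LUT then maps the log signal to a display space. As a sketch, here is the V-Log curve with the constants from Panasonic's published V-Log/V-Gamut reference (quoted from memory, so treat the values as approximate):

```python
import math

# V-Log encode: scene-linear reflectance -> V-Log signal (0..1).
# Constants as given in Panasonic's V-Log/V-Gamut paper.
B, C, D, CUT = 0.00873, 0.241514, 0.598206, 0.01

def vlog_encode(x):
    if x < CUT:
        return 5.6 * x + 0.125          # linear toe near black
    return C * math.log10(x + B) + D    # log segment

def vlog_decode(y):
    if y < 5.6 * CUT + 0.125:           # inverse of the toe boundary
        return (y - 0.125) / 5.6
    return 10 ** ((y - D) / C) - B

# 18% grey lands at roughly 42% signal, which is why V-Log looks flat
print(round(vlog_encode(0.18), 3))  # -> 0.423
```

So None/None means the decoder's linear output goes straight to the working space, while V-Log/V-Log should encode to log and immediately map back out; if FCPX's two stages don't invert each other exactly, the two paths will look different.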

*) Saturated: the image of my DNG in FCPX looks the same as in Finder/Preview, likely being developed with some magical setting to just look great. It has a slight EV boost of about 1 stop and definitely increased saturation. When I develop the image myself by straightforward math or with regular tools, it definitely does not look this way.
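For comparison, the "straightforward math" development of a linear DNG is roughly: white balance, a camera-to-Rec.709 matrix, then the display transfer function. A minimal sketch follows; the identity matrix and unity gains are placeholders, since the real values come from the DNG's own tags (AsShotNeutral, ColorMatrix):

```python
def srgb_oetf(x):
    """sRGB transfer function (IEC 61966-2-1)."""
    return 12.92 * x if x <= 0.0031308 else 1.055 * x ** (1 / 2.4) - 0.055

def develop(raw_rgb, wb_gains, cam_to_rec709):
    # 1) white balance in linear light
    r, g, b = (c * w for c, w in zip(raw_rgb, wb_gains))
    # 2) camera-native -> Rec.709 primaries (3x3 matrix from DNG metadata)
    lin = [sum(m * c for m, c in zip(row, (r, g, b))) for row in cam_to_rec709]
    # 3) display transfer function, with clipping
    return [srgb_oetf(min(max(c, 0.0), 1.0)) for c in lin]

# Illustrative only: identity matrix and unity gains as placeholders
I = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
print([round(c, 3) for c in develop([0.18, 0.18, 0.18], (1, 1, 1), I)])  # -> [0.461, 0.461, 0.461]
```

Anything FCPX or Preview adds beyond these three steps (tone curve, saturation boost, baked-in exposure) is what makes their rendering diverge from this plain pipeline.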

I can't really answer your questions, but I will point out two things that came to mind when I read your post. Maybe you already know them and have set up your Library and Project properly in FCP X.

>>But originally when the MOV clip is dropped to the timeline, both LUTs are preset to Panasonic V-log and the output seems to be - the exposure compensated picture tone-mapped to a "random" color space.

If a V-Log LUT is being applied by default, it's either because the software is reading that from the clip's metadata, or, just as likely, the software mysteriously decided to apply that LUT whether or not the footage came from a Varicam.

As for the "random" color space, we need to back up a bit. When you open FCP X, it automatically creates a Library that will contain your rushes. Libraries can be created in either the Rec. 709 color space or Wide Gamut, whatever the hell that is. It sounds bigger than Rec. 709, and it is. Whenever I'm importing rushes shot in Rec. 2020, I use the Wide Gamut setting for the Library and the colors map better. So you need to check whether your Library is in Wide Gamut. Then you need to create a "Project", which is FCP X's language for a timeline. When you create the timeline, you also need to check whether it is in Wide Gamut.

Now, I haven't worked with RAW footage in FCP X, so I cannot help you any more than that, but I would think that both the Library and the Project need to be in the wide color space if you want to judge the results on a 10-bit monitor in a wider color space than 709.
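For what it's worth, the gamut mapping behind that "colors map better" observation is a 3x3 matrix applied in linear light. The BT.709-to-BT.2020 matrix below is the one standardised in ITU-R BT.2087:

```python
# Linear-light BT.709 -> BT.2020 primaries conversion (ITU-R BT.2087)
M_709_TO_2020 = [
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
]

def to_2020(rgb709):
    return [sum(m * c for m, c in zip(row, rgb709)) for row in M_709_TO_2020]

# Pure 709 red sits well inside the 2020 gamut...
print(to_2020([1.0, 0.0, 0.0]))   # -> [0.6274, 0.0691, 0.0164]
# ...while white maps to white (each matrix row sums to ~1)
print(to_2020([1.0, 1.0, 1.0]))
```

If the Library or Project is set to the wrong space, this matrix (or its inverse) is either skipped or applied twice, and the colors drift exactly the way described above.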

>>I see no control to adjust EV

There isn't any in the panel that shows which LUT is or isn't being applied. You'd have to go into the effects panel and initiate a color correction to effect any change on the clip in the timeline.

On 6 Oct 2018, at 22:12, Daniel Rozsnyó <daniel@...> wrote:

>>While working on some codec stuff, I am curious how the color science actually works in FCPX. I know this tool is not suited for grading, but I miss some base consistency there.

Not an FCP user, but if you don't normally use the Mac for this type of thing, there are a couple of points to remember that can catch you out.

1. Modern Macs are usually at or near the P3 colour space; iMacs and MBPs, for example. macOS uses ColorSync to manage profiles, but in the default state the badly named "Colour LCD" profile is in fact a P3 calibration.

2. If the software you are using has no display management, then it is currently pretty pointless on a Mac. PPro will *assume* the screen is sRGB, and when it outputs a full RGB red value, that will show on the P3 screen as a full P3 red and be super saturated (part of your email). You can probably create or use a ColorSync profile to get around this, but I'm sure 99% of people don't.
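To illustrate point 2: with proper display management, an sRGB value is converted into the panel's native P3 coordinates before hitting the screen; without it, the raw code values drive the P3 primaries directly. A sketch using the commonly published linear sRGB-to-Display-P3 matrix (rounded values, and an assumption on my part, not Apple's documented pipeline):

```python
# Linear-light sRGB -> Display P3 matrix (rounded, shared D65 white point).
# A display-managed path sends sRGB "full red" to the panel as roughly
# these P3 coordinates; an unmanaged path sends (1, 0, 0) and lights up
# the panel's much deeper P3 red primary instead.
M_SRGB_TO_P3 = [
    [0.8225, 0.1774, 0.0000],
    [0.0332, 0.9669, 0.0000],
    [0.0171, 0.0724, 0.9108],
]

def managed(rgb_srgb):
    return [sum(m * c for m, c in zip(row, rgb_srgb)) for row in M_SRGB_TO_P3]

print(managed([1.0, 0.0, 0.0]))  # -> [0.8225, 0.0332, 0.0171]
```

The gap between (0.8225, 0.0332, 0.0171) and (1, 0, 0) is exactly the oversaturation described above.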

3. If the software does have display management, you can see this in action. With Resolve, make sure your display output knows it's P3. Whether you're doing the timeline conversion in the new 15 or old-school outputs, you will see the difference immediately. Resolve doesn't do this automagically for you.

4. I have to assume FCP X understands this already, but maybe it doesn't, hence the DNG looking super saturated. In my limited experience, FCP tends to automagic things and gives you the best choices most of the time. But workflows can differ so much, which can catch you out; for example, I don't believe there are any controls for DNG beyond getting the data in and then using the colour panels to modify it. Internally, AFAIK, it's floating-point colour. But you make the point: what colour space is the DNG in? If you look at one in RawViewer, you can see the raw data and metadata, and there should be colour matrix info in there; maybe FCP is using this? This is a classic case of testing your workflow to understand what this particular route is doing to your image. But check out the physical display stuff first: make sure your image is not being mangled by incorrect display management.


>>Macs are usually near or are P3 colourspace
This seems like an odd choice, given that P3 is meant for a reflective screen in a dark environment rather than a transmissive screen in a... "less dark" environment. Any insight into why P3 would be the target for a transmissive display? Seems counterproductive. Or is it more accidental than anything?

I believe Apple has chosen that colour space in order to present a brighter, more saturated UI experience. And I suspect it's the primary choices more than any other part of the P3 standard. iPhones, iPads and so on all use extended RGB (which is Apple's wide colour gamut), and that's basically the P3 primaries. Modern iMacs and MBPs all do the same. IMHO, I can see the difference in colours compared to other devices quite easily. But you have to be mindful of how you present images and video. iPhones are pretty good at knowing whether an image is tagged for extended RGB or not, in which case the OS will handle the colour space conversion. But on a Mac with editing applications, that is a different story. It took me a while to get Resolve looking sane on a new MBP.

Here's an older link, but it contains an image that you can test to see what your own display makes of it: you should be able to see the red image within the red image.

I once had a third party swap my phone screen. It was supposed to be an Apple one, but they 'accidentally' used a different third-party sRGB screen. The difference was immediately obvious to me, reds going orange and so on, because the phone still thought it was a P3 display. Needless to say, that got swapped pretty quickly...

>>And I suspect it's the primary choices more than any other part of the P3 standard.

As in, they're only using the primaries, but with a slightly different white point (P3 specifying D63, the default Apple display D65) and a different gamma (P3 using 2.6)? What a weird, problematic hybrid.

Definitely reinforces the necessity of grading on a properly calibrated external display over SDI rather than relying on your device's display.

Seems to me the choice of P3 primaries and D65 makes perfect sense given Apple's support for HDR and Dolby Vision; anything else would be weird. That does not mean you can't calibrate to other standards such as Rec. 709.

This message was sent by Kevin Shaw of Finalcolor Ltd.
