Awesome lessons!
I have a question
How do I fix the shadows? I'm talking about the 21:02 mark in the lesson about masks.
I understand why it's happening, but is there any quick way to fix it?
I mean, I want the shadows not to disappear when I soften them.

All the color charts shot on different RED cameras that I've ever seen and worked with have a strange matrix inaccuracy in the blues, compared to the same charts shot under the same conditions on an Alexa or other cameras, which all match primary and secondary colors on the vectorscope very well. It doesn't matter whether I use DaVinci RCM, ACES, or REDCINE-X PRO.
So I think the only way to accurately match RED to Alexa is to use color charts. You can even just print a chart: as long as you only use it to match one camera to another, the accuracy of the chart itself doesn't matter. You can do the match manually, or use 3D LUT Creator to build a 3x3 matrix. I think you can use its demo version and then just use the numbers you get.
1. Set Timeline Color Space to Rec709 Scene.
2. Set the correct input color space for the footage.
3. Adjust the ISO and exposure controls to place an 18% gray card at 43.37 IRE (444 on Resolve's waveform). Adjust WB too, of course.
4. Change the timeline color space to LogC AWG.
5. Grab the stills.
6. Use the demo version of 3D LUT Creator's Color Match tool in Matrix mode to match one chart to the other. You'll need to create a custom reference chart from the Alexa still.
7. After matching, go to the Matrix tab and use those numbers in the Paul Dore Matrix plugin.
I'm not sure about LogC for grabbing stills in Resolve. Maybe you should use linear gamma (I mean gamma 1.0, not 'video') with Rec709 primaries, and assign a linear profile to the images in 3D LUT Creator before creating the custom chart and matching, because by default everything in 3D LUT Creator is done in sRGB.
Or you can just match the stills visually using HSL curves in Resolve.
If you can't shoot charts yourself, you can find a lot of RAW footage at Cinematography.net. It's hard to find anything specific there, but there are tons of clips with color charts.
As far as I know, this matrix inaccuracy is consistent across RED cameras, so the footage doesn't have to come from your exact model.
Here is an example of the kind of footage you could use, downloaded from the website I mentioned above. I tried to find a direct link, but no luck.
There are a lot of similar shots; use the ones where the gray card is closest to 43.37 IRE at ISO 800.
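Just to illustrate the matrix part of the workflow above: fitting a 3x3 matrix that maps one camera's chart patches onto another's is a plain least-squares problem. This is only a sketch; the patch values below are made up for illustration, and in practice they would come from the stills you grab in Resolve.

```python
import numpy as np

# Made-up linearized RGB patch samples from the two charts
# (one row per chart patch). Real values would be sampled from
# the RED and Alexa stills grabbed in Resolve, as described above.
red_patches = np.array([
    [0.18, 0.17, 0.20],   # gray card
    [0.45, 0.12, 0.10],   # red patch
    [0.10, 0.35, 0.12],   # green patch
    [0.08, 0.11, 0.40],   # blue patch
    [0.50, 0.45, 0.08],   # yellow patch
    [0.40, 0.10, 0.35],   # magenta patch
])
alexa_patches = np.array([
    [0.18, 0.18, 0.18],
    [0.44, 0.11, 0.09],
    [0.09, 0.36, 0.11],
    [0.07, 0.10, 0.43],
    [0.49, 0.46, 0.07],
    [0.41, 0.09, 0.37],
])

# Solve for X in red_patches @ X ~= alexa_patches (least squares),
# then transpose so rows correspond to output R, G, B.
X, _, _, _ = np.linalg.lstsq(red_patches, alexa_patches, rcond=None)
M = X.T

matched = red_patches @ M.T
print(np.round(M, 3))  # the nine numbers you'd type into a matrix plugin
```

The same nine numbers are what 3D LUT Creator's Matrix tab would hand you, just computed by hand here.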

My fault. I mentioned ANSI contrast to exclude measurements of the many TV panels where it's impossible to turn off auto dimming even in the service menu. ON/OFF contrast can read about 3000:1 there, because auto dimming kicks in on the full-screen black patch.
I was talking about ordinary contact probes, so the neighboring white squares can't affect the black squares as much as they do with non-contact colorimeters.
So I meant just a normal contrast ratio measurement; mentioning ANSI was only to make sure we're not talking about auto dimming.

I think you should try grading a real project. When you have to grade over 100 shots in a day, your grading approach changes a lot.
When you learn color grading on a single shot, you have time to create lots of masks, do roto, and fine-tune everything. But when you can't spend more than a minute on a shot, you start looking at the overall image balance and how it interacts with the neighboring shots.
So you pick a couple of shots from a scene, fix their WB and exposure problems, then build a look on that scene's group using those shots.
Then you just go through all the shots, quickly adjusting WB and exposure. Once you've done that for every scene and shot, if there's still time left, you can spend it on masks, fine-tuning, tracking, etc.
Working full screen and using a color grading panel change a lot too.
Of course this is just one grading approach, and maybe too obvious for you, but I hope someone finds it useful.

Is everybody OK with grading on a monitor with 700-800:1 ANSI contrast? That's a pretty typical ratio (when set to 100 nit) on so-called grading monitors from NEC, BenQ, etc.
I mean, for grading you should have at least a 2000:1 ratio, i.e. black at 0.05 nit or lower.
Yet I've found that, for some reason, I'm more accurate in the blacks on monitors with 700-800:1 contrast than on 1150-1350:1 ones.
I know about viewing angles, IPS, OLED, environment lighting standards, etc. I'm just curious what you think about working on displays that don't reach a 2000:1 ANSI contrast ratio. I'm talking about measured data, not spec-sheet numbers, of course.
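The arithmetic behind those numbers is simple enough to sketch: contrast ratio is just white luminance over black luminance, so at a 100 nit white point the black level required for a given ratio falls straight out.

```python
# Contrast ratio = white luminance / black luminance.
WHITE_NITS = 100.0  # typical SDR grading white level

def black_level(contrast_ratio, white=WHITE_NITS):
    """Black level (nits) needed to reach a given contrast ratio."""
    return white / contrast_ratio

print(black_level(2000))  # 0.05 nit -- the threshold mentioned above
print(black_level(800))   # 0.125 nit -- a typical LCD grading monitor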

I think there are only two ways to do this, i.e. to skip hiring a colorist.
1. All shooting on the planet is done in log, and everything is standardized. NLE apps work in linear gamma and some LMS color space, for example the same one each particular camera uses. The NLE exposes just basic controls like Temp, Tint, Exposure, Saturation, Contrast, Highlight Roll-off, etc. Since everything is in linear gamma and an LMS color space, it's close to RAW settings. An editor just applies a look LUT and then quickly adjusts the exposure and WB of all clips upstream of the LUT or look preset. Since everything works in a mathematically correct way, it's harder to ruin the footage's colors. And no, Lumetri Color isn't even close to this. Placing the Tint and Temp controls AFTER the 'log-to-Rec709 + highlight roll-off LUT' section: could anyone come up with a worse order of operations?
2. Neural networks. ''Select all squares with street signs with a cinematic film look. If there are none, click skip.'' 😃
The best option would be both combined: the AI decides how to adjust those basic parameters I mentioned above.
A human colorist is still needed to create the look presets for the AI.
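The order-of-operations point can be sketched numerically. The tone curve below is a toy stand-in for a log-to-Rec709 + highlight roll-off LUT (a simple Reinhard-style roll-off, not any real LUT); the point is that a one-stop exposure push behaves very differently before versus after it.

```python
import numpy as np

def tone_curve(x):
    """Toy stand-in for a log-to-Rec709 + highlight roll-off LUT
    (a simple Reinhard-style roll-off, purely illustrative)."""
    return x / (1.0 + x)

linear = np.array([0.05, 0.18, 0.9])  # scene-linear pixel values
stop = 1.0                            # +1 stop of exposure

# Correct: exposure applied in scene-linear, BEFORE the display transform.
before = tone_curve(linear * 2**stop)

# Wrong: "exposure" applied to the already tone-mapped image.
after = np.clip(tone_curve(linear) * 2**stop, 0, 1)

print(before)  # highlight stays rolled off (~0.64)
print(after)   # highlight pushed near clipping (~0.95)
```

Same one-stop adjustment, but only the linear-domain version preserves the roll-off the LUT was built to provide.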

It's close to impossible to replicate film LUTs' nonlinear transforms in Resolve.
For example: a more saturated blue becomes a darker cyan, while a less saturated blue becomes a lighter magenta. Bright saturated colors become less saturated but brighter. A low-saturation red becomes a low-saturation orange (closer to skin tone), but only if it was brighter than 15% luminance; more saturated, darker reds become magenta.
These are completely made-up examples, not what film LUTs actually do.
This kind of nonlinear transform can be done in 3D LUT Creator using the A/B and C/L grids or similar tools. I mean the nonlinear transforms themselves, not precise film emulation (those LUTs are built from measurements of real-world film processes).
In Resolve you can use node color spaces and the RGB mixer; I prefer the A/B grid or other tools in third-party software for that.
But often you don't need true film emulation. You can create a 'film look' using just Resolve's controls, and it will be much more forgiving than film emulation LUTs when it comes to noise and similar problems in your footage.
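To make the idea concrete: shifts like the made-up examples above are per-pixel functions of hue, saturation, and value. Here's a minimal sketch in HSV, with hue/saturation thresholds invented purely for illustration; this is emphatically NOT real film emulation.

```python
import colorsys

def filmish_shift(r, g, b):
    """Illustrative hue-dependent transform (NOT real film emulation):
    nudge low-saturation reds toward orange, and push saturated blues
    toward a darker cyan -- loosely following the made-up examples above.
    All thresholds are invented for the sketch."""
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    if s < 0.3 and (h < 0.05 or h > 0.95):   # low-saturation reds
        h = (h + 0.04) % 1.0                 # drift toward orange/skin tone
    elif s > 0.6 and 0.55 < h < 0.72:        # saturated blues
        h -= 0.08                            # toward cyan...
        v *= 0.85                            # ...and darker
    return colorsys.hsv_to_rgb(h, s, v)

print(filmish_shift(0.6, 0.5, 0.5))  # low-sat red drifts warm
print(filmish_shift(0.1, 0.1, 0.8))  # saturated blue becomes darker cyan
```

A 3D LUT is essentially a dense sampling of a function like this one, which is why simple 1D curves and a single matrix can't reproduce it.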

I know that 2.4 is the monitor gamma for broadcast in a dark environment, 2.2 is for a brighter room, etc.
I know what BT.1886 is.
My question is about WHY the default Resolve color space gamma is 2.4, while all the Rec709 LUTs have a 1.90 gamma, which encodes darker than 2.4 or 2.2. And that isn't strange, because it's not a monitor gamma (which would render brighter) but a colorspace gamma, so it comes out darker.
Resolve has a Rec709 (Scene) gamma in the RCM settings, which is similar to, for example, the ARRI Rec709 LUT or all the BMD Rec709 LUTs. As far as I know, those LUTs are 1.90, not 2.2 or 2.4, and certainly not BT.1886, which is only for screens: its gamma depends on the black level. It's 2.4 only if the screen's black level is exactly 0 nits; otherwise it has a lower (brighter) gamma in the shadows.
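The black-level dependence of BT.1886 is easy to show directly from the formula in the spec: L = a * max(v + b, 0)^2.4, with a and b derived from the display's white and black luminance. With a nonzero black level the shadows come out brighter than a pure 2.4 power curve.

```python
def bt1886_eotf(v, lw=100.0, lb=0.1):
    """BT.1886 reference EOTF: L = a * max(v + b, 0) ** 2.4,
    with a and b derived from white (lw) and black (lb) luminance in nits."""
    g = 2.4
    a = (lw ** (1 / g) - lb ** (1 / g)) ** g
    b = lb ** (1 / g) / (lw ** (1 / g) - lb ** (1 / g))
    return a * max(v + b, 0.0) ** g

# With a perfect 0-nit black, BT.1886 collapses to a pure 2.4 power curve;
# with a realistic black level (0.1 nit here) the shadows are lifted.
v = 0.1
pure_24 = 100.0 * v ** 2.4
real = bt1886_eotf(v, lw=100.0, lb=0.1)
print(pure_24, real)  # the real black level gives a brighter shadow value
```

So "BT.1886 = gamma 2.4" is only true for an ideal display; on any real panel the effective shadow gamma is lower.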

Talking about levels, you should definitely run some tests.
For example, when I rendered an MXF OP1A DNxHR 12-bit file with VIDEO levels out of Resolve, Premiere Pro interpreted the levels incorrectly and I got a washed-out picture. When I did the same at 10-bit, again with levels set to VIDEO, Premiere Pro interpreted the levels correctly.
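For anyone wondering what that washed-out look actually is: in 10-bit, video (legal) range puts black at code 64 and white at 940, while full range uses 0-1023. A sketch of the mapping shows why a video-range file read as full range looks lifted and flat.

```python
def full_to_video_10bit(code):
    """Map a full-range 10-bit code (0-1023) to video range (64-940)."""
    return round(64 + code * (940 - 64) / 1023)

def video_to_full_10bit(code):
    """Inverse mapping: video-range 10-bit code back to full range."""
    return round((code - 64) * 1023 / (940 - 64))

# A video-range file wrongly read as full range looks washed out:
# black sits at code 64 instead of 0 and white at 940 instead of 1023,
# so nothing in the image ever reaches true black or true white.
print(full_to_video_10bit(0), full_to_video_10bit(1023))  # 64 940
print(video_to_full_10bit(64), video_to_full_10bit(940))  # 0 1023
```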

I guess density is a printer-lights (offset) control applied to log footage.
It's not a mathematically correct way to change exposure, but it's still close to changing exposure in the RAW settings.
The more correct way is to adjust gain in linear gamma (I mean truly 'linear', i.e. gamma 1.0, not 'video', which is sometimes incorrectly called 'linear' too), but the offset control is what many colorists prefer, for historical reasons.
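The reason offset-on-log behaves so much like exposure: for a pure log encoding, adding an offset in the log domain is exactly a multiplicative gain in linear. Real camera log curves add offsets and a linear toe, which is why it's only "close to" an exposure change rather than identical; the toy curve below is pure log for clarity.

```python
import math

def to_log(x):
    """Toy pure-log encoding in base-2 stops (real camera log curves
    also have scale/offset terms and a linear toe, so this is a sketch)."""
    return math.log2(x)

def from_log(v):
    return 2.0 ** v

linear = 0.18   # 18% gray
offset = 1.0    # printer-lights style offset: +1 stop in the log domain

# Offset applied in the log domain...
result = from_log(to_log(linear) + offset)

# ...comes out identical to a plain 2x gain (one stop) in linear.
print(result, linear * 2.0)  # both 0.36
```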

Why is the default Resolve timeline color space gamma curve 'Gamma 2.4'?
All the Resolve Log-to-Rec709 LUTs, the ACES Rec709 output and so on look like the 'Rec709 (Scene)' gamma curve.
What's the point of the 'Gamma 2.4' curve as the default? I know what gamma 2.4 is, but Resolve also has a 'Gamma 2.2' curve and a 'Rec709 (Scene)' curve.
So we have three curves for Rec709 delivery, and the last one looks like the regular, typical, familiar Rec709 curve. So why is 'Gamma 2.4' the default?
What am I missing?
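For reference, the 'Rec709 (Scene)' curve is the BT.709 camera OETF: a linear segment near black and a 0.45 power above it. Comparing 18% gray under each curve shows why it reads as an overall gamma of roughly 1.9-2.0 rather than 2.4 (the 1.96 below is just an illustrative fit, not an official number).

```python
def rec709_oetf(l):
    """BT.709 scene OETF: 4.5 * L below 0.018, else 1.099 * L^0.45 - 0.099."""
    return 4.5 * l if l < 0.018 else 1.099 * l ** 0.45 - 0.099

# Code value for 18% gray under each encoding.
mid = 0.18
print(rec709_oetf(mid))   # ~0.409  (the 'Rec709 (Scene)' curve)
print(mid ** (1 / 2.4))   # ~0.489  (gamma 2.4 encoding -- brighter code value)
print(mid ** (1 / 1.96))  # ~0.417  (close to the Rec709 OETF)
```

So a ~1.9 encoding really does sit noticeably below (darker than) a 2.4 encoding at mid-gray, which matches the observation above.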