Fusion's Creator-type tools are linear, with sRGB primaries. Just make sure the Frame Format settings in the comp preferences are set to at least 16-bit float (the default is 8-bit integer). I don't recall off the top of my head what the options are in the ACES OCIO config, but you definitely do not want to use sRGB as your source space.
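
The distinction matters because sRGB-encoded and linear values differ by the transfer function. A minimal Python sketch of the standard sRGB decode (the piecewise curve from IEC 61966-2-1):

```python
def srgb_to_linear(c):
    """Decode an sRGB-encoded value (0-1 range) to linear light.
    Piecewise curve from IEC 61966-2-1."""
    if c <= 0.04045:
        return c / 12.92
    return ((c + 0.055) / 1.055) ** 2.4

# Mid-gray in sRGB encoding is much darker in linear light:
print(round(srgb_to_linear(0.5), 4))  # ~0.214
```

This is also why values above 1.0 survive in float but get mangled if anything in the chain assumes an 8-bit sRGB encoding.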

The ACES workflow demands that all photography be converted to ACEScg prior to compositing. Output from renders should be in ACEScg to start with (not all renderers support ACES directly, but you can 'fake it' by preprocessing textures and lighting against the ACEScg plate). Once in Fusion, everything is just math, which is indifferent to your actual color space so long as it's all linear. If you have ACES coming in, you'll have ACES going out.
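
To illustrate the "it's just math" point: moving linear sRGB/Rec.709 values into ACEScg is a single 3x3 matrix multiply. A Python sketch; the coefficients are rounded from the commonly published Rec.709 (D65) to AP1 matrix with Bradford chromatic adaptation, so verify them against your own OCIO config before relying on them:

```python
# Linear Rec.709/sRGB -> ACEScg (AP1), D65 -> D60 Bradford adaptation.
# Rounded from the commonly published matrix -- check against your OCIO config.
REC709_TO_ACESCG = [
    [0.6130973, 0.3395229, 0.0473793],
    [0.0701942, 0.9163556, 0.0134526],
    [0.0206156, 0.1095698, 0.8698151],
]

def rec709_to_acescg(rgb):
    """Apply the 3x3 gamut conversion to a linear RGB triplet."""
    return [sum(m * c for m, c in zip(row, rgb)) for row in REC709_TO_ACESCG]

# A pure Rec.709 red becomes a less saturated triplet in the wider AP1 gamut:
print([round(v, 4) for v in rec709_to_acescg([1.0, 0.0, 0.0])])
```

Note the rows each sum to ~1.0, so neutral values stay neutral through the conversion.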

You should have a buffer LUT converting ACEScg to your monitor's color space.

This is version 9 for Windows. I think I wrote the path to the config file in the System environment variables, and after that there was a huge selection in the Color Space list.
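
For anyone following along, the "System environment" bit refers to the OCIO environment variable, which OCIO-aware applications read at startup to find the config. A sketch for a Unix shell (on Windows you would set the same variable through System Properties); the path below is a placeholder for wherever your ACES config actually lives:

```shell
# Tell OCIO-aware applications (Fusion 9, Nuke, etc.) which config to load.
# Placeholder path -- point it at your actual ACES config.ocio.
export OCIO="$HOME/OpenColorIO-Configs/aces_1.0.3/config.ocio"
echo "$OCIO"
```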

It really is Linear sRGB! Thank you, @Midgardsormr! Now there is no loss above 1.

P.S. In Nuke, there is a color space setting for the current script, so it isn't necessary to assign a color space to each Loader or generator; it does this automatically, which is very convenient. Besides, all the tools that work with color (generators, gradients, the Color Wheel) are adapted to it. It seems we won't see this in Fusion.

While Nuke's method is convenient, it comes with the danger of training compositors who aren't even aware of color space issues. When you inevitably run into something weird, like a client we're working with who likes to send us log EXRs occasionally, Nuke will make an incorrect assumption, and the artist may not understand enough to be able to fix it. I honestly didn't have a clue about handling color conversions until I started working in Fusion, where you have to address it explicitly (excepting the horrid tendency to auto-convert DPX; that tab must die!).

If a user is not aware of what he is doing, defaulting to sRGB versus some other colorspace will not save him one bit. In the log EXR example, how is the user going to get it working without manually changing the input colorspace to what it should be? He has to do it anyway, whatever the defaults are. Defaults are there to reduce the number of dumb clicks, not increase them. If one wants to train compers who are not colorspace-aware, he should use AE.
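
To make the log-EXR example concrete: a log-encoded plate has to be decoded before linear math means anything, and no default will guess that for you. A Python sketch of the classic Cineon decode (reference black 95, reference white 685, 0.6 density gamma); real plates may use different parameters, so this is illustrative only:

```python
def cineon_to_linear(code, ref_black=95, ref_white=685):
    """Decode a 10-bit Cineon log code value to scene-linear.
    Classic Kodak parameters; actual footage may differ."""
    offset = 10 ** ((ref_black - ref_white) * 0.002 / 0.6)
    gain = 1.0 / (1.0 - offset)
    return gain * (10 ** ((code - ref_white) * 0.002 / 0.6) - offset)

print(round(cineon_to_linear(95), 4))    # reference black -> 0.0
print(round(cineon_to_linear(685), 4))   # reference white -> 1.0
```

Code values above reference white decode to linear values above 1.0, which is exactly the highlight detail that gets clipped if the plate is misread as sRGB.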

Regarding generators: they are not adapted in Nuke, and they are not colorspace-aware one bit. They push numbers, and the meaning of those numbers comes from what the working space is.
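
That point can be demonstrated directly: the same RGB triplet lands on a different CIE XYZ (and hence a different actual color) depending on the working space's primaries. A Python sketch using the published RGB-to-XYZ matrices for linear sRGB and ACEScg (AP1), rounded:

```python
# RGB -> CIE XYZ matrices for two working spaces (rounded, published values).
SRGB_TO_XYZ = [
    [0.4124564, 0.3575761, 0.1804375],
    [0.2126729, 0.7151522, 0.0721750],
    [0.0193339, 0.1191920, 0.9503041],
]
AP1_TO_XYZ = [  # ACEScg primaries
    [0.6624542, 0.1340042, 0.1561877],
    [0.2722287, 0.6740818, 0.0536895],
    [-0.0055746, 0.0040607, 1.0103391],
]

def apply_matrix(m, rgb):
    """Multiply a 3x3 matrix by an RGB triplet."""
    return [sum(a * b for a, b in zip(row, rgb)) for row in m]

triplet = [1.0, 0.0, 0.0]  # numerically identical in both working spaces
print([round(v, 4) for v in apply_matrix(SRGB_TO_XYZ, triplet)])  # one color
print([round(v, 4) for v in apply_matrix(AP1_TO_XYZ, triplet)])   # a different one
```

A generator emitting (1, 0, 0) in an sRGB script and in an ACEScg script has produced two different reds; the node itself never knew.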

I very much agree with Midgardsormr et al. I've seen quite a few disasters caused by compositors not realizing what Nuke was doing 'out of sight' in terms of colour space, and I've spent ages tracking down hidden colour operations when they caused problems or I didn't want them to happen. I'm much happier with all the operations visible: it may be a bit more work to learn, but it means you actually know what's going on.

Maybe I'm daft, but I can't see how the knobs in the main panel of the Read node are hidden. What the defaults do in ACES projects is set the default colorspace of the Read node; it's in plain sight, the user can override it at will, and there are no hidden color ops happening anywhere, unless you count the conversion from input space to project working space as hidden. But the user can manage everything manually if he pleases: set all the default spaces to the same space and do all transforms explicitly, including the view transform.

I think the philosophy JPDoc refers to, and that I push, is having a discrete node in the graph that handles the color transform. You don't even have to click on the Read/Loader to see that someone has thought about the color space and done something about it. If someone has done something weird, like put a Gamut after a Loader holding an EXR, you can see it immediately and wonder "what the heck is that about?" Whereas if they've changed something in the Loader, it's not immediately obvious, and you might not learn that there's an issue until the client issues a note that the color space is wrong.

It's not a matter of controls being hidden from someone who wants to use them, but about conversions being hidden from someone who needs to know about them.

Pretty much the same when working with imported 3D Geometry.
Rather put an XF3D after your RootNode to scale the thing up or down than do the scale in the RootNode itself.
Re-import the 3D scene, and nobody will ever know that some TRS was applied afterwards...

Mhm, I understand the idea. It's a matter of practice, I think, and of how things are done at a given facility. Explicit transforms are nice and clear, but IMHO just a bit overkill unless scripts are likely to move from hand to hand and there are some funky colorspace shuffles going on.

I'm primarily a Nuke user myself, but I try to understand the logic of Fu and Fu users, to widen my perspective so to speak.

Well, I can't speak for anyone else's experience, but here at Muse it's not uncommon for two or three artists to all touch a given comp. And we work in TV, so funky colorspace shuffles are more common than not!

In any case, the whole point of a nodal system is to create visual feedback about what's going on. The more operations you bundle into a single node, the less readable the comp is. I say down with transform controls in the Merge node, too! I can't even count the number of times that's baffled me, even when trying to troubleshoot my own comps.