That's subjective, so I can't give a definitive answer, but I'd say rendering at 1280x720 with higher details would look better than native 1920x1080 with lower details, since 1280x720 scales to 1920x1080 by an even 1.5x in each dimension.
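To spell out the arithmetic behind that "even 1.5x" claim (a quick sanity check, nothing game-specific):

```python
# 1280x720 upscaled to 1920x1080 uses the same clean factor on both axes
src_w, src_h = 1280, 720
dst_w, dst_h = 1920, 1080

print(dst_w / src_w, dst_h / src_h)  # 1.5 1.5
```

An integer-and-a-half factor like this tends to scale more cleanly than an awkward ratio (say 900p to 1080p, which is 1.2x).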

Yeah, render resolution sliders like these let the game draw the 3D scene at a lower resolution and then upscale it (using various methods) while keeping UI elements (crosshairs, health bars, on-screen text) at native resolution, rather than running the entire thing at a lower resolution and possibly making those UI elements illegible.
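The idea boils down to deriving a smaller internal render target from a slider value while the UI stays native. A minimal sketch (the function name and rounding choice are my own assumptions, games expose this in all sorts of ways):

```python
def internal_resolution(native_w, native_h, render_scale):
    """Hypothetical render-scale helper: the 3D scene is drawn at the
    returned size and upscaled, while UI draws at native_w x native_h."""
    return round(native_w * render_scale), round(native_h * render_scale)

# e.g. a 75% slider on a 1080p display
print(internal_resolution(1920, 1080, 0.75))  # (1440, 810)
```

Only the expensive 3D pass shrinks, so you get most of the performance win without blurring text and HUD elements.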

Loads of console games do this. It's especially handy there because most TVs that consoles are plugged into have terrible scaling, introducing artifacts, increased input latency, or both.

COD WW2 does have a temporal upscaler built in (enabled by setting in-game antialiasing to SMAA T2X Filmic), which is arguably better than simply dropping the resolution because you're not throwing away much image quality to get the extra performance. The resolution option is labelled "pre-T2X resolution". It works by rendering only some of the pixels in each frame and then blending the results across frames to create full-resolution images, similar to checkerboard rendering, although COD's implementation only alternates columns, not rows, so it isn't strictly speaking checkerboarding. That's how the console versions manage to output 1080p (900p on Xbox One, as I understand it) at 60fps without too much trouble in that game.
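The column-alternating part can be illustrated with a toy model (this is my own simplified sketch of the general technique, not COD's actual implementation, and it ignores the motion reprojection a real temporal upscaler needs for moving scenes): each frame renders only half the columns at full height, and the other half is reused from the previous frame to reconstruct a full-resolution image.

```python
import numpy as np

H, W = 4, 8
scene = np.arange(H * W).reshape(H, W)  # stand-in for a full-res static frame

frame_even = scene[:, 0::2]  # even columns rendered this frame (half the pixels)
frame_odd = scene[:, 1::2]   # odd columns carried over from the previous frame

# Interleave the two half-width buffers back into a full-width image
recon = np.empty((H, W), dtype=scene.dtype)
recon[:, 0::2] = frame_even
recon[:, 1::2] = frame_odd

assert (recon == scene).all()  # on a static scene the blend is lossless
```

Each frame only shades about half the pixels, which is where the performance headroom for 60fps comes from; the hard engineering is in handling motion between frames, which this static example sidesteps.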