can someone explain to me a little bit about how to avoid gamma shifts within different OS and players?

I worked on an H.264 source project and exported the final project to DNxHD 8-bit in a MOV container on Windows. All the players (Windows Media Player, QuickTime, VLC) show different gamma interpretations, with Windows Media Player being the closest to what Resolve showed me. Could I avoid the different gamma interpretations if I went to a 10-bit export? How can I make sure that the client will see what I see, assuming his monitor is calibrated?

This is such an annoying problem, and it bothered me too. There is so much information online but no clarity... In the end, I found I should just leave the video levels setting on the Deliver page at "Auto" (not Video or Data) and open the exported clip in VLC to get an idea of what it really looks like. This has worked well for me so far: the exports closely resemble what I see in DaVinci, which is good.

It's usually the video player or graphics card setting that causes the gamma shift. I have an NVIDIA card, and I go to the NVIDIA Control Panel and change the output range from full to limited/video as needed to display the proper gamma of the clip when viewing with a Windows video player.
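To make the full-versus-limited distinction concrete, here is a rough sketch of the 8-bit level mapping involved (this is the standard Rec.709 video-range math, not anything NVIDIA-specific): limited/video range puts black at 16 and white at 235, while full range uses 0-255. When the player and the display disagree about which range is in use, blacks and whites land at the wrong levels, which reads on screen like a gamma shift.

```python
# Toy illustration of 8-bit full-range vs limited/video-range levels.
# Limited (video) range: black = 16, white = 235; full range: 0-255.

def full_to_limited(v: int) -> int:
    """Map a full-range code value (0-255) into limited/video range (16-235)."""
    return round(16 + v * 219 / 255)

def limited_to_full(v: int) -> int:
    """Map a limited-range code value (16-235) back to full range (0-255)."""
    return round((v - 16) * 255 / 219)

# A correct round trip preserves the picture:
print(full_to_limited(0), full_to_limited(255))   # black -> 16, white -> 235

# A mismatch does not: if limited-range data is shown as-is on a
# full-range display, black is lifted to 16/255 (washed out) and
# white is clipped short of peak -- the "wrong gamma" look.
```

This is why toggling the range setting in the driver can "fix" the gamma: it is not really a gamma problem but a levels mismatch.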

On the Mac, it is very confusing. The OS version makes a difference: older OS versions expect a gamma of 1.8, while the newer ones expect a gamma of 2.2. QuickTime 7 expects gamma 1.8, unless you set the preferences for DCP compatibility, in which case it displays at gamma 2.2.

QuickTime 10, by measurement, appears to split the difference, displaying at a measured gamma of about 2.0.
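A quick bit of arithmetic shows why the 1.8 vs. 2.2 split is so visible. Treating the display as a pure power law (a simplification; real pipelines use Rec.709/sRGB transfer curves with a linear toe), the same encoded pixel value produces noticeably more light at gamma 1.8 than at 2.2:

```python
# Toy model: light output of an 8-bit code value on a display with a
# pure power-law gamma. Lower display gamma -> brighter midtones.

def displayed_luminance(code: int, display_gamma: float) -> float:
    """Relative linear light for an 8-bit code value (0-255)."""
    return (code / 255) ** display_gamma

for g in (1.8, 2.0, 2.2):
    print(f"gamma {g}: mid-grey 128 -> {displayed_luminance(128, g):.3f}")
```

Mid-grey comes out brightest at gamma 1.8 and darkest at 2.2, with 2.0 in between, which matches the observation that the same clip looks lighter or darker depending on which gamma the player assumes.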

I'm sure this just scratches the surface in terms of what Mac applications do. CatDV, in its latest incarnation, appears OK. Others were unusually dark.