This seems really simple to get around on Linux if you just remove the modeline or disable EDID, like I already have to do.

Unfortunately it's not that easy. I have always had EDID disabled, but now the driver simply wants to mess with my refresh rates every time something 3D starts up, even setting both monitors to 100 Hz when one of them doesn't support it....

I can go into nvidia-settings and force the second monitor back to 68 Hz after starting Minecraft without running into any problems, so the driver does still support independent refresh rates per screen....
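That manual fix can also be scripted rather than done through the GUI. A rough sketch, assuming (hypothetically) the two outputs are DVI-I-2 and DVI-I-3 and that mode names of this form exist — check `xrandr` and the nvidia-settings display page for the real names on your setup:

```
# Reassert per-screen refresh rates after a game has changed them.
# Output names, mode names, and offsets below are assumptions.
nvidia-settings --assign CurrentMetaMode="DVI-I-2: 2560x1440_100 +0+0, DVI-I-3: 1920x1200_68 +2560+0"
```

Binding that to a hotkey or running it after launching the game at least undoes the unwanted switch, even if it doesn't prevent it.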

So NVIDIA, how do I get the new 300 series driver to stop 'conveniently' managing my refresh rates whenever it feels like it?

This is what I actually have right now, but when I start Minecraft or anything else that uses OpenGL, it switches the second display to 100.4 Hz. Is there a way to actually remove the 100.4 Hz mode from DVI-I-3? That would solve this, I think....

Adding modes through RandR is not supported yet, but you should still be able to add them through xorg.conf. The driver still provides a lot of detail about where various modes come from if you start X with the -logverbose 6 option or turn on the ModeDebug option. Check /var/log/Xorg.0.log to see where your unwanted modes are coming from, and then you should be able to use the various ModeValidation options to turn off implicit modes, such as NoXServerModes, NoVesaModes, NoPredefinedModes, and NoEdidModes.
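Putting those options together, a minimal Device section might look like the sketch below. The identifier is a placeholder, and the ModeValidation list should be trimmed to whichever implicit mode sources the verbose log shows are actually injecting the unwanted modes:

```
Section "Device"
    Identifier "nvidia"
    Driver     "nvidia"
    # Log where each mode comes from; inspect /var/log/Xorg.0.log afterwards
    Option     "ModeDebug" "true"
    # Reject implicit modes so only explicitly listed modelines survive
    Option     "ModeValidation" "NoXServerModes, NoVesaModes, NoPredefinedModes, NoEdidModes"
EndSection
```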


Thank you for the reply. That makes sense if it's not yet supported.

The idea is that I want a different refresh rate for each head, which I can do just fine through xorg.conf. The problem is that RandR then also advertises both of those modes for both of my monitors, which I do not want; I need to pin each monitor to a specific refresh rate.

Minecraft, for example, somehow sets the mode via RandR, which causes my secondary monitor to get set to the same refresh rate as my first monitor. If I were able to remove the incorrect refresh rate from the secondary monitor's RandR mode list that would be fine, but that isn't possible.

It would be possible if we could set custom modes through RandR, since then we could force a specific modeline onto a specific screen and no longer use xorg.conf to set modes. The other option, of course, is to disable RandR mode switching entirely until this is supported. (Games could no longer change modes, which is fine by me, as I don't run anything fullscreen anyhow.)
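For comparison, on drivers that do implement RandR 1.2 mode management, the workflow being asked for would look roughly like this. The output names match the ones mentioned above, but the mode names and modeline timings are purely illustrative (e.g. generated with `cvt`), and as discussed, the NVIDIA driver does not accept these yet:

```
# Create a 68 Hz modeline (timings illustrative, e.g. from `cvt 1920 1200 68`)
xrandr --newmode "1920x1200_68" 215.25 1920 2056 2256 2592 1200 1203 1209 1235 -hsync +vsync
# Attach it only to the secondary output
xrandr --addmode DVI-I-3 "1920x1200_68"
# Remove the unwanted 100.4 Hz mode from that output's list
xrandr --delmode DVI-I-3 "2560x1440_100.4"
```

With that, each output would advertise only the modes it should run, and a game changing modes through RandR could no longer drag the second monitor to the wrong refresh rate.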


Care to comment on > 400 MHz pixel clocks on GeForce 600 series hardware?

I have this Catleap monitor as well; it ran fine at 2560x1440 @ 103 Hz on a GTX 460. I purchased the 680 purely to get 120 Hz, only to hit a wall because of the driver? It is well established that it can be made to run at 120+ Hz on Windows.