Sorry for the insanely late reply. I was hesitating over whether to reply at all, but since I discovered that I was just using VSR, I think the time to comment has come.

If I understand monitors correctly, not having a scaler just means that if the input isn't the native resolution of the screen, it'll just nope out the input and display absolutely nothing.

My Cinema Display handles only 1600x1024. If Windows ***** itself and drops to 640x480, it's impossible to get it back to the normal resolution using this monitor. Also, since the IGP is automatically deactivated while a dGPU is connected, it's impossible to reinstall GPU drivers by just ramming something into the VGA output of the board (the Fury doesn't output any analogue signal; its DVI is DVI-D, not DVI-I, and I haven't stumbled upon a DP-to-VGA converter in my local computer shop yet).

I've run The Witcher 3 on a 560 in a ThinkStation with a single Xeon E5420 (a downclocked C2Q Q9450), and I think that if you overclock your card and disable all the unnecessary stuff, you should be good to go. Also, you've got great friends giving you cards :D

Yes it is! The S.C. Magi System-01 (the ORIGINAL, of course) that gets hacked by Ireul in the series and by the other Magis in the "End". All my computers are named following this scheme: desktops are Melchior-[Greek letter], workstations/servers are Balthasar-[Greek letter], and laptops are Casper-[Greek letter].

Of course I changed my System panel so it'd have NERV and Magi stuff in it, and when I'm running Linux, this is my SSH greeting banner.

One single stick of RAM isn't really a good idea. You could put two sticks in there and enjoy dual channel: doubled memory bandwidth. You could maybe take a 480 if you plan to keep your GPU for a really long time and enjoy DX12, but if you play "nvidia optimized titles" (like Project CARS), then stay with the 1060; it's a really good card.
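To put a rough number on the dual-channel point, here's a back-of-the-envelope sketch. The DDR3-1600 speed is purely an illustrative assumption (the post doesn't say which RAM is involved); the formula is just transfer rate times the 8-byte width of a 64-bit channel times the channel count.

```python
# Theoretical peak memory bandwidth in GB/s.
# transfer_mt_s: transfer rate in MT/s; channels: 1 (single) or 2 (dual).
# Each channel is 64 bits = 8 bytes wide.
def bandwidth_gb_s(transfer_mt_s, channels):
    return transfer_mt_s * 8 * channels / 1000

single = bandwidth_gb_s(1600, 1)  # one stick: single channel
dual   = bandwidth_gb_s(1600, 2)  # two sticks: dual channel
print(single, dual)  # 12.8 25.6
```

Real-world gains are smaller than the theoretical doubling, but in iGPU and memory-bound workloads the difference is very noticeable.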

I hope I didn't trigger you by using an old *** Apple monitor, running Windows because of the lack of games and bad AMD driver support on Linux (I like myself some Debian and Gentoo), and displaying a wallpaper associated with meme music culture.

I'll admit I only put this wallpaper up because I thought it looked nice, not out of any particular affection for the whole vaporwave culture.

I'm using an Apple DVI to ADC converter. Since those displays don't have a PSU of their own but get their power through the ADC cable, this box is basically a power supply that takes the DVI signal, adds power and USB to it (the display has a hub, USB 1.1 I presume), stirs the thing up and sends it through the ADC cable to the monitor. It also seems to have some sort of ROM to remember settings, since some people can change the brightness of those monitors on one computer, then plug them into another computer and keep the same brightness level.

It's actually in a pull configuration: the airflow runs in the same direction for both the CPU fan and the exhaust fan. I thought about replacing the exhaust fan with one of those Corsair ML120s and removing the CPU fan to fit a Thermalright air duct, so it'd be two birds with one stone.

I understand the fear, but on paper it should be OK: a single Molex connector is rated for 131 W, so two should be more than enough to supply the 150 W that the spec requires of a PCIe 8-pin connector.
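For what it's worth, the budget works out with plenty of margin. This is just a sanity check using the figures above (131 W per Molex connector, 150 W for the 8-pin spec), not a statement about any particular adapter's build quality:

```python
# Power-budget sanity check for a dual-Molex -> PCIe 8-pin adapter.
# Figures taken from the post: 131 W rated per Molex connector,
# 150 W required by the PCIe 8-pin specification.
MOLEX_RATING_W = 131
PCIE_8PIN_W = 150

available = 2 * MOLEX_RATING_W        # two Molex feeding the adapter
headroom = available - PCIE_8PIN_W    # margin above the 8-pin requirement

print(f"available: {available} W, headroom: {headroom} W")
# available: 262 W, headroom: 112 W
```

The practical caveat is that both Molex plugs should come from separate strands of the PSU harness, so the current isn't all pulled through one wire run.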