As much as I love the Mac environment and appreciate having a real UNIX operating system underpinning it, I think it's time for me to make the move to Windows for my editing. While Windows 8.1 is still a sucky Windows-based OS, it appears as though Microsoft is starting to somewhat see the light: they're planning to include OpenSSH support in a future version of PowerShell. For those who don't log into machines remotely via the command line, OpenSSH isn't going to matter. But I do, and I require it. This is a good step by MS.

Since the vast majority of my footage is sourced from my Windows gaming rig, I might as well use that for my editing too. I understand pros would suggest not doing that, instead favoring a dedicated machine for editing. I think the advantages of combining the machines (financial, among others!) outweigh the disadvantages. So with that, here's my proposal:

My current gaming rig has an Asus X79 motherboard with an overclocked 4930K (4.5GHz) and two EVGA Superclocked Titan video cards (700-series). The board will be replaced with an Asus X99 model, and the CPU with an 8-core 5960X, which I'll OC to (hopefully) 4.5GHz. The Titans will stay; while they're "old news" given their 700-series GPUs, they're still a pair of powerhouses and let me play my games at 1440p and 144fps. Memory will be replaced as well, bumped to 64GB.

For storage, I'll tap the Asus board's on-board M.2 slot for the C: drive, and load only the OS, games, and Adobe apps onto it. A half-TB M.2 drive should be more than sufficient. I have a pair of spinners currently striped as D:, which my gaming rig records its gameplay footage to. Those will stay. The various SSDs in my current Mac's external Thunderbolt 2 enclosures will be transferred to the gaming rig's case and connected to the new MoBo. Several striped volumes will be created for input media, scratch space, and output space. The X99 board has 10 SATA3 ports, so I should have plenty of room.

I'll combine all of that with the plug-in someone developed for H.264 hardware encoding, to help speed up my YouTube video exports.

My current 8-core Mac Pro 6,1 is a reasonably quick machine. But it's limited to a little over 3GHz in full turbo mode. This proposed system will have 16 vcores of 4.5GHz goodness and that should just squish the Mac from a performance perspective. And with that, the Mac Pro is for sale. If you know of anyone interested, send them my way.

if you still want a dedicated editing rig and a separate gaming rig, you can look into dual-booting 2 installations of windows. the main reason people keep them separate is the software, as it all ends up in the registry and can make a mess of it, so people with an editing rig keep the installed software to the minimum required programs. you can also image each windows installation, so if there is a problem with just one install, you can restore the image and be up and running in no time.

It actually doesn't take that long to re-install Windows these days, so I'm less concerned about that aspect of it. I can have Windows 8.1 and my games re-installed on my gaming rig in very little time. The longest part of it is the game download.

One thing that did just dawn on me is SLI vs. not. I make extensive use of SLI while gaming, as it's hugely advantageous to do so. I know folks with multiple Nvidia cards are discouraged from using SLI when it comes to Pr and AME. My question: does it actually cause problems? Slowdowns? Any real issues? Or is it just not tested enough? I do understand that SLI reduces the usable VRAM to whatever is on one card. Since the Titans both have 6GB of VRAM, that shouldn't be as much of an issue. Are there other things with respect to SLI that might cause problems with the Adobe software?

SLI caused problems with all Premiere Pro releases prior to a certain point (around CS6 vintage?), and then (I think when render outputs started supporting dual GTX cards) that conflict was fixed. Although I have not tested SLI personally, I can't recall anyone complaining about this conflict since CC was released, years ago now.

I'm so exceptionally happy with myself: I took a perfectly good, working X79/2011 Asus system and replaced it with what is apparently a DOA X99/2011-v3 Asus system. The motherboard refuses to power up at all, even though it clearly sees the juice from the power supply. Even with the "power switch" pins hard-jumpered, it won't boot at all.

This is certainly a first for me; I've purchased countless Asus boards in the past and never had a single problem with any of them.

well that sucks. asus and gigabyte have the lowest failure rates, but it still happens.

New evidence seems to be pointing at the 1000W CoolerMaster power supply, which is strange since it was working just fine with the previous Asus board. And CM makes some very good, nearly invulnerable power supplies.

As mentioned previously, I had the motherboard's power switch pins hard-jumpered. That should have forced the board and PS on, but it didn't. I just now popped the ATX connector off the board and jumpered the green wire to a ground. That also should have forced the PS on; it just wouldn't have been able to supply power to the Asus. Disks, fans, water cooling... anything directly connected to the PS should have spun up.

Nada. Nyet. Nothing. Deadski.

I'm loath to believe that I killed the PS by merely swapping the boards out, but I suppose anything's possible. And swapping the supply will be a hell of a lot easier than swapping the board at this point, with all of the water cooling re-attached, etc.

if you still have your old mbd setup, you could test the psu on it, to see if it is dead. if you dont have a spare case, just place the mbd on a cardboard box so it wont short.

almost no vendor actually makes their own power supplies; cooler master, corsair, nzxt, evga all use oem's to mfg their power supplies. the oem's vary from model to model as well, so some models are good, some not so good. that makes buying a good psu a bit tricky; you cant just go with a certain brand like corsair or cooler master.

I have the system (water cooling radiator and all) disassembled on my dining room table (yay for being single!) No matter what I do to the system with that 1000W power supply, it just will not start. However, I have a (much) older 500W power supply that I connected to the motherboard. It was able to spin stuff up appropriately; I think my culprit is the PS.

Ground protection. Ha. I'm apparently an idiot: one of the SATA power leads from the PS had somehow grounded itself. Hopefully it didn't fry the SSDs it's connected to! Since the PS has modular cables, I was able to disconnect it at the PS end. As soon as I did that, the PS spun right up. When I reconnected it, the PS immediately died.

For future reference, you can get digital PSU testers from retail or etailer outlets for around $20. They're invaluable for testing PSUs. I would get one if you build your own systems.

See, that's interesting. I've heard about those from a few others, but no one's ever said anything good about them. Basically that their diagnostics are relatively limited; that's why I haven't bothered. But, $20 ain't much, so thanks for the suggestion. Amazon to the rescue (again).

Now to wait for EVGA to get those Hydro Copper 980TI cards shipping... Or just nut up and get a pair of Hydro Copper Titan Xs.

If you look at the manual and use them right, they are accurate enough, and they find bad PSUs you will not see otherwise. Occasionally they won't pick up a bad PSU if it's an intermittent issue. However, most of the time they will if you test the PSU ports the correct way, which means testing each set specifically. They are far more accurate than not having one at all, so I don't understand why people don't use them.

Curiously: the GPU in slot 1 is supposedly running at x8 speed versus x16. The second GPU is running at x16 as it should. The CPU should be able to run both at full speed, so either the motherboard is FUBAR'd, I've got something set improperly, or I've managed to damage one of the GPUs during the work.

I do have an M.2 drive attached to the motherboard, but it's only supposed to compete with PCI slot 5 for bandwidth, which I've left empty. I also have a Sound Blaster Recon3D x1 card in one of the extra full-sized slots, but it's pulling so little bandwidth it shouldn't be affecting anything.

Unless you're doing something like running Octane Render (like me) AND have tons of high-poly assets with lots of high-rez textures, the 12 gigs of vram on the titan x isn't going to be that much more advantageous in gaming, and it's definitely a money waster for Adobe apps.

With the 980ti you'll get 95% of the performance for like 60% of the price. I have a titan x, and it's a beast no doubt, but if I wasn't running octane render I would not have purchased it and got something much less expensive.

Oh, I'm well aware of all of that. However, I'm also thinking of tomorrow's games and what will happen in a couple of weeks when Windows 10 is released. DX12, combined with a pair of either of the aforementioned cards in SLI, will effectively double the available VRAM. So the 980 Ti's 6GB becomes 12, and the Titan X's 12GB becomes 24. It makes a boy drool.
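For what it's worth, here's the VRAM math I'm counting on, sketched in Python. The big assumption: the game has to use DX12 explicit multi-adapter memory pooling rather than classic mirrored SLI, and titles have to be written for it.

```python
# Under classic SLI, frame data is mirrored across cards, so usable
# VRAM is just one card's VRAM. Under DX12 explicit multi-adapter
# (in titles written for it), each card's memory can be addressed
# separately, so the usable pool is the sum across cards.

def usable_vram_gb(per_card_gb: int, cards: int, dx12_pooling: bool) -> int:
    """Usable VRAM in GB for a multi-GPU setup."""
    return per_card_gb * cards if dx12_pooling else per_card_gb

print(usable_vram_gb(6, 2, False))  # 980 Ti pair, mirrored SLI: 6
print(usable_vram_gb(6, 2, True))   # 980 Ti pair, pooled: 12
print(usable_vram_gb(12, 2, True))  # Titan X pair, pooled: 24
```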

It looks like 4.4GHz. I've been experimenting with my new CPU, and it appears stability is achieved with a core voltage of 1.3V, a 100MHz base clock, and a 44x multiplier. I kicked around 100 x 45 as well as 125 x 36; neither would run for very long before Windows would go Tango Uniform. So it appears I got the lower end of the run of chips. Oh well.
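The clock math is just base clock times multiplier; a quick sketch of the combinations I tried (values from above):

```python
# Effective CPU clock = base clock (BCLK) x multiplier.

def effective_clock_ghz(bclk_mhz: float, multiplier: int) -> float:
    """Return the effective core clock in GHz."""
    return bclk_mhz * multiplier / 1000

print(effective_clock_ghz(100, 44))  # stable setting: 4.4 GHz
print(effective_clock_ghz(100, 45))  # unstable attempt: 4.5 GHz
print(effective_clock_ghz(125, 36))  # unstable attempt: 4.5 GHz
```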