The machine is used for this because I'm a TV/Digital Media Major and have very demanding projects which are shot/edited/produced in high definition. Because I don't live on campus, I can't use the editing labs and hardware, so I own most of the gear I need for getting work done. Building this computer put the finishing touches on that, and allows me to work from home completely.

The build was challenging in more ways than one. I did not want a massive case, I did not want something flashy or ugly, and I wanted something as small as I could get it. I most certainly got just that.

For people familiar with the overall gear, the SD1283 is 159mm tall and 50mm wide. The Asus Z9PA has LGA2011 sockets, which have a mounting hole spacing of 80mm. This meant I had to choose heatsinks shallow enough to fit a fan on each one, as I expect the heat output to require a significant amount of cooling. I chose the fans I did for the smaller hub, quiet operation, and the looks. Despite not having a side window (one wouldn't fit anyway), I wanted the inside to look good. The case just fits the SD1283 heatsinks; the available height for coolers is probably 160mm total from the socket mount to the inside edge of the side panel. The biggest issue I found during the install was with the 10" wide motherboard. It extended over a raised part of the motherboard tray which I did not trust, so I placed a strip of thick gaffer's tape over that area as an insulating layer to prevent any shorts from the components and/or component leads sticking through the board. The machine booted right up the first time I pushed the power button, so I must be doing something right.

Cable routing in this case proved to be easier than I expected. Behind the motherboard tray there is more than adequate space for hiding cables. The Antec HCG is not a modular power supply, the cables are long, and worst of all, they're bundled terribly. The ATX power cable is quite large and difficult to squeeze into places, and the 6-pin PCI-e connectors are extremely stiff. There are a large number of cables I didn't need, and had this been even a partially modular power supply, it would have been easier to work with. While I do not want to switch out the power supply now that the cables are routed, the idea of switching to my Antec True Power New 750 is tempting.

Mounting the fans proved to be somewhat complicated. I used the soft-mount system from the Noiseblockers on the two case fans I left in, but since I only decided to do this once I was nearly done with the build, there wasn't much space to work with. That aside, while I'd rather use normal soft mounts, this bolt-through kit seems to do the job fine. I do have a Nexus soft-mounting kit for fans, so I think I'll be changing to that soon; I simply didn't have it available while I was doing the build.

The configuration is definitely not set in stone, however. The upper 140mm fan is the loudest of the bunch, and though it's quieter than most fans, I can hear it when the room is quiet, and I don't like that. I have sealed off several of the side, bottom and upper vents to promote front-to-back airflow, but I would definitely say this machine could be quieter at idle. I'm tempted to remove the upper 140mm fan and block off the vent. I also have plans to change the PCI slot covers from the vented ones I have installed (different from stock) to solid ones.

Changing the HR-05 to an HR-05 IFX will allow for a dual-slot GPU, or at least a better cooler, to be installed, which is a must. I had planned on using my GTX260, as it was moderately quiet for normal use, but I didn't realize how little clearance I was dealing with until I fit everything together. There is more space between the Xigmatek coolers than I had accounted for, so that's great news.

Some of the more nitty gritty on component choices:

I had gone to none other than pm.stacker to ask his opinion on parts, and to say he knows dual socket boards is an understatement. Although he pointed me towards the Asus Z9PA-D8C, no vendors had it in stock or could ship it to me in any reasonable amount of time. Considering that a week from now I will have already started classes, I needed to get the machine running to work out the kinks. I'm already bumping into some software problems, but I'm working through just about all of them. The USB3.0 card was chosen for its front panel header. Since the Asus motherboard only has two rear USB3.0 ports and no headers for it, I wanted to connect the front panel USB3.0 to a USB3.0 controller. If it's there, I want to be able to use it (if I need it, of course). The front panel audio is the exception, as I don't need it. I have a ProFire 2626, and it sits just inches away from my keyboard, so I don't need to reach under the desk for audio plugs; my main audio outs are all on the ProFire front panel. The Firewire card worked OOTB, as did the USB3.0 card. I didn't even put the driver CDs in, it just worked right away. I'll be testing transfer speeds soon to make sure everything performs as I need it to.

The case ended up being spectacular. For a $49.99 case, that's not common. To put everything on the table, I love the matte finish of the case, however it does scratch and chip easily. Taking out the screws that held in the PCI slot covers showed me that, as did taking off the side panel thumbscrews for the first time. I've covered that up with a simple paper washer, but knowing that other NZXT cases ship with rubber washers on the thumbscrews (NZXT Phantom, among others), I was disappointed to see this one didn't. That aside, the power button doesn't make any clicking noise; it's a very soft touch and the machine fires up. Same with the reset button. For users who have this case around things that move, or in plain sight where it can get bumped, this can be a problem, but I love it. The dim power LED and HDD LED are absolutely wonderful. I also really love the diagonals on the front panel. While it's priced much lower than other cases, aside from the finish it is designed well, and I feel it competes with many other ATX cases. The cable routing possibilities were perfect for what I needed, and allowed me to set everything up very easily. I only extended one power adapter for the USB3.0 and FW cards, so that I could run the wire out of obvious sight. My only beef with this case is that the two USB ports on the front panel use two separate cables. The USB2.0 port could easily run with the USB3.0 cable, rather than going separately, but it is what it is. Maybe eventually I'll take off the front panel and see if I can do that mod myself. The case grills are a little tight, and I have every intention of cutting out the back fan grill; I bought a wire fan grill with the fans just for that purpose. I just need to make sure everything works well, then I'll put it under the knife a little more.

I have already added the 640gb drive from my previous build, as it holds my main data, and I will be adding a second internal 640gb drive just for school work. The reason is that I no longer have a working Apple HFS+ read/write driver. In my previous build it installed with Pro Tools, and I was able to work off the external hard drive I used with my MacBook Pro; with this latest build, the software does not install correctly, and I am left with only the Apple Boot Camp read-only HFS+ driver for Windows. While I'm sure many will criticize my early adoption of Windows 8 in this build, I feel that it works well enough for what I need to do. Nearly all of my essential software is working correctly, minus only Paragon Partition Manager which I use for work, though I have yet to try installing a different Paragon application we purchased several months ago. Adobe CS6 Student license installed without a single problem, so I have no reason to doubt how well Windows 8 will work for me. Many thanks to the Microsoft support rep who also helped me get it activated on this machine; I'll just leave it at that.

Very nice to hear something about the NZXT Source 210 Elite. It definitely looks like a great case for its price. I don't think many people know about it compared to similar cases like, let's say, the Corsair Carbide 200R and the Fractal Design Core 3000.

The perforated PCI slot covers really affected the airflow, and the top 140mm fan was creating all of my unwanted noise.

I replaced the slot covers with some temporary ones to alleviate my airflow problems; GPU airflow now rises to the rear exhaust fan, where the Slip Stream really just kicks it out.

The top 140mm fan, when positioned horizontally, is definitely out of balance, and makes a very noticeable rattling sound outside of the case, even just sitting on my workbench. Good riddance.

I'm waiting for a FirePro V4800 to put in as a temporary workstation card, hopefully reducing temps compared to my old 9600GT. It's no slouch, but the card runs very hot with its stock single slot cooler, and I have to wait a few days for a new Thermalright HR-05 IFX cooler to give me room for a 2-slot GPU. The two slot GPU will also bring a fan close to the IFX cooler to keep that cool as well. Temps are stable, and the machine is, so far, rock solid.

Because of the hardware acceleration I need with Adobe Premiere Pro, I'm going to go for the Gigabyte GTX 660ti 2GB: a good price, CUDA compatible, and plenty of RAM for the Mercury Engine to utilize. The bonus for me is the larger, better-than-reference cooler. The trade-off is that the reference cooler would send heat directly out the back of the computer, while this one dumps it into the case, and that extra heat is likely not going to be welcome.

Certainly is a beast for computing power. No SSD? Doesn't the rotating media impact your editing performance?
I was wandering through the Adobe forums earlier. It doesn't look like Adobe has bothered to update their registered list of "certified" GPUs for Premiere Pro beyond the high end cards. You might need to do a small hack to a CUDA text file... lost the link for this...

I'll say that it hasn't been a walk in the park, however I've worked through my problems.

Paragon Partition Manager 10.0 is incompatible. No workaround, however the shop I work at has already purchased upgraded software which does work. Paragon Hard Disk Manager is compatible with Windows 8, contains all the features of Partition Manager as well as some new features which will be used in the future.

One problem I outlined before is partially fixed: the lack of HFS+ support. Since my laptop is a MacBook Pro, I do need some form of cross platform support. I have the current Boot Camp drivers, which are read only. Could be worse, and the speed is fine. I'm still working on sorting through all the drivers and making sure I have the best versions installed, but with some of these workarounds, certain things will be slower than they could be.

I'm going to be tweaking the Premiere Pro install to allow GPU acceleration for the 660ti I have on the way tomorrow, and that will be that. I've pretty much exhausted my entire bag of tricks trying to get the FirePro V4800 to work with OpenCL acceleration, so CUDA wins by default. As it was explained to me, CUDA was already a finished standard when the Mercury Playback Engine was developed. Because of that, OpenCL support is very behind on Windows, and only the nVidia CUDA platform is compatible.
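For anyone curious about the tweak itself: on CS6 it amounts to adding the card's name to Premiere's cuda_supported_cards.txt whitelist. A minimal sketch of that edit, assuming the usual install path (check your own directory) and that the card string matches exactly what GPUSniffer.exe reports:

```python
from pathlib import Path

def enable_card(cards_file: Path, card_name: str) -> bool:
    """Append card_name to the whitelist file if it isn't already listed."""
    lines = cards_file.read_text().splitlines()
    if card_name in lines:
        return False  # already whitelisted, nothing to do
    cards_file.write_text("\n".join(lines + [card_name]) + "\n")
    return True

# Hypothetical example -- verify the path against your own install:
# enable_card(Path(r"C:\Program Files\Adobe\Adobe Premiere Pro CS6"
#                  r"\cuda_supported_cards.txt"),
#             "GeForce GTX 660 Ti")  # name exactly as GPUSniffer.exe reports it
```

Run as administrator so the write to Program Files actually sticks, and re-check the file after any Premiere update, since updates can replace it.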

I don't find the drives to impact editing performance right now. I have already completed one project, doing the editing in Premiere Pro and the menu system in Encore, and having seen how fast the machine works on software acceleration alone, I'm sure once I add the CUDA GPU for hardware acceleration it's going to be significantly faster.

As far as SSDs go, I'm unsure. I have a couple of dead small mSATA SSDs, but I know they have come a long way and improved a lot. Maybe that will be my upgrade after doubling the RAM; it's going to be June by the time I make the move to an SSD. I also have to say that I'm not sure an SSD will make much of any difference for what I'm going to be doing, simply considering that the scratch disks and long term storage are spinning disks. If every drive in this machine were an SSD, the speed difference would likely be much greater, but I don't want to trust all of my data to an SSD yet.

I'm looking to go back to adding a 140mm fan, as there is very significant heat output, and the system will run much cooler with it. Does anyone have thoughts or experience with the Xigmatek AOS XAF-F1452? I'm looking to keep all fans PWM in this build, so the motherboard can do thermal management of fan speeds. The Scythe Slip Stream does seem to be problematic, not accurately reporting RPM to the motherboard; I'll have to do some more looking into that.

Some mods I took care of today, aside from the grill cutout, included using OEM SATA cables which were much shorter, so I don't have to exercise too much cablegami on this build, and it can be somewhat self contained. One larger change I will be making slowly: when installing the new GPU and HR-05 IFX, I'm going to swap the power supply for my True Power New, as I want to add a couple more coats of paint to the High Current Gamer (the red metal bracket has been painted matte white) and I also want to re-sleeve the cables, partly for looks, partly for organization. The stock sleeving is decent, but very stiff and hard to work with. I'm going to be swapping out some connectors for 90 degree connectors in one direction or another to provide better compatibility with my install. Any thoughts on sleeving ideas are more than welcome; I'm thinking of a black and white scheme, but I'm not sure exactly what I want it to look like yet. The new sleeving will run from end to end in an effort to make it look neater (I also want to sort out some of the cables all the way back to the PSU, so they take up less space).

Adobe always had a "sweet spot" for Nvidia even before OpenCL/AMD GPGPU appeared on the market. I remember in 2009 struggling with CS2 on a Radeon HD4350 and getting 2 fps on the 2nd monitor preview (and it's a dual E5520!). Following online advice, I swapped to a $50 Geforce GT220 with 2048MB onboard, and it was like night and day: now I had realtime preview, in software mode. So when I had some money I decided to swap to a Quadro. I first found a used FX3800 for dirt cheap, but it was an OEM Dell card with a 1½ slot heatsink and a hurricane of a fan; then I found an OEM new Quadro 4000 (underpowered GF100) for $800 in 2010... it has "just" 2048MB, but with CS4 and now CS6 it performs wonderfully. Only downside: the Nvidia video BIOS has some problems handling two DisplayPort monitors both with audio termination (HP with SPDIF out and Eizo with internal DAC), and the primary display should always be connected to the port marked "1" on the bracket (on an FX3800/2000/4000, that's the DVI-D).

I'm having excellent reliability and stability results with a 128GB Corsair Performance Pro (Marvell 88SS9174-BKK2) on my Ubuntu64 fanless DLNA server. It may not be the latest or fastest, but it has automatic garbage collection (Trim is not mandatory) and does not suffer with incompressible data like LSI-SandForce controllers do.

The GTX660Ti is installed and running with the Mercury Playback Engine. All is well until you open Photoshop and see there's no OpenCL support (what a PITA). Since that's a driver issue, I'm leaving things as they are for now. Both of my monitors are DVI only, so for a while I won't have issues with DisplayPort; by the time I upgrade these monitors, I'm sure I'll be able to move to an ATI FirePro series card. I'm using a pair of Acer S211 HL-BD 21.5" LED monitors which support 1080p resolution. I have an identical monitor setup at work on the boss's desk, with an Atom based ITX machine we use for emails, music playback and general paperwork with the Office Suite.

The HR-05 IFX proved to be very difficult to install, but is now in. If you look at the flickr set, you'll notice there is a mounting issue: under the GTX660, one of the mounting screws makes contact with the heatsink. Can't be perfect, right? Stock cooling may have avoided this, but because of the heat I anticipate this thing dealing with, I'm leaving it. It's fine for now. It took long enough to reach the point where the entire machine was set up, and it lets me relax knowing I won't have to deal with the absolutely poorly maintained Mac Pro workstations on campus (dual CPU, 8gb RAM, and hard drives usually filled to the max with student projects).

The cables of the True Power New are too short to reach, compared to the HCG-750, so I'll reserve the new cable sleeving to be a summer project, as I don't expect to end up using this machine nearly as much then. Anyone with ideas, feel free to put them out. I took advantage of the failed swap to re-route the PSU cables more neatly so the side panel doesn't have to be wrestled back on.

The last things to set up include a small Ethernet switch under my desk for the Avid Artist Mix units. I can directly connect one, but I need access to both, so a switch is the only way. Ethernet0 is being used for the Artist Mixes, and Ethernet1 for my internet connectivity, which reduces the sloppiness of connecting the Mixes to my whole network. The switch will likely end up being powered by the computer itself, so I'll probably look at a Linksys that uses a 5v power adapter for that mess.

eSATA is the last thing I'll set up, once I find an eSATA bracket with a long enough cable for some hidden routing, or a nice front panel card reader with eSATA and full support. My last card reader lasted about 5 days before crapping out with very little usage, so as long as my USB card reader keeps working, I'm fine with it. Maybe I'll just move to a USB3.0 reader.

Funny you mention Ubuntu; I will be installing it on this machine, though not any time in the near future. I'm going to set up a dedicated HDD for it and just select that drive via the BIOS boot selection, to avoid any boot loader issues. My previous machine ran that way, and it worked out great. Maybe when I move to the SSD, I'll stick in my 80gb for Ubuntu, and use a new one for Win8.

And for the record, I'm still pretty satisfied with Windows 8. All of my hardware works with it, and considering this is a slightly more advanced build than most, that compatibility speaks volumes about how well it works.

I know the HR-05 fixing system can be a PITA if you have the chipset near an X16 slot. I was in a worst case scenario, since my Tyan has 2 chipsets directly behind the X16 slots, both with one hole directly in line with an X16. But since they placed the mounting holes worlds apart (something like 3"...), I had to build a retention bracket myself; I built the lowest profile possible using countersunk screws and a piece of 1/10" thick alu. You can adapt the standard TR mounting hardware too: they used metric M3 thread, so you can ditch the long thumbscrew in favor of a standard M3 nut, and/or use a shorter M3 bolt.
On a side note, looking at the flickr images, that's an HR-55-SLI-IFX heatsink, way beefier and bigger than the HR-05-SLI-IFX (the offset version of the HR-05-IFX you used in the first place).

Other side note: all Nvidia Quadro FXx800, Fermi based and newer, are DP++ compatible, i.e. their DP connectors automatically configure to output DP/HDMI/DVI/VGA via passive adapters. I connected my Onkyo amp to DP2 (connector "3") via a DP-HDMI mechanical adapter and it works at 1080p with HD audio. So you can connect your Acer displays via simple DP-DVI passive cables. AFAIK the latest two Radeon-GL generations (FirePro) should be DP++ compatible; I remember the Radeon Eyefinity-6 versions with 2xDVI+2xMiniDP connectors were not, since the MDP ports supported replicators to get 6-display capability and they ditched DP++ for that. I don't know if the next E-6 version with native 6xMiniDP restored the "multi-mode" functionality.

About the HR-55: yes, I got mixed up. When I placed the order, I ordered an HR-05 and got this instead; I'm not about to complain. For now the system is running very well. In the future I have plans to change the HR-55 IFX mounting entirely, using shorter mounting clips to avoid touching the GPU heatsink. Since the motherboard does not have mounting holes for the chipset, it will take some thinking about available parts to mount the HR-55 to the two wire loops that are on the motherboard.

In other news, I added an external power connector for a small Linksys 5 port Ethernet switch for my Avid Artist Mix units. Since I have two of them, and only two ethernet ports I can use, I have one port for my LAN and one for a separate LAN just for the Mixes. The Linksys switch says it "requires" a 7.5v power adapter, however it worked fine with a 3.7v adapter I had around, so I hacked together a 5v molex adapter for it, and it is now powered by the computer. This removes any need for a different power strip. I use this same concept to power HDD docks at my shop, and it works very well. It reduces wall warts and also makes cable management easier. It also doubles as insurance for the battery backups: I don't have to question whether the docks are plugged into a battery port. The switch will always turn on with the computer, and that's just that. I had thought about trying to mount the switch to a PCI bracket, however that proved to be entirely too complex. I also thought about setting up the switch inside the computer, however if I have to move the system around, I don't want to have to worry about the plugs being inside.

I also ordered a 140mm Phanteks for the top fan, and once it arrives, I'm going to set the BIOS back to "Generic" fan mode, which is absolutely the quietest. The top fan should take care of all the extra heat that builds up in the case, and temps should stay about the same as they are now with the fans in "Full Speed" mode and no top fan. Speedfan does not read any of the board's sensors correctly, but CPU temps are currently low 30s idle, and hit mid-to-upper 40s C under load. For the power and the heat it puts out, I'm very happy with how it worked out.

I'm very satisfied with the performance of this machine. Once the top fan is installed, I will wait until late February to upgrade to 32gb of RAM, and that will be it for some time before I make any changes to the system.

Forgot about wire loops... the latest mobos I have all have at least 2 holes for screws or pushpins. Anyway, there's always a workaround.
I debated on getting two HR-55s too, but they were too tall for my setup, since I wouldn't have been able to mount the Spitfire upside down in front of one of them, and being offset they'd have rendered the PCI slot unusable.
Good work on that beast. I have to finish rebuilding my theater instead...

The setup is overkill right now; the very first project is only 30 seconds long, but the final projects reach 30 minutes. I'm the only student who works in full HD from beginning to end, and since I am a commuter student, this machine has to be overkill to a point. Between driving back and forth and having time to get all the media transferred, organized, and edited, I can't wait on things very long anymore. The render times have to be quick so I can get rough cuts done and passed out to group members for approval. I'm up against 2007 Mac Pro machines with 8gb of RAM and usually filled-to-the-brim hard drives. They have extremely slow render times because they are not set up with a render farm, and students really don't take care of them. Simply put, I'd rather have this overkill machine for the next 5-7 years (at least) than suffer from having to use the machines at school. Even with my laptop (i7, 8gb RAM, etc.) I really don't have enough power, and I definitely do not want to rely on it long term for all of my projects.

It's more than just a "have it for now" machine, it's meant to last me a long time. Not future proofed, but long lasting power. This machine will be seeing hours and hours of uptime this summer doing local spots and short videos several times a week, so there's no more waiting for rendering to finish, and that's the most important factor for me.

I hear you. My T5400 was originally a single CPU machine with 4GB RAM, a 160GB boot drive and 500GB data drive. It's nearly 5 years old, and has had extra RAM, another CPU (about £400 for the kit!), and the drives replaced. Otherwise the machine is perfect: it runs quickly, and rendering is reasonable.

If it were my personal machine, I'd add an SSD, more RAM and maybe a more modern graphics card, and probably get another 2-3 years out of it. But it's a work machine, and it has to go, as it's 5 years old and there's no longer a support contract on it.

If I had known about the program I am now enrolled in, I probably would have built this machine much sooner, and I would have gone through a similar upgrade process to yours. But as with most other situations, when I built my old computer, it wasn't used for video editing, and I didn't actually see myself going that route. With this machine, I should be set for some time, and in the future, I would likely replace the CPUs rather than the whole machine. The LGA2011 socket would allow me to upgrade to 8 core CPUs in the future (and by the time I needed that, I'd likely be able to get them pretty cheap). More RAM is on the to-do list for me at the end of this month, and maybe an SSD over the summer. Beyond that, I see this machine remaining the same for some time, with only the GPU replacement as the major upgrade in the future.

Eheheheh, I hear you, but remember that Xeon series CPUs usually don't go down in price with age; they simply disappear from the market. I saw it with my E5520s: by the time they were dropped in favor of the new Westmere E5620, they were only 5% cheaper than when I bought them (18 months on). That's talking about new parts; finding used Xeons is difficult (at least here in Europe).

I'm currently working on a project that's completely managed, produced and edited by me, because it started as a school project and just became a whole lot bigger than I anticipated.

I'm doing a documentary (full length, 60-90 minutes) about a local charity and their efforts throughout what is considered "their time of the year," because after doing a short project and meeting them and working with them so much, it has become a whole different thing than I expected.

As the project develops, I may post some pieces of it so people can see what I'm up to, but for now, I'll just leave this one frame grab to drive the point home.

After nearly 3.5 years, the machine has finally received two of its three planned (but unscheduled) upgrades.

Obviously, the 2.0GHz Xeons are not keeping up. 12 physical cores is a lot, but the workflow is becoming more demanding, and honestly, 4k is here to stay. Several things have been hammered out, and the machine is everything I had quite specifically planned it out to be, except I never would have expected to get here quite so soon.

I've worked in the computer industry for nearly as long as I've been a member on here, so it was inevitable that this machine would be about as future proofed as any computer could be. What I didn't expect was to be working in server repair today, which puts me in a unique position to upgrade my machine without throwing boatloads of money at it (merely life rafts of money). There's no forgetting that I am running server hardware; Asus makes specific workstation versions of the Z9PA series motherboards (a quite expansive series). The E5-2620 CPUs are not particularly "high end": they're not slow, but they are not keeping up with the workflow of video/audio editing at 1080p and higher resolutions for prolonged video clips. Under 5 minutes was relatively fast for exporting; over 15 minutes at 4k resolutions was certainly not where I would like to be for render times. Time is money, especially for an editor, so it's important to keep up with the best I can get my hands on.

Enter the full time job as a Datacenter Support Technician: I'm working day in and day out on high performance servers, from VMWare hosts to Exchange hosts, which need 24/7 uptime with no problems. Problems do occur, but they usually aren't because of hardware.

Why does this work out for me? Decommissioned machines get parted out due to failed hardware, or they get put into a store room in case they're needed in the future. When inventory sits for too long, it goes back to a central office to sit on a shelf before being completely removed from service. Due to a few upgrades currently underway, there were a number of machines with hardware I was interested in, and I was able to purchase it at a huge discount, simply because it was unknown whether it still worked. Since it's my day job, going back and saying hardware doesn't work is simple.

How do you upgrade a machine that's already running a pair of 6 core Xeons with 32gb of RAM? You max out the CPUs and toss in as much RAM as you can get your hands on. I wasn't even planning to upgrade the machine as far as I did, but I wasn't going to miss the opportunity to save a lot of money on the biggest upgrades I can go to. So where does my workstation stand currently?

The Antec HCG-750 power supply is still in the machine. I don't use the machine every day, only when I'm working on a project, so nothing is unnecessarily burning hours on the drives. That alone saves a lot of money.

The Xigmatek Gaia heatsinks are still being used; they offer great heat dissipation, and with PWM fans, when things warm up, I can rest assured it will keep itself cool.

The RAM has been upgraded to an absolutely obscene 128gb. Not all that long ago, users going from 2gb to 8gb were floored by the increase in performance. Today, 4-8gb is standard in nearly any system you look at, and for the past few years even most laptops have been able to handle 16gb. Do I strictly need 128gb? No; I initially expected to upgrade to 64gb, and only go beyond that if 4k footage ate through it all. I ran a couple of tests using an old 1080i project that was exactly 27:20 long, and found that going to 128gb wasn't a bad idea after all. If you watch Task Manager like a true hardware junkie, you'll see usage slowly make its way up to about 20gb of RAM at first, but as the render process includes more effects and re-colored footage, the RAM usage more than doubles. I saw the machine peak at about 58gb of RAM in use. The time saved is incredible, knowing that the machine is completely capable of utilizing that much memory, and it just re-affirms that I'm glad I went with the Adobe Master Collection when I was in college. The student discount let me afford it, and the software takes full advantage of the hardware I'm running.
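For anyone who'd rather log that peak than stare at Task Manager through a whole render, a minimal sketch using the third-party psutil package (the sampling window and interval here are arbitrary choices, not anything from this build) could look like:

```python
# Minimal sketch: record peak system RAM usage during a render instead of
# eyeballing Task Manager.  Assumes the third-party psutil package is installed.
import time

import psutil

def watch_peak(seconds: float, interval: float = 1.0) -> int:
    """Sample total used RAM until the deadline; return the peak in bytes."""
    peak = 0
    deadline = time.monotonic() + seconds
    while time.monotonic() < deadline:
        peak = max(peak, psutil.virtual_memory().used)
        time.sleep(interval)
    return peak

# e.g. start an export, then run:
# print(f"{watch_peak(600, 5) / 2**30:.1f} GiB peak")
```

Note this samples total system usage, so anything else running alongside the render is counted too.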

The CPUs have been pulled out, and a pair of E5-2670v2 CPUs have been dropped in. I would have loved to hold out for a couple of V3 CPUs, but my motherboard only supports up to V2. Normally, Xeons really don't drop in price until their end of life is announced, and then, even through wholesalers, they drop like boulders. I recently had to replace an 8 core Xeon in a VM host, and the CPU cost $75, including next day air shipping. While the E5-2670v2s still retail for north of $1200, these came out of a failed system, and there were a lot of questions about whether the CPUs were still good, simply based on the severity of the failure (you do not want to be on call when a datacenter loses power, we'll say that much). Being out of warranty, like a good deal of other hardware, it was a gamble for me to install these CPUs. For the price, if it didn't work out, I could replace my motherboard without much trouble. If it did work out, well, the 2670v2 is a 10 core CPU with hyperthreading, and two of those together are nothing to pass up. My gamble paid off. There was concern that my machine would either die or just not boot with them, but I dropped them in, and within 30 minutes I had the machine back up and running. Admittedly, because of how tightly the machine is packed, it took longer to get the coolers bolted back down than anything else. The machine is tightly packed, in a relatively small case, but no worries, it works like a charm.

The GPU is a GTX 660 Ti 2GB card. I intend on taking my overall upgrade budget of $1200 and waiting for a price drop before deciding whether to upgrade to a GTX 1070 or a GTX 1080. I just need to confirm that I can tweak the Adobe Mercury Engine config file to add support for the card, and then I'll be happy as a clam for another 3.5 years, possibly more, with this machine.

Would it be possible for me to upgrade this machine again? Sure, but not easily. I'd essentially be swapping out the motherboard if I wanted to move to V3 or V4 CPUs in the future, and for all of that trouble, I would likely build another machine and move over to Avid software, so I can take advantage of sending remote render jobs and splitting render duty between the two machines.

Short of the labels on the RAM, there are no physical changes you can see in the machine yet, and there haven't been any major changes beyond adding a hard drive, so no updated pictures until I go for the GPU upgrade, and that may not be until later this year. The machine sucks in very little dust (it's also not on for days at a time, which helps significantly), so I'm at a point where I may not change a thing on this box. The old hardware will likely get rebuilt into another machine for splitting render duties (work on one project, have the old Xeons render another in the meantime), but it's a good feeling knowing that 3 years of patience paid off, and despite taking a big chance, the machine has new life to it. I'll be doing some 4K filming this weekend, and creating new projects during the week to see how well it handles the footage. The GPU upgrade is a very important piece of this puzzle, but I want the launch-edition hardware to finish up, and to see some more selection from other brands, so I can make sure the new GPU can be as quiet as the current one.

It's also a great litmus test for how much wattage a computer really needs. My dual 10-core Xeons run great on 750W. Sure, stacking graphics cards for SLI or Crossfire will require more wattage, but 1000W power supplies are certainly not necessary for most builds.
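As a sanity check on that claim, here's a hedged back-of-the-envelope power budget. The wattage figures below are my own rough estimates based on published TDPs, not measurements of this machine:

```python
# Rough power-budget sketch for a dual-Xeon workstation.
# All draw figures are TDP-style worst-case estimates for illustration,
# not measured values from the build described above.
COMPONENT_DRAW_W = {
    "E5-2670v2 x2 (115W TDP each)": 230,
    "GTX 660 Ti": 150,
    "RAM, 16 x 8GB (~3W per stick)": 48,
    "drives, fans, motherboard": 80,
}

def psu_headroom(draws, psu_watts=750, margin=0.8):
    """Return (estimated total draw, sustained capacity).

    Running a PSU at ~80% of its rating is a common rule of thumb
    for efficiency and longevity, hence the default margin.
    """
    total = sum(draws.values())
    return total, psu_watts * margin

total, capacity = psu_headroom(COMPONENT_DRAW_W)
print(f"Estimated peak draw: {total}W vs {capacity:.0f}W sustained capacity")
```

Even with worst-case estimates, the whole system fits comfortably inside 80% of a 750W unit, which is why a 1000W supply buys nothing here unless you start stacking GPUs.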

I'm curious about the economics of CPU $ vs GPU $ on render speed... hasn't most of the video editing s/w moved over to GPU-accelerated rendering? Granted, if Xeons are dropping into your lap, you can't beat free...

Definitely an interesting topic, and it's not really discussed well enough by software companies.

I'll give my answer based on my current setup and how I understand certain things to work today as far as rendering goes...

With Adobe CS6, preview rendering is specifically performed by the GPU when you are using one on the Mercury Engine compatibility list. It's a pretty short list. It's done this way for speed. GPUs can render graphics (and other things) much faster than most CPUs by design. The memory bandwidth is much higher, and they're basically just doing one task, which aids in how fast they operate. When you play back a video in Premiere Pro CS6 through the timeline, it previews and pre-renders files automatically. When you go to export your media, you have the option to use preview files, and that does expedite the process significantly, but it also eats disk space very quickly, and not everyone has the disk space for that. If Avid works differently, at this point I couldn't tell you. I will eventually upgrade to Avid, but not for a while; my existing workflow is stable and works quite well.

I think Xeons will still be a staple in high-end editing because not every effect or plugin supports rendering on the GPU. With higher-resolution footage, I believe it's going to begin equalling out, with the GPU taking on more of the rendering tasks, but I think it's still a ways out before the GPU entirely takes over that role. Avid software will typically also push a computer to run at 100% without throttling. This was an issue with audio adapters and C1E/SpeedStep causing crackling issues. I have noticed that while rendering, my machine runs at one speed without throttling, but it does not reach the upper limit of Turbo Boost during that time. Since even companies like Autodesk are beginning to offer remote rendering services, it's really hard to say how long it could take. I've seen design firms pass on running Xeon boxes in favor of an i7 simply because they take more advantage of remote rendering. For a high-level freelancer, or even a director who wants to be able to work at home, I can see high-powered Xeons living on. For small to medium film companies, it's hard to say. The cost to build my machine as it sits right now is hard to justify in one sitting, but software from companies like Avid still costs a huge amount, and an in-house render farm still may not use GPU rendering.

Adobe CC hasn't changed the way it renders in Premiere that I'm aware of, but I know I will go to Avid before paying monthly for software. I can look into Avid, but even a lot of the guys on the DUC a few years ago for Pro Tools were running Xeon boxes. I doubt much has changed. CS6 and Avid have entered the 64-bit level with support for multithreading, so i7s are the base for heavier workloads. College students who are able to work in labs will still mostly see workstations over even a desktop with "gaming" specs. I'm not sure if games have breached that step, but last I really knew, higher-end i5 CPUs were just as good for gaming as i7s.

With multicore support, GHz still matters, I do not dispute that, but a 2GHz 6-core Xeon like I was running would still render things out faster than a mid-range i7. At full tilt, my 10-core CPUs should run at 3.1GHz each, and no i7 will come close to that much aggregate compute for a couple more years minimum; desktop PCs are only just entering 6 physical cores. My machine is on par with a fair number of the Exchange or ESX servers I maintain, which is mind-boggling, but from my quick test last night, my machine is able to take full advantage of the hardware. If I wasn't able to take advantage of the hardware, that would definitely be an issue.
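The cores-times-clock arithmetic behind that comparison can be sketched out. The i7 figures below are a hypothetical mid-range quad-core for illustration, not a specific benchmark, and this proxy ignores IPC and memory bandwidth entirely:

```python
def aggregate_ghz(cores, ghz):
    """Naive render-throughput proxy: physical cores x clock speed.

    Real throughput also depends on IPC, memory bandwidth, and how well
    the renderer scales across cores, so treat this only as a rough
    first-order comparison for heavily multithreaded workloads.
    """
    return cores * ghz

dual_xeon = aggregate_ghz(cores=20, ghz=3.1)  # 2 x 10-core E5-2670v2 at all-core turbo
mid_i7 = aggregate_ghz(cores=4, ghz=3.9)      # hypothetical mid-range quad-core i7

print(f"Dual Xeon: {dual_xeon:.1f} aggregate GHz, quad i7: {mid_i7:.1f} aggregate GHz")
```

Even granting the i7 a higher per-core clock, a well-threaded renderer has roughly four times the raw compute to work with on the dual-Xeon box.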

As far as cost goes, Xeons are difficult to price because they don't go down for some time, but workstation graphics cards are right up with them in price. Quadro or FirePro cards that the Mercury Engine can take full advantage of cost more than some mid-level Xeons.

I would stand by my method of going with a dual-socket motherboard if you were looking for a high-end workstation: buy one Xeon, the best you can afford, and the best workstation card you can afford, then get a second CPU and distribute your RAM across both CPUs, and add more RAM last. I think that's the best way to ensure you'll continually gain additional rendering power. I made it 3.5 years, and I think it's safe to say I'll get another 3.5 years minimum. The costs are on par with building a new mid-range gaming build every 3 years, so when you do the math, it's no more expensive when you look at the long-term costs.

So in short, Xeons are still the cheaper option right now, but I think it could still be a year or two before things turn around further.

Wasn't there some bullshit around which gfx cards count as compatible, where only the workstation cards were on Adobe's approved list... but all it took was a quick edit to get the programs to use gaming cards?

Yes, Adobe does "limit" which graphics cards are compatible, but you can add your own card to a text file that holds the list of what is compatible. There is a specific way to enter the name, and entering the wrong name means it still won't work.
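A minimal sketch of that edit is below. The file name `cuda_supported_cards.txt` and the install path in the comment are assumptions based on how CS6-era Premiere handled this; check your own Premiere install directory, and the card name must match what your driver reports exactly:

```python
# Append a GPU name to Adobe's supported-cards list file.
# ASSUMPTIONS: the file name and example path below are illustrative;
# the entry must exactly match the card name your driver reports,
# or Premiere will ignore it (as noted above).
from pathlib import Path

def add_card(list_file, card_name):
    """Add card_name to the list file if it isn't already present."""
    path = Path(list_file)
    lines = path.read_text().splitlines() if path.exists() else []
    if card_name not in lines:  # avoid duplicate entries
        lines.append(card_name)
        path.write_text("\n".join(lines) + "\n")
    return lines

# Hypothetical usage on a default Windows install:
# add_card(r"C:\Program Files\Adobe\Adobe Premiere Pro CS6\cuda_supported_cards.txt",
#          "GeForce GTX 660 Ti")
```

Back up the original file first; if the Mercury Playback Engine falls back to software-only after the edit, the name string is the first thing to re-check.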

Selecting the right codec and workflow does make a huge difference, and on top of that, the article does make mention of effects that cannot use GPU rendering.

The way my current machine runs, I'm just looking to find which GPU runs better with the Mercury Engine. A workstation card does well for workstation usage, but gaming cards work better with some simpler tasks like the Windows Aero theme, and since Premiere isn't the only program I use this machine for, I do need a degree of support for more basic functions as well.

Silent PC Review has been providing expert advice and detailed reviews of PCs and peripherals since 2002. Our technical advice has been featured on publications such as: New York Times, O'Reilly, PCMag, Popular Mechanics, Forbes, etc. plus countless trade shows and industry articles. We're dedicated to providing top-notch advice and reviews for choosing your next PC build.