What can I say? The case has all those fan hardpoints, so I might as well use them.

They will just create unnecessary noise. The case already comes with a 200mm fan on the front and a 140mm fan on the back, which is more than sufficient for the power dissipation. With that relatively low-powered PSU you can't physically install enough hardware to need anything more, yet you're proposing to add another seven 120mm fans for nine fans total. It takes roughly four 120mm fans combined to match the airflow of one 200mm fan, and they make more noise in the process. Incidentally, a quad-socket 1600W enterprise server only uses four fans, albeit at a higher average RPM; I have fourteen 120mm fans on my 2000W dual-socket 8-GPU workstations, but those are dissipating five times the power under load. You should drop all the extra fans and the fan controller and get a bigger SSD.
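The "four 120mm fans to match one 200mm fan" claim can be sanity-checked with a crude swept-area estimate. This is a sketch only: real airflow also depends on RPM, blade design and static pressure, and the 40mm hub diameter is an assumed figure, not a spec.

```python
import math

def swept_area_cm2(diameter_mm, hub_mm=40):
    """Annular area swept by the blades, ignoring everything aerodynamic."""
    r_outer = diameter_mm / 20  # mm diameter -> cm radius
    r_hub = hub_mm / 20
    return math.pi * (r_outer**2 - r_hub**2)

a200 = swept_area_cm2(200)
a120 = swept_area_cm2(120)
print(f"200mm: {a200:.0f} cm^2, 120mm: {a120:.0f} cm^2, ratio {a200 / a120:.1f}")
```

Area alone gives a ratio of about 3:1; the bigger fan also spins slower for the same airflow, which is where the quoted 4:1 CFM figures and the noise advantage come from.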

Fair enough. If it does end up having overheating issues (such as from being crammed under a small desk in a flat that turns into an oven in even moderately sunny weather) I can always buy some more later.

And how big an SSD do you think I'll need? I can't remember exactly how much space my preferred distro takes up by itself but probably not a lot.

There are hardly any excesses of the most crazed psychopath that cannot easily be duplicated by a normal kindly family man who just comes in to work every day and has a job to do.
-- (Terry Pratchett, Small Gods)

Replace "ginger" with "n*gger," and suddenly it become a lot less funny, doesn't it?
-- fgalkin

My usage pattern is relatively light on stuff that benefits from being on the SSD: Unlike most people on /r/pcmasterrace I've never once had an urge to blow fifty quid on stuff I never get around to playing every time Steam has a sale. I have however filled most of a 1TB hard drive with music and videos.

Anyway, reworked the build a bit, switching the 7600 out for an 8400, adding a slightly bigger SSD and cutting out the fans and fan hub:

In prep for my next round of classes, I've picked up another 500GB and a new 1TB Samsung EVO SSD. The PRO series just doesn't have the performance benefit to justify the price. I figure I'll copy everything off my old SSD to the 1TB, use that for gaming, and stripe the two 500s into a 1TB volume for all my virtual machines. I lost all my original VMs to some stupid (and drunken) moves because I needed space for more junk. HDDs are just excruciating for me to use.

I'd put my games on the striped set, but I'm more concerned about reliability. I mean, we're not doing anything that crazy with the virtual machines, except with Linux, but that install is nothing. And my gaming load times are already stupidly low.

My 1TB HDD is aging, but I do need something for general storage. Is it worthwhile to go with something like a WD Black for that? I think I'd probably just be better off with a 3-4TB WD Blue; I'm out of the loop on HDDs. I don't have a whole HELL of a lot of data, but I want to make sure I don't lose what I have. I monitor my SMART data, but even that isn't a guarantee. I suppose I should consider a second 3-4TB external drive, back up to it once a month and stick it in the gunsafe. Or just mirror some 3TBs; I hate to have that much overhead, but I guess it's moot if I'll have an external anyway. The benefit of the external is less power/heat.

My last HDD failure would have blown chode if I hadn't stuck the bastard in the freezer; it ended up working long enough afterwards for me to get all my data off.

There's no doubt that my machine is in need of an upgrade- it's done really well seeing as the CPU and motherboard are 8 years old, and the RAM isn't much newer. I just have to decide what chipset to get, the first order of business being the CPU socket: either the 2017-spec LGA 2066 or, if that's too pricey, 2015's LGA 1151.

It has to make use of SLI, as for brand it's going to be either Gigabyte (what I have now) or Asus. Unless they've really gotten their shit together I won't touch MSI with a bargepole since I've had problems with their boards in the past. The new Antec PSU is more than capable of handling any hardware I throw at it so that's not an issue, and it'll be ATX form factor.

It's no use debating a moron; they drag you down to their level then beat you with experience.

Just because you have the attention span of a fruit fly doesn't mean the rest of us are so encumbered.

"As you know science is not fact"- HuskerJay
"The Delta Fyler [sic] isn't even a shuttle craft" -HuskerJay69
"The Dominion War wasn't really all that bad"- Admiral Mercury

I've always been a big ASUS fan. SLI support is still in the trash-heap last I checked. No Unreal game can support it natively, and most devs don't even bother. I think Blizzard has decent SLI support, but no game they make can even touch my 980 Ti. I think Prey is one of the only 2017 games to see any kind of performance benefit. Though there are some non-gaming tasks it's good or even great for, you're generally better off paying either a bit more (or a bit less) for the biggest single-card beast available for the money.

That said, I did enjoy fucking around with SLI just for the experience.

Idiot update: when I built this rig I talked about how I thought I'd messed up the connections, and I did. Both SSDs were on the ASMedia connectors, not the Intel ones. So they've been moved, and my DVD drives are now on the ASMedia. Another bone-headed move is that I didn't enable RAID over AHCI, so I can't switch to RAID without making some changes.

Thankfully, with W10 it's as easy as deleting your AHCI drivers and rebooting into safe mode after enabling RAID. However, since I have more than enough space for the time being and I don't want to risk bricking my W10 install, I'm just going to hold off until I can make a couple of backups. As Windows backups have always been... finicky, I'm going to at least double up, because I can't afford to be down for too long in the near future.

I've got to find my big stack of DVDs at some point. Got lost in the move. Also, my biggun external HDD.

I also just really don't want to reinstall all my shit. I've got some hiccups right now: Windows did not like me moving stuff around. I crashed out my Explorer and now it's fighting me on running certain programs. It seems to have scared itself straight, but I may be in for a reinstall at some point either way. Bleh. Either way, I'll hit it with a few hammers and see if I can iron things out.

EDIT: ok, pulled the drive letter off the old drive after copying everything to the 1TB. Gave the 1TB the old drive letter. Seems to be working after a reboot.

So, backups and system restore disc done, good to go. Deleted the AHCI controllers, rebooted into safe mode aaaand... keyboard and mouse don't work, the login cursor blinks 5 times, then a hard lock. I am... completely confused here. I tried swapping keyboards and unplugging all the everything, but couldn't get past the login. In retrospect, I could have set up automatic login on a local user account with admin access, and at least (hopefully) gotten to the desktop to see if that solved the problem, but I just got tired of hammering at a 5-minute job for 2 hours. My only other option is to reinstall, then restore from my system image, and man, I am just not big on that. I can wait until I need a reformat anyway, but even then it might not be worth switching from AHCI to RAID just to mess around with this.

Honestly, RAID gains on SSDs aren't all that noticeable in a real-world environment either way. On sequential reads/writes, sure: a RAID 0 array will pull far ahead. But I don't do a lot of those, and most SSDs perform WORSE in a RAID on random reads/writes. So, after all that, I now have a "Client VMs" SSD and a "Server VMs" SSD.
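That pattern is easy to put rough numbers on with a back-of-the-envelope model. The drive figures below are typical SATA SSD specs I've assumed for illustration, not measurements, and the 90% striping efficiency is a guess:

```python
def raid0_estimate(n_drives, seq_mb_s, rand_4k_qd1_mb_s, stripe_efficiency=0.9):
    """Sequential transfers stripe across every member, so they scale with
    drive count (minus some controller overhead).  A single queued 4K random
    read still lands on exactly one drive, so QD1 performance doesn't improve,
    and the RAID layer can actually add a little latency on top."""
    return {
        "seq_read_mb_s": n_drives * seq_mb_s * stripe_efficiency,
        "rand_4k_qd1_mb_s": rand_4k_qd1_mb_s,
    }

single = {"seq_read_mb_s": 540, "rand_4k_qd1_mb_s": 40}  # assumed SATA SSD figures
striped = raid0_estimate(2, 540, 40)
print(single)
print(striped)  # big sequential win, no random win
```

So the workloads that dominate desktop feel (small random reads at low queue depth) see none of the benefit.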

I wanted to run RAID 1 on two 4TB HDDs since prices are cheap, but as I said before: I monitor SMART, and it would probably just be better to run a big internal drive and an equal-size external, back up to that, and keep the copy either in my gunsafe or in my parents' safe in case there was a fire or something.

Or maybe this whole thing will convince me to dump a mint on some RAID-capable NAS.

I was going to write this in its own thread but it seems more appropriate to put it here. I've upgraded and replaced most of the parts on this PC as necessary and have reached the limits of what I can get out of what is, in effect, an 8-year-old PC. The only completely original parts remaining on the outside are the case, speakers, mouse and mousepad, the latter two of which are looking very well-worn.

Inside, the mobo and CPU are the same as when I fitted them. The RAM is a bit newer but still 6 years old, though there is 24GB of it; because DDR3 can't be used in a DDR4 board, it too will be replaced.

My research has narrowed my options to the Intel Z370 chipset with a Coffee Lake CPU, and a choice between Asus and Gigabyte boards. Interestingly, for the same chipset the top-end Gigabyte (Aorus Gaming 7) costs the same as the bottom-of-the-top-drawer Asus Maximus Code at £230, while the very best Asus (the Formula) goes for £380.

With the upcoming release of the Z390 chipset it would seem to be a bad idea to invest in any of the really expensive boards right now since their value is bound to plummet once the new chipset arrives.

I have managed to narrow down the boards so far by two criteria:

Most basically, the form factor must be ATX since that's the industry standard and easiest to build around.

The board must support Nvidia's SLI. I already have a beefy PSU that I got in anticipation of running multiple cards, and having done it once I'd like to have the option of going down that road again.

For Gigabyte that narrows my choices down to four from the Z370 chipset: the Gaming 7, Gaming 5 and Ultra Gaming 2.0 (AORUS), and lastly the XP SLI. The Gaming 5 and 7 are virtually identical save that the 7 has an ESS9018Q2C audio chip and a Rivet Networks Killer™ E2500 LAN chip in addition to the conventional network port. There's a £25 price difference between the two.

Far as I can tell, the Ultra 2.0 and the SLI are also very similar, being within £15 of one another. The only real difference I can see between the two Gigabyte groups is that the more expensive ones have two M.2 connectors instead of one.

For Asus it's more complex and expensive; their top ROG Maximus X line (for overclocking), in decreasing price order, is the Formula, Code and then Hero. I can't see the case for buying any of these- if I'm going to shell out for even the cheapest of them, for the same money I could have the top Gigabyte option.

Seeing as I don't plan on overclocking that hard, if I get any ASUS board it’s more likely going to come from their STRIX lineup, where my options are the E, F and H.

Far as I can tell, the E and F are virtually identical except that the £5 price difference nets the E wi-fi. And despite costing £30 less, the only downside the H has compared to the other two is one fewer PCI-E x1 slot. As that slot is positioned where it would be blocked off anyway by the double-width Nvidia card, this makes exactly zero difference. Same story with wi-fi- it's something I neither need nor want on a desktop PC.

So far then, it looks like the Asus STRIX-H is the winner, available from both outlets (Ebuyer and Overclockers) that I’m looking at for virtually the same price. But it doesn’t end there, because there’s a complication. Two words: Intel Optane.
See, Gigabyte offer their boards in two flavours (Asus does not)- with the 32GB Optane module preinstalled and without. Except this applies only to the G7 and Ultra 2.0, not the G5 or SLI. This poses several problems:

Ebuyer offers both versions; Overclockers only offer without.

The versions including Optane cost about £35 more than those without.

Overclockers’ price for the 32GB module on its own is £90. From Ebuyer, it’s £55.

So if I want the best deal on the Optane I have to go via Ebuyer, unless their prices rise while I'm doing my research.
I have three options:
Gaming 7 OP: £255 (£225 w/o)
Ultra Gaming 2.0 OP: £187 (£150 w/o)
Asus Strix 370H: £160, with separate OP £215.
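Tallying up the board-plus-Optane combinations from the prices above makes the comparison easier to see (Ebuyer's £55 separate-module price assumed throughout):

```python
OPTANE_EBUYER = 55  # GBP, 32GB module bought separately

# total cost of each route to a board with 32GB of Optane
totals = {
    "Gaming 7 bundled":      255,
    "Gaming 7 + module":     225 + OPTANE_EBUYER,
    "Ultra 2.0 bundled":     187,
    "Ultra 2.0 + module":    150 + OPTANE_EBUYER,
    "Strix Z370-H + module": 160 + OPTANE_EBUYER,
}

for name, cost in sorted(totals.items(), key=lambda kv: kv[1]):
    print(f"{name}: £{cost}")
```

On these numbers the bundles undercut buying the module separately for both Gigabyte boards, and the Ultra 2.0 bundle is the cheapest way to get Optane at all, with the Strix-H-plus-module route sitting in the middle.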
CPU options are relatively straightforward: the i7 comes in two flavours (the 3.2GHz 8700 and the 3.7GHz 8700K), and both outlets have them for much the same price, so no drama there.

After doing some more research I'm leaning towards the Asus ROG Strix Z370-H over the other two, with the 3.2GHz 8700 Coffee Lake CPU (£50 less than the 3.7GHz 8700K). Also, the price of the H is less than half that of the top-end Maximus X Formula.

For RAM I'll be going with a Corsair Vengeance 16GB DDR4 kit, at a speed between 3,000 and 3,200MHz. I've read that beyond 2,666MHz you don't really notice any increase in performance- the kits I'm looking at are around the £200 price range, give or take.

Now I've worked out the basics, I've stumbled across an unexpected problem- what cooler to use for the CPU. I won't be overclocking too hard, but I feel a Coffee Lake setup deserves more than the stock cooler. Turns out the price of the thing is only part of the problem; the other is the available space inside the existing case, an Antec Nine Hundred Two.

Using a ruler I measured the space between the board and the side panel and got a clearance of 17.5cm. The side panel has a 12cm-diameter intake fan which, thanks to its mounting position, partly hovers over where the CPU cooler would sit; because the fan is 2.5cm thick, that reduces the available space to 15cm. So any cooler has to be 15cm tall or less- anything larger would mean removing the case fan.
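The clearance arithmetic, written out (using the 17.5cm panel gap and 2.5cm fan thickness measured above):

```python
def max_cooler_height_cm(panel_gap_cm, side_fan_cm=2.5):
    """Tallest cooler that fits between the board and the side-panel intake fan."""
    return panel_gap_cm - side_fan_cm

print(max_cooler_height_cm(17.5))                  # with the side fan in place
print(max_cooler_height_cm(17.5, side_fan_cm=0))   # if the side fan comes out
```

Manufacturers quote cooler height in millimetres, so this comes out as a 150mm ceiling with the fan fitted, or 175mm without it.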

Even once I've decided what cooler to use there are two reasons why I'm not going ahead with this build yet:

First, the current setup seems to have stopped misbehaving with random freezes after startup, which was one of the motivating factors for this upgrade. If it starts playing up again though, I'll press on!
Second, although it's long obsolete now, the existing motherboard was the absolute state-of-the-art top of the line when I bought it, and I'm not ready to part with it yet! Link.

I decided on the i5-7600K @3.8GHz. i don't really plan on doing any overclocking but i've got a fairly big CPU cooler planned on the parts list that should be able to handle it if i do.

i've also decided to go ahead and bite the bullet and buy the GPU (a GTX 1060 6GB) while i've got a little cash to spare. on the basis that while all the other parts will be kind of sitting around until they can all be assembled, the GPU can replace my current one (GT 730) and boost my gaming fps considerably while i rebuild the bank account.

If a black-hawk flies over a light show and is not harmed, does that make it immune to lasers?

Bloody hell, re-reading my last post I made not one but two huge typos in as many lines!

Just make sure your case can accommodate a large cooler; fortunately the manufacturers specify the exact dimensions of each one. Some of the pictures of motherboards with massive coolers attached look pretty insane! While I'm fairly confident my case is large enough for most coolers, I'd have to remove the intake fan to fit the largest ones, or leave off the side panel.

Although aesthetics are completely irrelevant (I'm not even touching RAM that has LED lighting), there is still something very tempting about pairing this fan with this board.

One problem I've foreseen is that the existing board has two SATA controllers, with 6 normal ports and 4 G-SATA ones (which can only accommodate HDDs). My bulk storage drive is a RAID-0 array on the G-SATA controller (2x600GB drives) and I'm not sure it will be compatible with the new board, which means creating a backup of everything (luckily I have an eSATA HDD of sufficient capacity) before having to re-make the array. I vividly recall juggling all my drives when I installed the SSD.

I could be wrong, but I don't think you're going to have issues running RAID on a secondary controller on an ASUS setup just for disc drives. However, I've never had any luck transferring a RAID array intact across controllers- even the same model type, really. Though when I tried that I was working with SCSI stuff, so like mid-2000s.

I've gone away from larger radiator coolers, especially those that attach to the CPU directly. If you're going to spend that kind of money, get something like this: less weight, less power, easier-to-clean fans, and it cools as well, if not better. NOTE: I just used that link as an example.

i've also decided to go ahead and bite the bullet and buy the GPU (a GTX 1060 6GB) while i've got a little cash to spare. on the basis that while all the other parts will be kind of sitting around until they can all be assembled, the GPU can replace my current one (GT 730) and boost my gaming fps considerably while i rebuild the bank account.

Yea, it will probably be a positive experience being able to pull more than 10FPS in Zork.

GPU arrived and installed. it's a bit of a squeeze on this old former office PC's mobo, but i got it in with the only sacrifice being the disconnection of my CD drive (and who uses that anymore), because the GPU is so long it covered two of the board's SATA ports.

the UserBenchmark run reports a decent bump, raising my gaming score from 14% to 48% (and then 50% when i fiddled with the GPU power management).

i can now max out WoT and still get 70fps; i think i'll tweak a few things to bring it up to the 90 i was getting on minimal settings. fallout can also be maxed.

transport fever and children of a dead earth only had a slight increase, i'm thinking those might be read-write speed and cpu limited respectively.

all that's left for the new build to have all the parts it needs to be functional are the cpu cooler, ram, and PSU.

i could just move the PSU from this one, which was the same unit from my last PC and at 430 watts should be able to handle the 296 watts the part picker lists as the top-end requirement. but i also kinda want to get a higher-rated modular unit so i'm only using the cords i need and don't have to find a place to tuck unneeded ones, and a higher-rated unit would be under less stress and give me future-proofing room.
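for what it's worth, the old unit's paper headroom works out like this (a quick sketch: the 296W figure is the part picker's estimate above, and real transient draw can spike past an estimate like that):

```python
def psu_load_fraction(load_w, psu_w):
    """Fraction of the PSU's rating the estimated load represents."""
    return load_w / psu_w

frac = psu_load_fraction(296, 430)
print(f"load: {frac:.0%} of rating, headroom: {430 - 296} W")
# 80 PLUS efficiency curves tend to peak around 50% load, so a ~69%
# steady load is workable but leaves little room for a bigger GPU later.
```

so the 430W unit clears the estimate, it just doesn't leave much margin for upgrades, which is the real argument for a bigger modular unit.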

as for the cooler i did decide on a bit cheaper one, but i'm still going air cooled as i'm hesitant to mess with liquids and the case should have plenty of airflow.

so long story short: the 1060 is now in my current PC, a cpu cooler has been ordered, i'm still dithering on what to do for the PSU, and i think my next purchase will either be a PSU or the RAM sticks.

The new AMD Threadripper 2990WX looks really good- enough that I'm actually tempted to build a single-socket workstation for the first time since 2003. 3GHz base, 4.2GHz turbo and 4+ GHz stable overclocks on 32 physical cores is impressive. Of course, knowing that the chip has that much headroom just makes it extra frustrating that you can't overclock a dual-socket Epyc board, particularly as the Achilles heel of Threadripper is that it only has four memory channels (which, to be fair, matches the i9-7900X, but sucks compared to six on the Xeons and eight on the Epycs). The 64MB L3 cache on the 2990WX would utterly destroy the 14MB on the 7900X if it wasn't for the fact that it's a split victim cache with somewhat slow and inconsistent latency.

That said, my latest project is a mostly GPGPU-powered one, albeit with significant CPU-side logic, so I'm seriously looking at a 2990WX machine (watercooled, modest overclock) with a couple of Titan Vs (alas, I had to give up on AMD GPUs when AMD gave up on the high end).

I might fit watercooling to the GPUs at a later date. The only dicey bit is a RAID 0 M.2 triple-Optane boot drive; apparently it can be done but is a bit fiddly to configure. And yes I know you can't SLI the Titan Vs, but this is primarily for development and I mostly game on the XB1X anyway.

I might fit watercooling to the GPUs at a later date. The only dicey bit is a RAID 0 M.2 triple-Optane boot drive; apparently it can be done but is a bit fiddly to configure. And yes I know you can't SLI the Titan Vs, but this is primarily for development and I mostly game on the XB1X anyway.

Looks kinda extreme. Why not use only 2 optane for boot and 1 optane for cache of the hdd raid? Does the mobo hold 3x M.2?

It is very difficult to believe in a god when some people are never struck by lightning. -Calvin

Why not use only 2 optane for boot and 1 optane for cache of the hdd raid? Does the mobo hold 3x M.2?

Honestly the local HDDs aren't that relevant; all software and hot data will be on the SSD array, and large cold datasets go on my file server anyway. Mainly I want them there to hold regular images of the RAID0 array so I can restore it if it fails (something that eventually happened on my old dual-Xeon workstation from 2010).

I'm going to delay this build a little though, because the Gigabyte X399 AORUS Extreme definitely looks better than the ASUS ROG Zenith Extreme. For one thing it has properly integrated 10GBASE-T networking rather than the Zenith's bundled add-in card that takes up a PCIe slot and hampers airflow. It also has better heatsinks and can more credibly accept 4 GPUs (for possible future expansion). Though with the recent death of 3- and 4-way SLI/Crossfire, it's going to get harder to justify EATX boards for gamers (cryptominers will still want them, though).

Ah, yes. I forgot you dwell in the realms of enterprise stuff. Are you going to do some MS / Oracle SQL on it? I think they license by the core and will love that 32-core thingy. As a boot device, I guess you love the 4K Q1 read/write stats and longevity, as that is where Optane shines. A good M.2 SSD has better sequential read/write than Optane, especially at Q16 and above, but that is hardly significant for an OS. I have no clue how a 3x RAID will impact this.

Waiting a little for something better can be a tedious task, but I don't think the Asus mobo supports 3x M.2? I can only see 2x M.2 there (but I did just a quick check). I have a Gigabyte mobo myself and am quite happy. I have some... issues... from time to time, but have never managed to pinpoint them to the CPU, mobo, RAM or even the Windows installation itself, and they're less frequent than they used to be, so I let it slide.

Your rig looks really nice. I have absolutely no use for that kind of power, but still I want one (if I win a lottery, that is).

Ah, yes. I forgot you dwell in the realms of enterprise stuff. Are you going to do some MS / Oracle SQL on it?

There would be no point having a Titan V if it was going to run SQL. I do in fact have a quad-socket Xeon in the loft for that sort of thing (Postgres, rather). The main thing I want to do with this in the short term is prototype a hybrid deep-NN / genetic programming / symbolic financial modelling system, which is somewhat inspired by and derived from some recreational AI programming involving narrative analysis that I did in 2016.

I think they license by the core and will love that 32-core thingy.

I do give Oracle credit for preventing server processors from sinking entirely into the insipid realm of 2.2GHz, max-cores, max-power-efficiency parts.

As a boot device, I guess you love the 4K Q1 read/write stats and longevity, as that is where Optane shines. A good M.2 SSD has better sequential read/write than Optane, especially at Q16 and above, but that is hardly significant for an OS.

I'm essentially using it for boot, compile and swap, and all of those are more sensitive to a significant latency improvement than a slight sequential improvement. Optane still has a poor cost/size ratio of course, but I haven't used it yet and I want some to find out how it performs on relevant apps.

Waiting a little for something better can be a tedious task, but I don't think the Asus mobo supports 3x M.2? I can only see 2x M.2 there (but I did just a quick check).

It supports triple M.2. Bear in mind that they have lots of AORUS motherboards, including a previous-gen Threadripper one with different specs.

Your rig looks really nice. I have absolutely no use for that kind of power, but still I want one (if I win a lottery, that is).

Gamers will still be better off with a properly overclocked eight core and a couple of top-end game GPUs in SLI/Crossfire. Lots of cores is for people with specific heavily parallelisable workloads.

Well, I finally carried out my planned upgrade. I went with the Asus ROG STRIX Z370-H board, and bought the parts from Ebuyer as they were noticeably cheaper than Overclockers.

Only after placing the order did I get an email the next morning saying they were out of stock- I'd assumed that since the board only flagged as OOS after I placed the order, I'd got the last one. So the order was delayed by a day while I contacted them to remove the board from the order and send the rest.

I went to Overclockers since the only boards of that make and model left were a couple of B-grades. I asked what B-grade meant; they told me the packs were unsealed and they couldn't guarantee all the parts were present. They happened to be based near home, so I went there and checked the kit; using the contents list in the manual, I found the only missing parts were the SATA cables. As I had plenty of those already this wasn't a problem. I also managed to talk them into doing a price match on the Razer Naga Trinity.

Paired the board with the 3.2GHz 8700 and 16GB of Corsair Vengeance 3200MHz RAM. Got a Cooler Master cooler for the CPU, and it was a good job I'd measured the room available in the case: had the cooler been ANY taller it wouldn't have fitted underneath the side case fan. As it stands, the top of the cooler is exactly level with the underside of the fan; because said fan is mounted diagonally, I had 1cm of clearance. I also took the opportunity to replace the rear case fan that had packed in (also Corsair).

I retained the older GPU, and the modular nature of my PSU's cables (1,300W High Current Pro) has meant no superfluous wires causing problems.

Because it's built with M.2 SSDs in mind, the board only has 6 SATA ports instead of the 10 the last one had. Fortunately, because I'd taken the smaller of the 2 RAID arrays out of use, this wasn't a problem. The eSATA bracket at the back was never needed either, because the case already had a front eSATA port, which was all I needed to connect my external HDD.

I'd foreseen the problems that arise whenever RAID is involved, so I'd copied the entire contents of the array to said HDD. Even at eSATA speed it still took hours to copy all 900GB, before I had to re-make the array in the new system and copy it all back again.
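The "hours" figure is about what you'd expect from the drive itself rather than the eSATA link (a rough estimate: 120MB/s is an assumed average sustained rate for a full 3.5" HDD, well under the link's ceiling, so the interface never becomes the bottleneck):

```python
def transfer_hours(size_gb, throughput_mb_s):
    """Time to move size_gb at a sustained throughput, using decimal units."""
    return size_gb * 1000 / throughput_mb_s / 3600

one_way = transfer_hours(900, 120)
print(f"{one_way:.1f} h each way, {2 * one_way:.1f} h for the round trip")
```

Roughly two hours each way at that rate, so four-plus hours for the backup-and-restore round trip before counting the time to re-create the array.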

The machine has a 900GB SATA SSD for the OS, so AFAIK any performance gain I'd see from using an M.2 drive in its place would be minimal. From what I understand, Intel Optane caching only accelerates an HDD used as a boot drive, and would have little effect when the boot drive is an SSD and the HDD is relegated to storage duty.

I went with two Optane drives in RAID0 in the end on the Aorus X399. I decided to give the Titans a miss, given that they're clearly going to be refreshed following the RTX 2080 launch; I've stuck a spare Radeon Vega Frontier Edition Liquid in there for now. The Vega is pretty good at FP32 and FP16, so I can probably hold out until 7nm GPUs arrive (I sincerely hope AMD pull Navi 20 in to 2019 or they will be completely out of the higher end of the market). I have to say the Dark Base Pro 900 absolutely sucks for watercooling: bad layout and insufficient clearances all over the place. be quiet! clearly have minimal actual experience with it; I had to drill a hundred or so extra holes in the top cover to get any usable airflow over the CPU rad at all.