I am building in a Thermaltake Tower 900, and I absolutely love this case. Fully assembled and filled with water, this PC may weigh 120 pounds or more; the case alone is 57 pounds, or 61.6 pounds in the damn box. I ordered four SANYO DENKI 9GV1412P1G001 fans (140mm x 38mm) for the cold air intake. Every intake except the back panel will be a Sanyo fan; I could not put one on the back panel due to clearance issues with the 80mm thick rads.
Product Category: DC Fan
Voltage Rating (V): 12
Operating Input Voltage Range (V): 10.8 to 13.2
Power Rating (W): 55.2
Line Current (A): 4.6
Nominal Speed (r/min): 7600
Maximum Airflow (CFM): 310
Maximum Pressure (inH2O): 2.57
Noise Level (dB): 70
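As a quick sanity check (my own arithmetic, not from the spec sheet), a few lines of Python confirm the power rating follows from the voltage and line current, and show what four of these fans add up to as intake:

```python
# Sanity-checking the Sanyo Denki 9GV1412P1G001 numbers quoted above.
volts, amps = 12.0, 4.6   # voltage rating and line current per fan
cfm_each = 310            # maximum airflow per fan
fans = 4                  # four fans ordered for the cold air intake

watts_each = volts * amps        # P = V * I, should match the 55.2 W rating
total_cfm = fans * cfm_each      # combined maximum intake airflow
total_watts = fans * watts_each  # combined maximum fan power draw

print(f"{watts_each:.1f} W each, {total_cfm} CFM / {total_watts:.1f} W total intake")
# -> 55.2 W each, 1240 CFM / 220.8 W total intake
```

That is a lot of airflow (and a lot of noise at 70 dB per fan), which is the point of picking these over normal case fans.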
The water cooling will be two loops: two 24V PMP 600 pumps, each with a 400mm Bitspower reservoir attached. The CPU and the GPUs will each run on their own loop. The PMP 600 has the highest maximum head pressure I have seen, 11m (36ft), at 20L/min (5.3 gal/min). Tubing will be neatly arranged 1/2" x 3/4" Tygon neoprene tubing. Each 560, 80mm thick rad gets Noctua NF-A14 iPPC-3000 PWM fans in push/pull (push only on the bottom of the rads near the PSU, again due to clearance issues with the 80mm thick rads). The CPU will be on the Watercool MO-RA3 420 PRO in a 9 x 140mm push/pull configuration with 9 Cougar CFD14HBR 140mm fans.
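The metric and imperial pump figures quoted above can be cross-checked with a couple of standard unit conversions (my own check, not from the manufacturer):

```python
# Cross-checking the PMP 600 spec conversions: metres -> feet, litres -> US gallons.
head_m = 11.0    # maximum head pressure, metres
flow_lpm = 20.0  # maximum flow rate, litres per minute

head_ft = head_m / 0.3048            # 1 ft = 0.3048 m exactly
flow_gpm = flow_lpm / 3.785411784    # 1 US gal = 3.785411784 L exactly

print(f"{head_ft:.1f} ft of head, {flow_gpm:.2f} gal/min")
# -> 36.1 ft of head, 5.28 gal/min
```

Both come out where the quoted spec puts them (36ft and 5.3 gal/min).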
As of right now these are the parts: some I am still waiting for, some are out and some are not. The ones in bold I have or have completed.
CPU Have not decided yet
Motherboard Have not decided yet
RAM Trident Z RGB Series DDR4
GPU Two 1080 Ti SLI or Vega CF Have not decided yet
Cooling Dual Custom water loop
Case Thermaltake The Tower 900
Storage 960 PRO 512GB M.2 (MZ-V6P512BW) RAID 0 + HDD media storage; don't know the configuration I want just yet.
PSU EVGA SuperNOVA T2, 80+ TITANIUM 1600W
Display(s) Vizio M50-C2 4K UHDTV 3840 x 2160
Keyboard Have not decided yet
Mouse Have not decided yet
Sound 9.2 Surround System + ASTRO A50
Operating System Windows 10

Table of Contents
01. 2013-NOV-14: First Hardware Tests & The Noctua NH-U9DX 1366
02. 2013-NOV-16: Temporary Ghetto Setup, OS Installed
03. 2014-APR-01: PSU Mounting & LSI Controller Testing
04. 2014-APR-02: The Disk Racks
05. 2014-APR-08: Chipset Cooling & Adventures in Instability
06. 2014-APR-09: Disk Ventilation
07. 2014-APR-11: Fan Unit for Main Compartment Ventilation
08. 2014-APR-12: Storage Topology & Cabling
09. 2014-APR-26: Storage and Networking Performance
10. 2014-MAY-10: Sound Dampening & Final Pics
PDF Version of this Build Log
http://alpenwasser.net/repository/files/apollo.pdf
Hardware - Final Config
CASE: InWin PP689
PSU: Enermax Platimax 600 W
MB: Supermicro X8DT3-LN4F
CPU: 2 × Intel Xeon L5630 (quadcore, hyperthreaded)
HS: Noctua NH-U9DX - Socket LGA1366
RAM: 24 GB Hynix DDR3 1333 MHz ECC
HBA CARD 0: LSI 9211-8i, flashed to IT mode (Tutorial)
HBA CARD 1: LSI 9211-8i, flashed to IT mode
HBA CARD 2: LSI 9211-8i, flashed to IT mode
SSD: Intel 520, 120 GB
HDD 0: WD VelociRaptor 150 GB (2.5")
HDD 1-3: Samsung HD103UJ 1 TB F1 × 3
HDD 4-7: WD RE4 2 TB × 4
HDD 8-13: WD Red 3 TB × 6
Total Raw Capacity: 29 TB
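The 29 TB headline figure is easy to verify by tallying the HDDs listed above (my own arithmetic; the SSD and capacities are quoted as marketing TB):

```python
# Tallying the raw HDD capacities from the final config list above.
drives = {
    "WD VelociRaptor 150 GB": 0.15,
    "Samsung HD103UJ 1 TB x3": 3 * 1.0,
    "WD RE4 2 TB x4":          4 * 2.0,
    "WD Red 3 TB x6":          6 * 3.0,
}
total_tb = sum(drives.values())
print(f"{total_tb:.2f} TB raw")
# -> 29.15 TB raw, i.e. the 29 TB quoted (excluding the 120 GB boot SSD)
```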
Pics of Final Form - More in Final Post
Wait, What, and Why?
So, yeah, another build. Another server, to be
precise. Why? Well, as nice a system as ZEUS is, it does
have two major shortcomings for its use as a server.
When I originally conceived ZEUS, I did not plan on using
ZFS (since it was not yet production-ready on Linux at that
point). The plan was to use ZEUS' HDDs as single disks,
backing up the important stuff. In case of a disk failure,
the loss of non-backed up data would have been acceptable,
since it's mostly media files. As long as there's an index
of what was on the disk, that data could easily be
reacquired.
But right before ZEUS was done, I found out that ZFS was
production-ready on Linux, having kept a bit of an eye on it
since fall 2012 when I dabbled in FreeBSD and ZFS for the
first time. Using FreeBSD on the server was not an option
though since I was nowhere near proficient enough with it to
use it for something that important, so it had to be Linux
(that's why I didn't originally plan on ZFS).
So, I deployed ZFS on ZEUS, and it's been working very
nicely so far. However, that brought with it two major
drawbacks: Firstly, I was now missing 5 TB of space, since I
had been tempted by ZFS to use those for redundancy, even
for our media files. Secondly, and more importantly, ZEUS is
not an ECC-memory-capable system. The reason this might be a
problem is that when ZFS verifies the data on the disks, a
corrupted bit in your RAM could cause a discrepancy between
the data in memory and the data on disk, in which case
ZFS would "correct" the data on your disk, therefore
corrupting it. This is not exactly optimal IMO. How severe
the consequences of this would be in practice is an ongoing
debate in various ZFS threads I've read. Optimists estimate
that it would merely corrupt the file(s) containing the
affected bit(s); pessimists are afraid it might corrupt your
entire pool.
The main focus of this machine will be:
- room to install more disks over time
- ECC-RAM capable
- not ridiculously expensive
- low-maintenance, high reliability and availability (within reason, it's still a home and small business server)
Modding
Instead of some uber-expensive W/C setup, the main part of
actually building this rig will be in modifying the PP689
for fitting as many HDDs as halfway reasonable as neatly as
possible. I have not yet decided if there will be painting
and/or sleeving and/or a window. A window is unlikely, the
rest depends mostly on how much time I'll have in the next
few weeks (this is not a long-term project, aim is to have
it done way before HELIOS).
Also, since costs for this build should not spiral out of
control, I will be trying to reuse as many of the scrap and
spare parts I have lying around as possible.
Teaser
More pics will follow as parts arrive and the build
progresses, for now a shot of the case:
That's all for now, thanks for stopping by, and so long.