FAQ: How we built our own server for shared storage on the (relatively) cheap

Earlier this year I set up our own server as a shared storage testbed. Since we got it up and running, I've fielded a lot of questions about it, so I put together this quick guide covering everything from the parts list to performance testing.

Why did you build your own server?

We were evaluating shared storage solutions for the wider team and realized we weren't quite ready to take the leap into a system that could support 10+ editors working with multiple streams of RED footage. It'd require a lot of hardware and major infrastructure upgrades.

We decided to pivot our focus to a smaller project where we could make a big impact with less work. Thus the graphics server project was born.

Why is shared storage important?

We have a two-person internal graphics/VFX department, and they collaborate a lot with each other and with our team of editors. Our current storage workflow (individual project drives) means a lot of passing physical drives back and forth between computers, which can get really confusing.

The graphics team also needs to work in parallel with our editors, so giving them their own dedicated storage space means they're not waiting on drives to free up or sending files that need to be organized later.

Because the server is networked, the whole team has access, so folks can send assets and browse for the content they need on their own time. Someone being out of the office doesn't mean you can't get the render you need to finish your edit.

Can you edit video off of it?

It's not designed to do that, but technically one or two people could edit off the network and have an ok experience. The performance would be similar to working off an external USB 3.0 drive.

Because this unit has a 10GbE-capable SFP+ port, you could build a 10 gigabit network that would (in theory) support much better performance, letting multiple people edit high-resolution footage at once. I've been told that 10GbE infrastructure can support a team of 10 editing 6K RED footage no sweat, so the limiting factor might be the processing power of the NAS itself.
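As a sanity check on that claim, here's some back-of-envelope math. The per-stream bitrate (~85 MB/s for a 6K REDCODE stream) and the 80% overhead factor are my assumptions, not measured figures:

```python
# Back-of-envelope check on the 10-editor claim. The per-stream bitrate
# and protocol-overhead factor below are assumptions, not measurements.

link_gbps = 10                       # 10GbE SFP+ link
link_mbs = link_gbps * 1000 / 8      # theoretical max: 1250 MB/s
usable_mbs = link_mbs * 0.8          # assume ~80% after protocol overhead

stream_mbs = 85                      # assumed 6K REDCODE stream, in MB/s

streams = int(usable_mbs // stream_mbs)
print(f"Usable bandwidth: {usable_mbs:.0f} MB/s")       # 1000 MB/s
print(f"Streams at {stream_mbs} MB/s each: {streams}")  # 11
```

Eleven streams on paper, so the 10-editor figure is at least plausible before the NAS's own CPU becomes the bottleneck.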

It sounds promising though and I would love to test that scenario soon.

What components go into the server?

Our graphics server is built around a rack-mounted QNAP NAS. After a lot of research, I decided to go with the QNAP TS-431XeU-2G. It's a four-bay NAS with dual gigabit Ethernet as well as an SFP+ port for 10GbE capability.

It's running four WD Gold 4TB enterprise hard drives in a RAID 10 array for a total of 8TB of usable storage. This setup gives us the redundancy of mirroring along with the speed of striping. With it we max out our Ethernet connections at about 250 MB/s read/write over the network.
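Both of those numbers fall straight out of the arithmetic; a quick sketch:

```python
# The capacity and throughput numbers above, worked out.

drives = 4
drive_tb = 4

# RAID 10 stripes across mirrored pairs, so half the raw capacity is usable.
usable_tb = drives * drive_tb / 2
print(f"Usable capacity: {usable_tb:.0f} TB")  # 8 TB

# One gigabit link tops out near 125 MB/s; two aggregated links explain
# the ~250 MB/s we see over the network.
links = 2
per_link_mbs = 1000 / 8
print(f"Aggregate ceiling: {links * per_link_mbs:.0f} MB/s")  # 250 MB/s
```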

The server lives in a compact 6U enclosed rack, which makes the already quiet NAS nearly silent. We also have a 1U battery backup unit from CyberPower, connected to the NAS over USB so that in the event of a power outage the battery triggers a clean shutdown to protect the server.

How much did it cost?

NAS: $550
Hard drives: $700 ($175 x 4)
Accessories: $300

Total: $1550

How complicated was it to set up?

If you're the go-to technical support person for your family's network issues, you probably have enough technical skill to set this up. The QNAP OS is easy to use and well documented, plus they offer solid tech support via phone.

It's a popular NAS manufacturer and model so there are lots of good online resources from other users.

My basic workflow for getting up and running was simple:

1. Initialize the RAID

2. Configure the network so the server has a static local IP address

3. Disable all services except AFP (we only have Macs at the office)

4. Set up the shared storage space (a single empty folder)

5. Set up our team with individual logins

I also spent some time setting up a cloud backup using the built-in configurator and made sure to enable the 'network recycle bin'. Both help mitigate accidental data loss.

Including physically building the rack enclosure and getting everything installed, it took me an afternoon to go from unboxing to speed testing.
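For the speed testing itself, a crude `dd` run from a client machine is enough to ballpark sequential throughput. This is a sketch, not a benchmark: the share path is a placeholder, and `TARGET` defaults to a local temp directory so the commands run as-is.

```shell
# Crude sequential speed test. Point TARGET at the mounted share
# (e.g. /Volumes/GraphicsServer -- a placeholder name); it defaults
# to a local temp dir so these commands also run as-is.
TARGET="${TARGET:-$(mktemp -d)}"

# Write 64 MB of zeros and note the throughput dd reports.
# (bs=1048576 works with both BSD/macOS and GNU dd.)
dd if=/dev/zero of="$TARGET/speedtest.tmp" bs=1048576 count=64

# Read the file back.
dd if="$TARGET/speedtest.tmp" of=/dev/null bs=1048576

# Clean up.
rm "$TARGET/speedtest.tmp"
```

For a network test you'd want a file larger than the NAS's RAM cache; 1 GB or more gives a more honest number.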

How complicated is it to use?

Once you log in to the server, it shows up just like any other external drive. If someone else has a file open, you won't be able to open it, but otherwise it behaves as you'd expect.
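If you'd rather connect from Terminal than hunt for the server in Finder's sidebar, you can assemble the AFP URL yourself. A minimal sketch; the IP address and share name below are assumptions standing in for your own setup:

```shell
# The static IP and share name are placeholders; substitute your own.
SERVER_IP="192.168.1.50"
SHARE="GraphicsServer"
URL="afp://$SERVER_IP/$SHARE"
echo "$URL"

# On a Mac, this opens the share in Finder and prompts for your login:
#   open "$URL"
```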

Deleted files go to the network recycle bin on the drive instead of your local bin.

It's also been extremely easy to admin. Since it was set up in March, I've only had to troubleshoot one problem, and that turned out to be a user's Ethernet adapter that had been inadvertently disabled. I've had zero hardware, software, or networking issues with the server itself.

There's an easy-to-follow setup and troubleshooting guide on our internal helpdocs site, which saves me time answering basic questions and getting new folks logged in.

Do people like using it?

I'm biased (and technical), so it's easy for me to say how awesome it is, but ultimately I wanted a system that would be easy to use for everyone. Here's what Kris from our graphics team has to say about it:

"Having a server system with automated/redundant backups definitely helps give us some peace of mind and frees us from having to commit to a more manual process. On top of that, the ease with which we can share resources and project assets is a huge plus."