Building new whitebox servers for VMware home lab

I have needed to add more capacity to the home lab for a while now, but have taken my time. In the past I gathered up enterprise servers that were a couple of generations old. These have always served me well, but they hold a limited amount of memory, upgrading them is pretty expensive, and they are very loud. So I decided to go another direction and build a couple of whitebox servers based on common desktop parts. I have been watching for sales and collecting the parts to build them, and after finding a couple of good deals lately I finally had everything needed to build two hosts.

Another decision I had to make was whether I needed a server-class motherboard or whether a desktop one would work. After thinking about it I decided that a desktop motherboard would work just fine and probably save a few dollars in the build cost. At this point I almost never use the out-of-band management access on my enterprise servers, and since they are just down in the basement I can easily run down and access them if needed.

I also did not need the ability to use VT-d, so a server board was even less important. I simply needed hosts with good processing power and more RAM. It really comes down to memory for me: I needed the ability to run more VMs so that I don't have to keep turning things on and off.

The Why:

This type of lab is important to me for personal learning and for testing out configurations for the customer designs I work on during the day. I have access to a sweet lab at work, but it's just better to have your own lab where you are free to do what you want, and the poor bandwidth at my house makes remote access painful.

I want the ability to run a View environment, the vCloud Suite, and my various other tools all at once. With these new hosts I will be able to dedicate one of my older servers as the management host and a pair of the older servers as hosts for VMware View. That leaves the two new hosts to run the vCloud Suite and other tools.

The How:

I have set the hosts up to boot from USB sticks and plan to use part of the 60GB SSD drives for host cache. The remaining disk space will be used for VMs. Each host will have 32GB of RAM, which is the maximum the motherboard supports across its 4 slots. There is an onboard gigabit network connection, a Realtek 8111E according to the specs, and I can report that after loading vSphere 5.1 the network card was recognized and worked without issue. I also had a couple of gigabit network cards laying around that I installed for a second connection in each host.
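For anyone repeating this build, the quickest way to confirm that ESXi picked up the onboard Realtek and sees all of the installed memory is from the ESXi shell (or over SSH). A minimal sketch, assuming a vSphere 5.1 host; the vmnic names and exact output will vary with your hardware:

```shell
# List physical NICs -- the onboard Realtek 8111E should show up here
# as a vmnic with a driver bound, and "Up" once it is cabled.
esxcli network nic list

# Confirm the host sees the full 32GB of installed RAM.
esxcli hardware memory get
```

If a NIC is missing from the list, the driver was not loaded, which on whitebox builds usually means checking the HCL or community driver packages for that chipset.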

The case came with a fan included, but I added another for better cooling and airflow. Even with multiple fans running the hosts are very quiet, since there are no spinning disks in them, and they put out very little heat. I could probably have reduced the noise and heat a bit more by choosing a fanless power supply, but those run over $100 and it was not a priority for me.

Hardware List:

Here is a list of the parts each server was built with. I was able to build these systems for under $500 and add a good amount of capacity to my lab. I kept a close eye on SlickDeals for the parts I needed and was able to score some good deals from Newegg, TigerDirect, and Amazon.

For the short term I will be relying on the local SSD drives in the hosts and my old Iomega IX2. The Iomega serves up iSCSI and NFS shares to all my hosts. It has 1TB of capacity, but performance gets pretty slow once you have more than a couple of VMs running on it, and cloning anything takes a while.
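If anyone wants to replicate the Iomega setup, mounting its NFS export on each host is a one-liner from the ESXi shell. A sketch, assuming vSphere 5.x; the hostname `ix2`, export path `/nfs/lab`, and datastore name `ix2-nfs` are placeholders for your own values:

```shell
# Mount the NFS export from the Iomega as a datastore on this host.
esxcli storage nfs add --host ix2 --share /nfs/lab --volume-name ix2-nfs

# Verify the datastore mounted and is accessible.
esxcli storage nfs list
```

Repeat on each host so they all see the same shared datastore, which is what features like vMotion and HA expect.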

I also ordered a pair of Samsung 256GB SSD drives from Amazon. I found them on sale for $154 with free shipping through my Prime membership. These are good drives and are supposed to be fast, so I am exploring options for how to use them. I am considering several things: experimenting with the vSphere VSA, the Nexenta community storage VSA, or just using them as local datastores. Whichever way I go, they will provide a much needed bump in performance.
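Whichever option I land on, ESXi needs to recognize the Samsung drives as SSDs before features like host cache (swap to SSD) are offered on them. A quick sanity check from the ESXi shell, sketched for a 5.1 host:

```shell
# Each local drive is listed with an "Is SSD:" field; it should read
# "true" for the Samsung drives before using them for host cache.
esxcli storage core device list | grep -i "Is SSD"
```

If a drive shows as non-SSD, a SATP claim rule can be added to tag it, but on common desktop SATA controllers the detection usually just works.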

Long term I need to invest in a better-performing shared storage solution. Something like a bigger Iomega or a Synology device would be ideal, but for now they are pretty expensive and I will need to save up for one, unless there is a friendly vendor that would like to sponsor me or donate something.

Anyways, I hope these details help others who are looking to build a home lab. I will try to get around to doing some performance testing on these hosts and post the results, but that may take a bit with my current schedule. If you have any questions, drop me a note in the comments or send me a message.

Brian is a VCDX5-DCV and a Sr. Tech Marketing Engineer at Nutanix, and the owner of this website. He is active in the VMware community and helps lead the Chicago VMUG group, specializing in VDI and cloud project designs. He was awarded VMware vExpert status 6 years running, 2011 - 2016. Certifications: VCP3, VCP5, VCP5-IaaS, VCP-Cloud, VCAP-DTD, VCAP5-DCD, VCAP5-DCA, VCA-DT, VCP5-DT, Cisco UCS Design.

Brian –
Thanks for the info and the post on this vetted build. I'm in need of a system and have been looking around the net for more info – post-builds that show how the systems actually ended up – and yours sounds like something I'd like to 'build off of'. I purchased a Synology 1812+ a couple of months ago and am planning on using it for my storage needs. What I'm hoping to do is recreate, on a smaller scale, some of my production environment so that I can test a 'DR' solution, and I think these components might work. With work possibly helping out with the costs, I feel fortunate. I don't have a lot of room for 2 'normal' towers and wonder if you think these parts would fit into a micro-ATX or small form factor case, or something equivalent? If not, do you have any suggestions for similarly built equipment in a smaller form factor? And, thanks again…

Hi.
I’ve been following this thread and like all the parts – but was wishing for a smaller form factor. Did you change any parts other than the board, from Brian’s build, to fit into a micro-ATX case – and can you suggest the case that you bought?
thanks,
mark

Brian, thanks for the tips on this… It’s always helpful when someone has vetted a build. This one turned out great: it was cheap yet powerful enough to be usable, and best of all it’s smaller and quieter than my previous AMD builds. Hats off to you!

Great article Brian! I am going to try to build my own mini lab based on this. Quick question: I don’t have a basement to store my servers in. How quiet are your servers, and how much heat do they put out? I live in a small apartment, so I don’t have much space and have to keep an eye on cooling. I was thinking of just getting a Mac mini or two and using them for my lab, but they are very costly. Thoughts?

Hello. The only thing that needs a specific motherboard/CPU combination to work is the Direct I/O (VT-d) feature. It is also rarely used, so I would not worry about it for lab use. To verify what your configuration will support, you can look up the parts on the VMware HCL.

Yes, both of the whiteboxes that I built are still going strong. The only trouble I had was that the motherboard in one of the boxes died as a result of a storm that caused power issues for me. Lesson learned: I need to invest in a UPS for the lab. It was replaced under warranty and has been going strong since.

I was not terribly worried about memory speed when purchasing. I figured anything was going to be a vast improvement over the 5+ year old gear I was using, and this type of memory was also very cost effective at the time I built these boxes.

Thanks for the specs. Do you use your SSD drive only for caching, or also for your datastore? I also have an Iomega IX2, but I want some “fast” local storage as well for when I test things. I am considering an SSD for local storage, or the SATA Western Digital WD5000HHTZ 500 GB hard drive (SATA 600, VelociRaptor). Richard