I have some new equipment to go into my server room, all rack mounted. The old equipment coming out is not rack mounted. I figure I should get an idea of where things are going to go before I start moving things around. I plan to put the heavy things at the bottom so I don't have to lift them so high, but is there a certain amount of space to leave between each device? Should I leave 1U or 2U between each device, or just screw them all in on top of each other? Also, if leaving space between each device, is there a good method for keeping them spaced while getting them screwed in, or is it just a matter of having enough people to hold things up while someone screws them in?

18 Replies

Any server gear should be designed to flow air front to back, so you should be able to stack them directly on top of one another. Some of my stuff is spaced out a bit, some of it is right on top of one another. I have 3 towers at the bottom, then the UPS a few inches above those, then probably 5U separating that from the next server on the list.

All that being said, it really depends on your cooling solution. If it's just ambient room temperature, then it really doesn't matter much.

I'm not sure there is a "best practice" for racking your equipment since each site can be different. My general rule of thumb is to plan out your rack. Are you planning on adding any more devices to your rack? If so, does it matter whether it sits next to a particular server or device? Within a rack, everything is close enough.

You have a good start putting your heavy stuff on the bottom. I have my UPS (20 batteries) on the bottom. It does make it tough to move the entire rack, so plan spacing around your rack accordingly as well.

When it comes to leaving 1U or 2U in between, I stack them right on top of each other, leaving nothing in between. The cooling for servers flows from front to back, so that isn't an issue. Even when stacked on top of each other, "rack" designed servers and devices are engineered to be worked on when mounted right next to each other (provided you use the cable management arm properly).

If you have any specific questions, let me know and I'll help as best I can.

I have two racks & typically separate my servers by function. Once they are in functional groups, I rack them on top of each other, then leave a 2U space between the groups (i.e. - email servers, 2U space, ERP servers, 2U space, etc.).

Another thing I usually do is put 2-6U between my UPSes & the first server in the rack. That is for two reasons: 1 - it gives me room to add more UPS batteries if needed & 2 - they can get hot, so I like to keep them separate.

As mentioned, servers are designed for front-to-back airflow, so don't worry about stacking (although some swear it is better to have at least 1U of space on each side of an external drive array).

I use Visio ahead of time to plan out my rack & voila, I'm good to go.

I have three half racks bolted to the floor. The first from the bottom up is UPS, servers, switches, router, firewall. The second is all the patch panels at the top, monitor and keyboard in the middle and UPS at the bottom. The third is all of the phone system stuff. The phone system and switches each feed into the patch panels in the middle.

Actually, the office came like this. When we moved in I didn't have anything that was rack mounted. We're migrating to rack mount as we get new things. I just had to replace the prior tenant's switches and try to stack all my stuff in the hole that was left.

There might be issues to consider where the types of equipment have an impact on thermal management within the rack. If it is full height, this might be more important than it is with half racks.

Your understanding of how to arrange things in the rack is good, but it may be that half racks were chosen for the other racks to avoid needing rack cooling, as I mentioned earlier.

Those are usually called telco racks, relay racks, or two-post racks. Half-racks are usually half-height or the top or bottom half of a full four-post rack.

Putting servers in a telco rack can be a challenge. Some companies make center-mount rails for their servers, but IMHO these are flimsy. They are also less likely to allow you to extend the rails to work on the servers without removing them from the rack.

A solution I use and recommend is to go to racksolutions.com and buy some of their two-post conversion kits. These "ears" bolt to the front and back of the rack and convert it to the equivalent of a four-post rack. That way you can use the standard rails that ship with the servers. The racked devices are supported on all four corners and don't wobble. Just make sure the racks are bolted to both the floor and the wall before loading them up.

Actually, this is kind of helpful to me as well. We purchased a rack about 18 months ago and, in a single morning that went by way too fast, just slapped everything in the rack and that was that.

Well, as you all can probably foresee, we have a cabling mess and as we expand we are starting to see issues that make maintenance a headache.

So my solution was to get some sheets of graph paper and count off a grid of 42 squares (the total unit height of the rack) by 6 squares. I took measurements of all of my equipment (there are some rackable and some non-rackable items in that rack) and started penciling out different layouts on the graph paper, outlining groups of squares to represent the different devices. I finally found one that reduced cabling by around 50%, left better room for additional items to be installed in the rack (our UPS batteries are currently not in the rack), and made everything easier to manage.
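
The graph-paper exercise above translates pretty directly into a few lines of code. Here's a hedged sketch that models a 42U rack as a list of devices and spacers, checks that everything fits, and prints where each device lands (all names and heights are made-up examples, not the poster's actual gear):

```python
# Model a rack layout as (name, height-in-U) entries, bottom to top.
RACK_UNITS = 42

layout = [
    ("UPS", 4),
    ("spacer", 2),       # gap above the UPS for heat / extra batteries
    ("server-1", 2),
    ("server-2", 2),
    ("spacer", 2),       # gap between functional groups
    ("switch", 1),
    ("patch panel", 2),
]

used = sum(height for _, height in layout)
assert used <= RACK_UNITS, f"layout needs {used}U, rack only has {RACK_UNITS}U"

# Print the layout bottom-up, with the U positions each device occupies.
position = 1
for name, height in layout:
    if name != "spacer":
        print(f"U{position:02d}-U{position + height - 1:02d}: {name}")
    position += height
```

Swapping entries around and rerunning is a lot faster than erasing pencil, and the assert catches a layout that overruns the rack before you've committed to it.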

One thing to remember, though, is to lay out what both the front and back of the rack will look like. We have our switches in the rack (at the top), so we had to make sure those didn't overlap any full-length equipment or they would have to be removed . . . and that's a whole other can of worms.

Another thing to keep in mind is your electrical load. There's nothing worse than loading up a rack and then discovering that you can't plug everything in!

Get the power requirements for every piece of equipment and add them up to make sure your UPS(es) can handle the entire load. Then also determine the number AND TYPES of electrical connections needed. Remember that most servers have at least 2 power supplies that will each need a place to plug in - and to maintain redundancy, it would be best if they could plug into separate circuits. Some equipment needs 15 amp circuits while others need 20 amp or even 30 amp. Do they need rack PDU style plugs or wall socket style?
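
The add-it-all-up step above is easy to script. This is a hedged sketch (device names, wattages, plug types, and the UPS rating are all illustrative placeholders) that totals the load and counts outlets by plug type, remembering that each power supply needs its own outlet:

```python
from collections import Counter

# (name, watts, plug type, number of power supplies) -- example values only
devices = [
    ("server-1", 450, "C13", 2),
    ("server-2", 450, "C13", 2),
    ("switch",    90, "C13", 1),
    ("storage",  600, "C19", 2),
]

UPS_CAPACITY_WATTS = 2200   # example UPS rating -- check your actual unit

total_watts = sum(watts for _, watts, _, _ in devices)
outlets = Counter()
for _, _, plug, psus in devices:
    outlets[plug] += psus   # every power supply needs its own outlet

print(f"Total load: {total_watts} W of {UPS_CAPACITY_WATTS} W available")
print(f"Outlets needed: {dict(outlets)}")
assert total_watts <= UPS_CAPACITY_WATTS, "UPS is undersized for this rack"
```

Counting outlets per plug type up front also tells you whether the PDUs you're planning actually have enough of each receptacle before anything gets bolted in.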

Label everything as you put it in. Label both ends of ALL cables! Label each piece of equipment front and back so that you can easily identify each one. It is pretty embarrassing to reach into a rack and unplug the wrong server!! (Of course, if that happens, just act surprised when the users start calling and tell them you'll check into the problem!)

There's a rack-planning tool out there that already has many common devices (but you can add your own also) with options for power loading and weight.

Speaking of weight, not many people consider the weight of a fully loaded rack. This is especially important on upper floors. Also, just because concrete is hard doesn't mean it can handle unlimited weight, even on a ground floor and especially in a concentrated area.
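
A weight total is worth the same back-of-envelope check as the power total. A minimal sketch, assuming made-up weights and a placeholder floor limit (get the real rating from building management):

```python
# All figures are illustrative, not real specs.
RACK_WEIGHT_KG = 90      # the empty rack itself
FLOOR_LIMIT_KG = 900     # placeholder -- confirm with building management

equipment_kg = [
    80,           # UPS with batteries -- usually the heaviest single item
    25, 25, 25,   # three 2U servers
    5,            # switch
]

total_kg = RACK_WEIGHT_KG + sum(equipment_kg)
print(f"Loaded rack: {total_kg} kg (floor limit {FLOOR_LIMIT_KG} kg)")
assert total_kg <= FLOOR_LIMIT_KG, "rack exceeds the floor loading limit"
```

Remember the whole load sits on four small feet or casters, so the concentrated (point) load matters as much as the total.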

Also, as someone mentioned above, cable management is important. Too long and the slack gets in the way; too short... well, you know that one. But make sure you have enough slack to slide components in and out of the rack for maintenance and replacements.

I think the UPS and heaviest servers need to be on the bottom. Less chance of injury to yourself or the server - whichever your management deems more important.

Then, it should look pretty while being functional. While diagramming, picture visitors coming in and looking around. You want it presentable, with easy access to everything.

Sort servers by relation and need. The spacing depends on how much room you have to play with. As far as putting everything in, I always prefer at least two people to actually mount things in the rack. Less chance of something going horribly wrong. If it does, you can always blame the other guy.