+1 on the telephone jack alone. +more if I could.
– RBerteig, May 6 '09 at 8:04


Those power sockets should be dedicated; in other words, home-runs back to the breaker box and not daisy-chained like regular power sockets. Dedicated power sockets are often differentiated by orange receptacles.
– Scott, May 7 '09 at 21:43


Orange receptacles are not, as a rule, dedicated. Orange indicates that they are isolated ground. That said, when I checked Wikipedia (en.wikipedia.org/wiki/NEMA_connector#Color_code) to verify this, it says the color of the receptacle is no longer regulated, but receptacles with an orange triangle on them are isolated ground. Still, orange receptacles are typically isolated ground, and red is used to indicate backup power.
– sherbang, May 23 '09 at 18:25


The safe should be fireproof and waterproof. Where there is fire, there is water.
– chris, May 3 '10 at 18:47

Large enough to house your cabinet or rack. You should have at least 4 ft. of walking space in front and back, ideally all around. If you can get away with it, plan for the possibility of a second rack in the future.

Secured. You don't necessarily need an armed guard, but at least a good lock. A biometric or card-swipe reader is always good. Home Depot has locks with touch pads so you can assign codes to unlock the door.

Usually, the server room is also the telco's entry point (demarc), so you'll have your T1 smartjacks there, your PBX or phone system, etc. We usually dedicate one wall and put up plywood so telcos and providers can mount their equipment.

Air conditioning is a given. You need to keep the room at around 65-75 °F. A dedicated thermostat is preferred, since you don't want the A/C shut off in the server room on weekends or at night.
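If the A/C runs on a building schedule you can't control, even a minimal temperature alert beats finding cooked servers on Monday. A rough sketch of the check logic; the sensor-reading function here is a placeholder, so substitute whatever your hardware actually exposes (an SNMP thermostat, IPMI sensor, USB probe, etc.):

```python
def read_room_temp_f():
    """Placeholder: replace with a real query against your sensor
    (SNMP-enabled thermostat, IPMI sensor reading, USB probe...)."""
    return 72.0

def temp_out_of_range(temp_f, low=65.0, high=75.0):
    """True if the room has drifted outside the safe band."""
    return temp_f < low or temp_f > high

def check_and_alert():
    temp = read_room_temp_f()
    if temp_out_of_range(temp):
        # Placeholder alert; swap in smtplib, a pager API, etc.
        print(f"ALERT: server room at {temp:.1f} F")

check_and_alert()
```

Run it from cron every few minutes and you have a crude but serviceable early-warning system.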

Power is extremely important. Since your rack is most likely in the middle of the room, you will have cables going across the floor to reach wall-based outlets. If you can have the outlets put in the floor, that's best; if you can't, use cable covers to avoid tripping over wires. Get dedicated circuits put in for a clean line of electricity. Make sure you have extra outlets on all walls; in a pinch, access to an outlet can be critical, especially if you need to plug in a laptop or other device.

Keep a small cabinet or shelves where you can store manuals, cables, spare cards, drives, etc. You want this in an easy to access place during installations and troubleshooting. Keep this out of the way in the room, but accessible.

Cable management is critical as well, both in the rack and from the plywood wall. Over time it gets very easy to just plug cables in. If the cable management is there, it's easier to keep things organized and to label/mark both ends of all wires; the last thing you want to do is trace wires when your network is down.

For the cabinet itself, make sure you have adequate UPSs, cabinet cable management, a good KVM, a 1U slide-out keyboard/mouse/LCD to save on rack space, and plenty of ventilation. Cabinet design is a whole dissertation in and of itself!

If the room is closed off, make sure you have proper ventilation for airflow. You'll need some kind of exhaust vent so hot air can escape; if needed, use a fan to pull the air out. For fresh air, you can put a vent in the door.

Definitely a phone near the cabinet with a list of support numbers, "911" contacts, etc.

If I can, I try to have a place to hook up a laptop close by so you can access tools, test against another working system, test client software, etc.

And there's nothing wrong with a chair for when you are waiting on hold for that tech support rep to come back on the line :)

There's a lot that can go into a server room, if you can get away with a lot of this, your life as an admin will be so much better. The easier it is to get to equipment, trace the setup and get your problems solved, the more effective you can be. Good luck!

I was just watching the film 'Eagle Eye' - apparently the perfect server room involves covering the walls with oddly-coloured fishbowls, which talk via infrared (???) to your main 'CPU', which itself moves around on a robotic arm with a glowing 'eye' set into the middle. Oh and build the whole thing over a large body of water too, this will help in some way ;)

Look at everything from a risk management point of view and everything will fall into place.

Physical security: What is at risk if a malicious (or ignorant) individual gains access to the server and network hardware? Who will have the permission to enter? Server hardening required? (disable removable drive bootup, BIOS password, disable USB, etc.)

Climate control: 5 servers and 30 PCs won't make incredible heat in, say, a 20x20 room, but that's a bit much if you're stuffing it in a coat closet. Running at elevated temperatures and/or humidity will shorten the life of your hardware and lead to data loss and expensive replacements. Consider simple ventilation with a dehumidifier or possibly A/C system sized for your needs.
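You can ballpark the cooling load with simple arithmetic: essentially every watt the equipment draws becomes heat, and 1 W is about 3.412 BTU/hr. A sketch with made-up wattages; use measured draws for real sizing:

```python
WATTS_TO_BTU_HR = 3.412  # 1 watt of IT load ~ 3.412 BTU/hr of heat

# Hypothetical draw: five servers at ~350 W each (measure your own)
total_watts = 5 * 350

btu_hr = total_watts * WATTS_TO_BTU_HR
print(f"{total_watts} W -> {btu_hr:.0f} BTU/hr")
# ~1750 W works out to roughly 6000 BTU/hr, which a small
# portable or window A/C unit can handle.
```

This is why 5 servers in a 20x20 room are manageable, while the same load in a coat closet is not: the heat output is identical, but the closet has nowhere for it to go.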

Business continuity: Battery backup? Data redundancy? Fault tolerant LAN/WAN connections? Any single points of failure in your infrastructure? Do you have enough excess power to run your infrastructure and not blow a fuse if someone plugs in a vacuum cleaner?
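The "don't blow a fuse" question is easy to sanity-check on paper: a 120 V, 15 A circuit is commonly derated to 80% for continuous loads, i.e. 1440 W. A quick sketch; the wattage figures below are invented examples, so plug in your own nameplate or measured numbers:

```python
# Continuous-load capacity of one circuit: volts * amps * 0.8 derating
CIRCUIT_WATTS = 120 * 15 * 0.8  # 1440 W

# Hypothetical example loads, in watts; measure your own
loads = {
    "server-1": 350,
    "server-2": 350,
    "switch": 90,
    "ups-charging": 150,
}

total = sum(loads.values())
headroom = CIRCUIT_WATTS - total
print(f"total draw {total} W, headroom {headroom:.0f} W")
# A vacuum cleaner can pull 1000+ W; if the headroom is smaller
# than that, the vacuum does not belong on this circuit.
```

Doing this once per circuit, and keeping the spreadsheet current as gear is added, is what keeps "someone plugged in a space heater" from becoming an outage.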

Growth: Have a contingency plan in place for when management demands you double, nay, triple your infrastructure. How will all the critical dependencies scale?

From what I am reading here, most people are going for massive overkill. You have 5 servers and 30 workstations, so this sounds like a small company; I very much doubt the boss/owner will spring for a biometric scanner, pass-card scanner, or video system, unless you already have these and it would be a cheap add-on for the server room.

So I did one almost the same as what you have: 20 workstations and 8 servers.

Here is what I found went well and was a good value on the typical limited budget of a small company.

Phone, or at least a phone jack where you can move a phone in, should you have to call tech support while in front of the physical server.

A/C: even with 8 servers my room temp was about 31 °C. Since the room was in the center of the building and new ductwork was out of the question, we got one of the portable units and exhausted it into the office, on the side of the wall closest to the cold-air return for the building A/C unit. This worked well and dropped the temp to about 23 °C. It does take up a lot of space, but it is really good.

Don't bother with a rack mount monitor/KVM, I can think of better things to spend $1k on.

Get a monitor and a 10 ft VGA cable. Have a table or wall-mounted shelf next to the servers; any modern rack-mounted server has a front-mounted VGA port.

Get a wireless keyboard/mouse that use the same dongle. Just move the VGA cable and wireless dongle to the computer you need access to. Since you will be accessing everything remotely 90% of the time, it's not a big deal to do it this way.

If you can, get a rack and use rack-mounted servers, BUT make sure you get rails that allow the server to be slid out should you need to access the hardware, rather than ones that need to be unbolted.

If you can't get a rack, go to Home Depot and get the heavy-duty freestanding utility shelves; these will work just as well for desktop units.

Lock: a key or keypad one is fine. You'll have to give a key and the combo to your boss or the owner anyway; make sure the servers log who logs into them, and it won't matter as much who is in the room.

If the phone gear is there, use the plywood-on-the-wall idea.

Put the phones on their own UPS.

Get a UPS that is expandable for the servers

Get a UPS that has load banks you can remotely switch off should you need to kill a server remotely (Tripp Lite has these; it saved me a few times when I had a wonky server that would lock up and needed the power killed). Make sure you set the BIOS to power on after a power outage for all servers. The UPS will only have a few plugs, so put your critical servers on it; for me it was a DC/GC and email, which allowed me to reboot those if they crashed.

Some shelves for parts and the other "stuff" you'll be required to keep in there by the boss (lol unless you are the boss)

Dedicated power for the A/C

Dedicated power for the Phones

Dedicated power for the servers
I had a total of 6 15-amp circuits in mine, plus the lights.

Make any patch cables custom and have them as short as they can be; this will help keep things organized so you don't have cables dangling everywhere.

If you cannot put it on the second floor, put all the servers at the top of the rack and work your way down. Mine was on the second floor, so this was not an issue for me.

This is what I had in mine. Network wiring, etc., was all labeled and well organized as well; how that is set up depends on where it comes in.

Biggest things, keep it organized, and make sure you have room to expand for future growth.

WHAT!!!! Fill from the top down???? I hope your life insurance is paid up and doesn't have a "Darwin" clause. Unless the racks are anchored (to more than a raised floor panel) you should fill from the bottom up, to reduce the chances of tipping.
– Brad Bruce, May 31 '09 at 1:38


All racks should be anchored to the floor, period. It doesn't matter where the equipment is in them. And of course make sure the rack can take the weight per its specs; sorry, I forgot to mention that. It's a matter of mitigating risk: if you cannot put your server rack on the second floor, or it's stuck in a basement (as one client of mine is set up), then you don't want to start at the bottom, which would be most prone to flooding. If you cannot anchor the rack, then put the UPSs at the bottom to weight it, and the servers higher up. I think I've also seen racks with extra feet, like outriggers.
– LEAT, May 31 '09 at 4:24


You do need a KVM (preferably an IP KVM), because you don't want to spend time messing about around the back of servers, and you also want the capability to work from home (e.g. evenings/weekends).
– Techboy, May 31 '09 at 14:56

We've opted for using out-of-band cards (e.g. Dell DRAC5) for our KVM needs, rather than a dedicated IP KVM or physical KVM. A physical USB/PS2+VGA KVM means too many wires cluttering up the rack. We have a small monitor + keyboard with long cables in each environment for emergencies.
– Mike Pountney, Jun 6 '09 at 2:28


Fill from the bottom up. Cold air falls, heat rises. Fill your 'empty' RUs with spacers; this keeps the hot air in the hot aisle. You do have it set up as hot/cold aisles, right?
– toppledwagon, Aug 28 '09 at 19:49

I agree with duffbeer703. Raised floors have fallen out of style. When our data center was last upgraded, the admin at the time INSISTED on raised floors. Now whenever anything gets moved, half the floor has to be pulled up. We had to add an additional fire-suppression zone just for the under-floor area. Heavy equipment has to be lifted to get it into the raised area (and many more problems). After that admin was fired, the next one moved the network wiring to ladder racks, and we're much happier. Not worth moving the power, though: 1 outlet per rack (under the rack...).
– Brad Bruce, May 30 '09 at 16:41


+1 just for the comment on cheap data racks. I love APC racks purely because they have U-number markings on every face. It's amazing how easy it is to get that wrong.
– Mike Pountney, Jun 6 '09 at 2:17

"Sterile Environment Is A Must" -- sterile meaning aseptic in this context? I believe you mean a low-dust environment; hospital operating rooms are nearly aseptic, and it's hard work to get them to that point.
– Jesper Mortensen, Nov 28 '09 at 13:16

Power points outside the racks, so you can plug in laptops etc. easily without disturbing in-rack power.

SNMP or other remote controlled power rails
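As a sketch of what "remote controlled" means in practice: many SNMP-managed PDUs let you power-cycle a single outlet with one `snmpset`. The OID and command values below are for an APC PowerNet-style PDU as I recall them; treat them as assumptions and verify against your own PDU's MIB before use.

```python
# Build (without executing) the snmpset command to power-cycle one
# outlet on an APC-style PDU. The OID, community string, and command
# value are assumptions; check your PDU's MIB before relying on them.
OUTLET_CONTROL_OID = ".1.3.6.1.4.1.318.1.1.4.4.2.1.3"  # assumed APC OID
IMMEDIATE_REBOOT = 3  # assumed: 1 = on, 2 = off, 3 = reboot

def snmpset_cmd(host, outlet, community="private"):
    """Return the argv list for cycling the given outlet number."""
    return [
        "snmpset", "-v1", "-c", community, host,
        f"{OUTLET_CONTROL_OID}.{outlet}", "i", str(IMMEDIATE_REBOOT),
    ]

# To actually fire it: subprocess.run(snmpset_cmd("pdu1.example.com", 4))
print(" ".join(snmpset_cmd("pdu1.example.com", 4)))
```

Wrapped in a small script with per-server outlet numbers, this gives you the "kill the power on the hung box from home" capability without driving in.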

Power rails on each side of a rack should go to different distribution panels, so that work can be done without losing power to both power supplies in your servers.

Video cameras for additional security

A row of hooks in a secure place either in or near the DC so that people who work in there regularly can leave a jacket

Storage for spares, blanking plates, cables and the like

Possibly spares, especially if your vendor(s) can be persuaded to provide you with common spares in advance of warranty claims/failures. (Some vendors/resellers, under some circumstances, will provide you with, say, a couple of hard drives and power supplies. They belong to the vendor, but you can use them if you have a failure and worry about the paperwork after the fact.)

Think about DR planning from the get-go. If I had it to do all over again, I'd have separated the network gear (switches and routers) from the servers, and used a shorter rack (30U is great) so I could have just rolled the rack out the door and onto a truck that time when Verizon couldn't replace a backhoed T1 line for FOUR FREAKING DAYS and we had to move the servers to our DR location. (Not that I'm still bitter.) Also, depending on location, it's a good idea to have at least one 208 or 240 V circuit and space to put a spot cooler, for when a hurricane knocks out your HVAC in August in Texas.

As someone who has rolled a loaded 30U rack onto a truck, it's definitely not something I would want to do ever again for any reason. That much weight on one rack really stresses the elevators, and the moving companies aren't equipped to deal with it, even if they say they are. We tried this when we were moving offices, and I have to say it was the scariest experience of my professional life, watching the rack containing all our expensive equipment getting loaded onto the back of a huge truck and bouncing around on the pavement in the process. Better to unrack your gear and move it.
– Kamil Kisiel, Jun 20 '09 at 19:39

They released a video about their "container approach" which, even if only very few companies have the money to do the same, has good ideas for smaller businesses (for example, focusing on very efficient power supplies and so on).

Cable management and patch panels are a must - start your room out with everything going to patch panels and your life will be easier in the long run when you add to the system.

And lastly, take the time to set it up right the first time - what others have mentioned about spacing the racks, tables, shelves apart, cable management, etc. If you start out with a complete wreck of a room, it's probably going to just get worse over time.

From my experience, the computer room is the only place at work where you can find some peace and no one will come bother you, so I'd suggest you set up a few comforts in there for when you want some seclusion.

Not a great deal to add to the excellent answers above, but if possible I'd add a second server room, for backup/redundancy.

Provided it's far enough, or protected enough, from the other server room, then you gain a great degree of protection against the less serious disasters that may occur - a server fire causing a whole rack failure, for example.

Naturally, this does not replace a full disaster recovery plan or adequate backups - but does provide a cheap second layer of defence if required.