It offers customers varying levels of security, ranging from open racks (left) to cages (centre back) to cages with biometric and/or password protection (right). This mix means it cannot easily arrange the racks into the regimented layouts used in facilities run by sole providers, such as Rackspace's UK cloud datacentre.

What it can do, however, is put shells around the racks with built-in chimneys that conduct the hot air upwards and as far away from the computer room air-conditioning (Crac) units as possible. Maximising the separation between the hot air and the cold air boosts the efficiency of the cooling systems, according to Telecity.
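The general rationale for keeping hot and cold air apart can be sketched with the standard sensible-cooling relation (heat removed = mass flow × specific heat × temperature difference): when hot exhaust mixes back into the cold aisle, the temperature difference the cooling units work across shrinks, and so does the heat they remove per unit of airflow. The figures below are illustrative assumptions, not Telecity's numbers.

```python
# Rough sensible-cooling sketch: a cooling unit's capacity scales with
# the temperature difference between the warm air returning to it and
# the cold air it supplies. All figures are illustrative assumptions.

AIR_DENSITY = 1.2   # kg/m^3, approximate at room conditions
AIR_CP = 1005.0     # J/(kg.K), specific heat of air

def cooling_capacity_kw(airflow_m3_s: float, return_c: float, supply_c: float) -> float:
    """Heat removed (kW) = mass flow * specific heat * delta-T."""
    mass_flow = airflow_m3_s * AIR_DENSITY           # kg/s
    return mass_flow * AIR_CP * (return_c - supply_c) / 1000.0

# Poor separation: exhaust mixes with cold air, so return air is only 24 C.
mixed = cooling_capacity_kw(10.0, 24.0, 18.0)
# Good separation: hot air is kept apart, so return air arrives at 32 C.
separated = cooling_capacity_kw(10.0, 32.0, 18.0)

print(f"mixed: {mixed:.0f} kW, separated: {separated:.0f} kW")
```

The same airflow removes more than twice the heat in the well-separated case, which is the efficiency argument behind containment designs in general.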

The problems of heat will grow over time, believes Rob Coupland, Telecity's chief operating officer. He told ZDNet UK that in five or six years' time, he expects datacentre companies will have to find ways to handle higher power densities per rack, as servers grow ever more power-hungry, driven by the increasing number of processors packed into each server.

Server-simulators

To get a picture of how the datacentre consumes power day by day, Telecity has hundreds of server-simulators (pictured) that it uses to mimic the expected power load. This helps it find kinks in its power distribution systems before customers install their racks. The modelling lets Telecity stress-test the power systems' response to demand spikes and expected loads.
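The idea behind this kind of load testing can be illustrated with a minimal sketch: replay an assumed load profile against a circuit's rated capacity and flag any moment where a demand spike would push it past a safe threshold. The rated capacity, headroom rule, and load figures here are all hypothetical, not Telecity's.

```python
# Minimal sketch of stress-testing a power distribution circuit with
# simulated loads. All figures below are hypothetical assumptions.

RATED_KW = 400.0   # assumed rated capacity of one distribution circuit
HEADROOM = 0.8     # run no hotter than 80% of rating, a common rule of thumb

def find_overloads(load_profile_kw):
    """Return the indices where simulated demand exceeds the safe threshold."""
    threshold = RATED_KW * HEADROOM
    return [i for i, kw in enumerate(load_profile_kw) if kw > threshold]

# A steady base load with a couple of injected demand spikes.
profile = [250.0] * 5 + [340.0, 310.0] + [250.0] * 5
print(find_overloads(profile))   # -> [5]
```

Running such checks against simulated rather than live customer load is what lets problems surface before any customer equipment is at risk.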

Data hall in construction

The company is tripling the size of the facility with a new set of 12 data halls. The new building will use more modern construction techniques, which are expected to increase the power efficiency of the overall site.

The data halls will come online in three stages, with four going live at a time. Each hall will have its own generator and power distribution systems for added redundancy. Telecity could not say when the work would be complete.

Transformer

As with all datacentres, provisioning the extra power to the site was a challenge. Originally, Telecity ordered 24MVA of capacity from utility Scottish and Southern Energy, which has a distribution station around a mile away from the facility's north Acton site. Supplying that order proved difficult, so Telecity raised it to 60MVA, an order large enough for the utility to agree to build a substation and transformer (pictured) onsite.

Because a National Grid distribution point is around 150 metres from the substation, less power is lost in transmission between the two. The power supply was one of Telecity's main considerations when choosing the site for Powergate.
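Why a short feed loses less power follows from basic circuit physics: resistive loss is I²R, and a conductor's resistance grows with its length. The per-kilometre resistance and load current below are illustrative assumptions chosen only to show the scaling, not real figures for this site.

```python
# Back-of-envelope line-loss comparison: resistive loss (P = I^2 * R)
# grows linearly with feeder length. Figures are assumptions only.

RESISTANCE_PER_KM = 0.03   # ohms per km of conductor, assumed
CURRENT_A = 1000.0         # assumed load current

def line_loss_kw(distance_km: float) -> float:
    """I^2 * R loss for a conductor of the given length, in kW."""
    resistance = RESISTANCE_PER_KM * distance_km
    return CURRENT_A ** 2 * resistance / 1000.0

print(line_loss_kw(1.6))    # feed of roughly a mile
print(line_loss_kw(0.15))   # feed of roughly 150 metres
```

Under these assumptions the 150-metre feed dissipates about a tenth of what a mile-long feed would, which is why proximity to the grid connection matters when siting a datacentre.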

Coupland explained that the co-location provider is in talks with energy suppliers to understand "where they are in terms of grid capacity", so the two sides can plan future datacentres together.
