Data center managers need to measure temperature and humidity in a data center, and now they can use a small robot from IBM to perform that function. This 1-minute, 38-second video shows the IBM robot and what it can do in a data center. The data from the robot is used to develop an efficiency map, which shows temperature differences across the data center and can guide you in making your data center airflow more efficient.

This video of a LEGO Data Center is clearly a detailed homage to a real data center, built and filmed by YouTube user “eduardotanaka.” He used 5,772 pieces, 28 Minifigures, 1 Light Brick, and 1 meter of fiber optic strand, and spent 8 hours building this unique creation. You can build your own too, using his model and instructions, which were uploaded to the LEGO Digital Designer (LDD) gallery at the LEGO web site (search for data center).

He notes that he has plans for future expansion: Facilities and security management with temperature, sound, IR and door sensors as well as automatic locks powered by LEGO NXT are on the drawing board. We think he’s probably going to install real, yet tiny, working IT equipment in the future! This video runs 4 minutes.

CoreSite Realty recently brought new data center space online in Santa Clara, Calif. The company has leased the first completed pod of 18,000 square feet of space at 2972 Stender, the second building on its Santa Clara campus, and plans to bring an additional 33,000 square feet online by the first quarter of 2012. The cooling system used by CoreSite leverages a built-in swamp cooler (a Munters unit) with air-side economization to extend “free cooling” using outside air. This three-minute video provides a time-lapse view of the construction of 2972 Stender, starting with empty land and ending with the completed data center building.

The i/o ANYWHERE® modular data center is delivered as a system and includes 100% of the critical infrastructure required for a high-density, always-on data center. From power delivery to cooling to network connectivity to 24x7xForever infrastructure monitoring, this patent-pending system is a fully integrated data center solution.

Not to be confused with a stand-alone ISO containerized data center design, the i/o ANYWHERE® modular data center system comprises up to 20 modules and can be customized to the customer’s requirements. This portable data center can be delivered to the customer’s location, to a dedicated off-site location, or to one of i/o’s data centers in a matter of weeks – not months or years. For more information about i/o’s modular data center, i/o ANYWHERE®, please watch the video above.

AST says the unit achieves an actual metered Power Usage Effectiveness of 1.09. The climate in Sydney allows the customer to use indirect airside economizers (free cooling) for 92 percent of the year. The system is controlled by management software that can switch between several modes of cooling: 100 percent free cooling at temperatures below 19°C (66°F), partial economization using adiabatic cooling, or mechanical cooling at temperatures above approximately 26°C (79°F). This video provides an overview of AST’s Natural Free Cooling design, with illustrations of airflow and the air-to-air heat exchanger. It runs about 5 minutes, 30 seconds.
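The mode-switching logic described above can be sketched in a few lines. This is a hedged illustration, not AST’s actual management software: the threshold values come from the article, while the function name and the omission of hysteresis and sensor handling are simplifying assumptions.

```python
# Illustrative sketch of temperature-based cooling mode selection.
# Thresholds are from the article; everything else is hypothetical.

FREE_COOLING_MAX_C = 19.0   # below this: 100% free (outside-air) cooling
MECHANICAL_MIN_C = 26.0     # above this: fall back to mechanical cooling

def cooling_mode(outdoor_temp_c: float) -> str:
    """Return the cooling mode for a given outdoor temperature in Celsius."""
    if outdoor_temp_c < FREE_COOLING_MAX_C:
        return "free"        # outside air alone removes the heat load
    if outdoor_temp_c <= MECHANICAL_MIN_C:
        return "adiabatic"   # partial economization with evaporative assist
    return "mechanical"      # too warm outside; use compressor-based cooling

print(cooling_mode(15.0))  # free
print(cooling_mode(22.0))  # adiabatic
print(cooling_mode(30.0))  # mechanical
```

A real controller would add hysteresis around each threshold so the system does not rapidly toggle between modes as the outdoor temperature hovers near a boundary.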

A look at the inside of a Google data center in South Carolina, showing tape storage modules.

Google today released a video showcasing the security and data protection practices in its data centers, which includes some interesting footage from the company’s data center in South Carolina. Most of the tour focuses on physical security and access control, including the security gates and biometric tools (iris scanners, in this case). It also showcases Google’s methodology for wiping and destroying hard disk drives when they fail or are taken out of service, including an on-site disk shredder. At about the 4 minute mark there’s the briefest of glimpses of the data center area, which shows tape libraries. This video runs about 7 minutes.

Near the end of the video there’s a reference to Google’s use of additional security measures not shown in the video – which can only be a reference to the sharks with friggin’ laser beams on their heads. Here’s a look at some of the coverage from Google’s previous disclosures about its data centers at the 2009 Data Center Efficiency Summit:

SGI was one of the early players in the container data center sector with its water-cooled ICE Cube portable unit. Last week the company unveiled a retooled ICE Cube modular data center that can be cooled entirely by air. The fresh air cooling allows the unit to run outdoors in cool climates, improving energy efficiency by forgoing mechanical refrigeration.

At the Gartner Data Center Conference, SGI’s Patrick Yantz gave DCK a detailed tour of the new unit. Patrick provides an overview of the new orientation of the ICE Cube module, which allows easy expansion, and demonstrates how SGI’s software management package can remotely throttle fans up and down. This video runs about 13 minutes.

Video: Google’s Finland Data Center Project

We recently noted that Google will use water from the Baltic Sea to cool its new data center in Hamina, Finland. Google has refurbished the water pumps used at the former newsprint plant, and will use large pipes to draw cool water from the nearby Baltic Sea. Google bought the former Stora Enso newsprint plant for $51 million in 2009, and expects to invest 200 million Euros (about $252 million) in the project. A video posted on YouTube provides a look at the inside of the Hamina facility as Google begins retrofitting the massive facility for use as a data center. Google engineers Joe Kava and Alistair Verney discuss the building’s infrastructure and innovative cooling system, which Google says has not been attempted before.

A look at the interior of a former paper mill in Hamina, Finland that Google is converting into a data center.

Google will use cool sea water in the cooling system for its new data center.

A number of projects use cold water from large fresh water lakes for cooling. Cold water cooling systems that tap nearby bodies of water tend to have a high up-front cost in the pipe work, but offer huge savings over the long run.
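The up-front-cost-versus-long-run-savings tradeoff above boils down to a simple payback calculation. The figures below are entirely hypothetical, chosen only to illustrate the arithmetic; no real project’s numbers are cited here.

```python
# Illustrative only: hypothetical figures, not from any real project.
# Simple payback period for a lake/sea-water cooling loop: up-front
# piping cost divided by the annual energy saved versus running chillers.

def payback_years(capital_cost: float, annual_savings: float) -> float:
    """Years until cumulative savings cover the up-front cost."""
    return capital_cost / annual_savings

# Hypothetical example: $8M of intake/outfall pipe work, $1.6M/year
# saved by not running mechanical chillers.
print(payback_years(8_000_000, 1_600_000))  # 5.0
```

After the payback point, the low operating cost of pumping naturally cold water continues for the life of the facility, which is why such projects can justify the heavy initial civil-engineering spend.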

Google built its first chiller-less data center in Belgium. Chillers, which are used to refrigerate water, are widely used in data center cooling systems but require a large amount of electricity to operate. With the growing focus on power costs, many data centers are reducing their reliance on chillers to improve the energy efficiency of their facilities. Google, Yahoo and Microsoft have all built facilities that use fresh air to cool their server rooms. In the Hamina example, Google is using chilled water, but effectively using the sea as its chiller.