Where the cloud meets the ground
Oct 23rd 2008
From The Economist print edition
Data centres are quickly evolving into service factories
IT IS almost as easy as plugging in a laser printer. Up to 2,500 servers—in essence, souped-up personal computers—are crammed into a
40-foot (12-metre) shipping container. A truck places the container inside a bare steel-and-concrete building. Workers quickly connect it to the electric grid, the computer network and a water supply for cooling. The necessary software is downloaded automatically. Within four days all the servers are ready to dish up videos, send e-mails or crunch a firm’s customer data.
This is Microsoft’s new data centre in Northlake, a suburb of Chicago, one of the world’s most modern, biggest and most expensive, covering
500,000 square feet (46,000 square metres) and costing $500m. One day it will hold 400,000 servers. The entire first floor will be filled with
200 containers like this one. Michael Manos, the head of Microsoft’s data centres, is enthusiastic about these containers. They solve many of the problems of putting up huge data centres: how to package and transport servers cheaply, how to limit their appetite for energy and how to install them only when they are needed, so that expensive assets are not left idle.
But containers are not the only innovation of which Mr Manos is proud.
Microsoft’s data centres in Chicago and across the world are equipped with software that tells him exactly how much power each application consumes and how much carbon it emits. “We’re building a global information utility,” he says.
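In outline, such accounting multiplies each application’s measured power draw by the hours it runs and by the carbon intensity of the local electricity grid. The short Python sketch below illustrates the idea only; the application names, power readings and carbon-intensity figure are assumptions for illustration, not details of Microsoft’s actual system.

# Illustrative sketch of per-application power and carbon accounting.
# All figures and names below are assumed, not real measurements.

app_power_kw = {          # assumed average power draw per application, in kilowatts
    "email": 120.0,
    "video": 310.0,
    "customer-data": 85.0,
}

HOURS_PER_MONTH = 730           # roughly 24 hours x 365 days / 12 months
GRID_KG_CO2_PER_KWH = 0.5       # assumed carbon intensity of the local grid

for app, power_kw in app_power_kw.items():
    energy_kwh = power_kw * HOURS_PER_MONTH        # energy used in a month
    co2_kg = energy_kwh * GRID_KG_CO2_PER_KWH      # implied carbon emissions
    print(f"{app}: {energy_kwh:,.0f} kWh/month, {co2_kg/1000:,.1f} tonnes CO2")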
Engineers must have spoken with similar passion when the first moving assembly lines were installed in car factories almost a century ago, and
Microsoft’s data centre in Northlake, just like Henry Ford’s first large factory in Highland Park, Michigan, may one day be seen as a symbol of a new industrial era.
Before Ford revolutionised carmaking, automobiles were put together by teams of highly skilled craftsmen in custom-built workshops. Similarly, most corporate data centres today house armies of “systems administrators”, the craftsmen of the information age. There are an estimated 7,000 such data centres in
America alone, most of them one-off designs that have grown over the years, reflecting both the history of the technology and the particular uses to which it has been put. It is no surprise that they are egregiously inefficient. On average only 6% of server capacity is used, according to a study by McKinsey, a consultancy, and the Uptime Institute, a think-tank. Nearly 30% of servers are no longer in use at all, but no one has bothered to remove them. Often nobody knows which application is running on which server. A widely used method to find out is: “Let’s pull the plug and see who calls.”
Limited technology and misplaced incentives are to blame. Windows, the most pervasive operating system used in data centres, in effect allows only one application to run on any one server, because running more than one risks a crash. So IT departments simply kept adding machines when new
applications were needed, leading to a condition known as “server sprawl” (see chart 3). This made sense at the time: servers were cheap, and ever-rising electricity bills were generally charged to a company’s facilities budget rather than to IT.
To understand the technology needed to industrialise data centres, it helps to look at the history of electricity. It was only after the widespread deployment of the “rotary converter”, a device that transforms one kind of current into another, that different power plants and generators could be assembled into a universal grid.
Similarly, a technology called “virtualisation” now allows physically