How Data Centers Work

Cooling and Power Concerns

The byproduct of all that computing power? Heat. Think about how warm your laptop gets, and then think about how many components are running in a single server room.

Data centers need tight environmental controls, and they take in or generate massive amounts of power to keep things running. Both are costly.

Since servers and other equipment don't fare well in extreme temperatures, most data centers have huge cooling and airflow systems that consume massive amounts of power, and sometimes water. Sensors have to be in place to monitor environmental conditions so that adjustments can be made.
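
As a rough illustration of that kind of monitoring, the sketch below checks simulated temperature and humidity readings against target ranges. The thresholds, sensor names and simulated readings are hypothetical, not values from any particular facility.

```python
import random

# Hypothetical target ranges for a cool aisle; real facilities set their own.
TEMP_RANGE_F = (65.0, 80.0)        # supply-air temperature, degrees Fahrenheit
HUMIDITY_RANGE_PCT = (30.0, 60.0)  # relative humidity, percent

def read_sensor(kind):
    """Stand-in for a real sensor query; returns a simulated reading."""
    if kind == "temperature":
        return random.uniform(60.0, 90.0)
    return random.uniform(20.0, 70.0)

def check_environment(rack_id):
    """Return a list of alerts for readings outside the target ranges."""
    temp_f = read_sensor("temperature")
    humidity = read_sensor("humidity")
    alerts = []
    if not TEMP_RANGE_F[0] <= temp_f <= TEMP_RANGE_F[1]:
        alerts.append(f"{rack_id}: temperature {temp_f:.1f} F out of range")
    if not HUMIDITY_RANGE_PCT[0] <= humidity <= HUMIDITY_RANGE_PCT[1]:
        alerts.append(f"{rack_id}: humidity {humidity:.1f}% out of range")
    return alerts

print(check_environment("rack-042"))
```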

It's not just temperature that's a problem. Factors like humidity have to be kept in check, too. In 2011, Facebook had an actual cloud, not the digital kind, form in one of its data centers; the resulting rain inside the building caused some servers to reboot and power supplies to short out. As a result, the company modified its building-management system and made its servers a little more weather resistant.

Racks of servers are often arranged in rows that create alternating aisles, with the servers' intakes all facing one aisle and their exhausts facing the next, to control airflow and temperature more efficiently. The aisle the intakes face is the cool aisle; the hot air exhausted into the other aisle is funneled away or back to the cooling system.

Power consumption is another major concern. It's absolutely necessary that these facilities have constant access to adequate power -- some even have their own power substations. A metric used to judge data center energy efficiency is power usage effectiveness (PUE): total energy use divided by the energy used purely for computation. Yahoo, Google and Facebook's PUE scores are around 1.1 or 1.2 for some of their large data centers, although 2.0 is more typical of the industry. A PUE of 2.0 means half the energy goes to computing and half to other tasks or waste [sources: Mone, Levy]. Consulting firm McKinsey & Company found that the average data center was actually using only 6 to 12 percent of its power to do computation work; the rest was lost to servers idling while waiting for the next surge of traffic, likely because resources are over-provisioned out of fear of delays and downtime [source: Glanz].
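
To make the PUE arithmetic concrete, here is a minimal worked example; the kilowatt-hour figures are invented for illustration, not measurements from any real facility.

```python
def pue(total_facility_kwh, it_equipment_kwh):
    """Power usage effectiveness: total facility energy / energy used for computing."""
    return total_facility_kwh / it_equipment_kwh

# A facility drawing 2,000 kWh while its computing gear uses 1,000 kWh:
print(pue(2000, 1000))  # 2.0 -- half the energy reaches the servers, half goes elsewhere
# A facility like the most efficient ones cited above:
print(pue(1100, 1000))  # 1.1 -- roughly 10 percent overhead for cooling, lighting, losses
```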

Lots of things are being done to reduce data centers' power and other resource needs. Server rooms used to be kept around 60 degrees Fahrenheit (15.6 Celsius), but the trend in more energy-efficient data centers is to keep them around 80 degrees Fahrenheit (26.7 Celsius), at least on the cool aisle, although not everyone has adopted this practice [sources: Mone, Levy]. The servers apparently do fine at this temperature, and it requires less cooling-related power.

There's a growing trend toward open-air cooling, drawing air from the outside rather than running lots of power-hungry air conditioning units and chillers. Another is locating data centers near ready sources of water that can be recycled for cooling, such as Google's data center in Finland, which uses seawater. A third is to locate data centers in cold climates.

Changes in the actual computing gear can help, too. Many components in data centers leak energy, meaning some of the power they draw never goes to actual processing -- it's wasted. Replacing older servers with newer, more energy-efficient models obviously helps. But equipment can also be redesigned to require less power. Most data centers use traditional off-the-shelf servers and other equipment, but Google and Facebook both use customized servers. Google's were designed to leave off unnecessary components like graphics cards and to minimize power loss at the power supply and voltage regulator. The panels that carry the manufacturer's logo are omitted to allow better airflow to and from components, and the company makes some of its own network equipment.
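
As a back-of-the-envelope illustration of how those losses compound, the efficiency figures below are made up, but they show why shaving a few percent at the power supply and voltage regulator matters.

```python
# Hypothetical efficiencies for each stage of power delivery (not measured values).
stage_efficiencies = {
    "UPS": 0.94,
    "power distribution": 0.97,
    "server power supply": 0.90,
    "voltage regulators": 0.92,
}

delivered = 1.0
for stage, efficiency in stage_efficiencies.items():
    delivered *= efficiency

print(f"Fraction of input power reaching the chips: {delivered:.2f}")
# About 0.75 with these numbers -- the other quarter becomes heat before any
# computation happens, which the cooling system then has to remove.
```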

Processors and fans can also be made to slow down when they're not needed. More efficient servers tend to throw off less heat as well, further reducing the power needed for cooling. Low-powered ARM servers, originally designed for mobile devices but reworked for server use, are making their way into data centers, too.
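
As a simplified sketch of that idea, a fan controller can map measured temperature to fan speed so the fans only spin up when the hardware is actually hot. The thresholds and curve below are illustrative, not taken from any real controller.

```python
def fan_speed_percent(temp_c, idle_temp_c=35.0, max_temp_c=75.0):
    """Illustrative fan curve: minimum airflow below idle_temp_c, full speed
    above max_temp_c, and a linear ramp in between."""
    if temp_c <= idle_temp_c:
        return 20.0   # keep a little airflow even when mostly idle
    if temp_c >= max_temp_c:
        return 100.0
    ramp = (temp_c - idle_temp_c) / (max_temp_c - idle_temp_c)
    return 20.0 + 80.0 * ramp

for temp in (30, 45, 60, 80):
    print(temp, "C ->", round(fan_speed_percent(temp)), "% fan speed")
```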

Application usage fluctuates depending on what is being done and when, and different software and web applications have different resource needs. Application resource management is therefore important for increasing efficiency and reducing consumption. Software can be custom written to work more efficiently with the system architecture. Server virtualization can also cut power consumption by cutting down the number of running servers.
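
As a rough sketch of why virtualization helps, consolidating lightly used machines onto fewer, busier hosts reduces how many servers are drawing power at all. The utilization figures below are hypothetical, and the estimate ignores memory, I/O and redundancy.

```python
import math

def hosts_after_consolidation(num_servers, avg_utilization, target_utilization=0.6):
    """Crude estimate of physical hosts needed if workloads averaging
    avg_utilization are packed as virtual machines onto hosts run at
    target_utilization."""
    total_demand = num_servers * avg_utilization
    return math.ceil(total_demand / target_utilization)

# 100 servers averaging 10 percent utilization could, in principle,
# run as VMs on about 17 hosts kept at 60 percent utilization.
print(hosts_after_consolidation(100, 0.10))  # 17
```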