Large Internet companies incur a lot of costs. They have many of the same expenses other companies have -- office space, payroll and benefits, to name a few. But they also have an expense most other companies don't: data centers.
Data centers are where Internet companies keep their computer servers. Depending on the company, a single data center might hold hundreds or even thousands of servers stacked in racks. Large companies like Google have several data centers in different geographic regions. The servers in these centers are the heart of Internet companies. They provide information and services to customers.
Data centers are expensive, too. Companies have to purchase or rent space to house servers and, of course, the machines themselves cost money. But what really packs a wallop to the corporate wallet are energy bills. Most data centers consume lots of energy.
It's not just the servers that drain power. Servers generate heat as they run, and if they get too hot, they shut down. That's why data centers often have extensive cooling systems to keep everything running smoothly. But most cooling systems also require power, so they contribute to the overall electric bill.
Exactly how much energy data centers consume is a matter of debate. In 2006, the Environmental Protection Agency (EPA) estimated that data centers accounted for 1.5 percent of all electricity consumption in the United States. That's about $4.5 billion in energy costs [source: EPA]. That's a big electricity bill!
As demand increases, companies have to add capacity to data centers. That means the cost of electricity could become an even bigger concern in the future. But a team of experts from MIT, Carnegie Mellon University and Web application developer Akamai think they may have found a solution in the form of an algorithm.
Electrical Three-card Monte
How would you reduce your own electricity bill? You might try to turn off electrical appliances when you aren't using them. You could replace power-hungry devices with more efficient versions. But ultimately your power bill would depend heavily upon the price of electricity.
Electricity prices change over time. In some regions, the price changes from one hour to the next. That's where large Internet companies could stand to save millions of dollars.
These companies often have multiple data centers in different geographic regions. Some have data centers around the world. Electricity prices vary widely between regions throughout the day. If an Internet company could switch operations from one data center to another, it could take advantage of lower energy prices. The group of experts estimated that existing systems could reduce energy costs by at least 2 percent. A really flexible company might see savings in the 35 to 45 percent range [source: Qureshi, et al.].
For this to happen, the company must have several data centers. It also must practice redundancy -- that's when you make sure you have multiple copies of your system so that if one goes down you can switch to another without an interruption in service. There would also need to be an appreciable difference between the energy consumed by a system while operating under a full load versus when idle. If a data center consumes just as much power when idle as it does operating at full capacity, there's no point in switching between systems.
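That last condition -- energy elasticity -- is easiest to see with a bit of arithmetic. Here's a minimal sketch of why it matters; the function, the prices and the power figures are all made up for illustration, not taken from the study:

```python
# Illustrative sketch: what happens to the bill if a company moves its
# workload from data center A to data center B. Power draw is in kilowatts,
# prices in dollars per kilowatt-hour. All numbers are invented.

def hourly_cost_change(price_a, price_b, full_kw, idle_kw):
    """Change in the combined hourly bill if the load moves from A to B.

    Before the move: A runs at full load, B sits idle.
    After the move:  A sits idle, B runs at full load.
    A negative result means the move saves money.
    """
    before = price_a * full_kw + price_b * idle_kw
    after = price_a * idle_kw + price_b * full_kw
    return after - before

# If an idle center draws just as much power as a busy one,
# switching changes nothing:
print(hourly_cost_change(0.12, 0.08, 500, 500))  # 0.0

# With real energy elasticity (idle draw well below full load),
# shifting work to the cheaper region pays off (negative result):
print(hourly_cost_change(0.12, 0.08, 500, 150))
```

In other words, the savings come from the gap between full-load and idle power multiplied by the gap in regional prices -- if either gap is zero, so are the savings.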
The experts created an algorithm that could identify the cheapest energy prices across a list of regions and route traffic to the most-affordable data centers. Actual energy consumption wouldn't change -- this isn't a green initiative. The experts only looked at monetary costs. But companies could pair the algorithm with strategies for more efficient data centers.
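The researchers' actual optimization is more involved, but the core idea -- route traffic to the cheapest data center that can still serve the client acceptably -- can be sketched in a few lines. Everything below is illustrative: the data center names, prices and latency budget are invented, not drawn from the paper.

```python
# Illustrative sketch of price-aware request routing: send traffic to the
# cheapest data center that is still close enough to the client to keep
# latency acceptable. All names and numbers here are invented.

# Current electricity price (dollars per kWh) and rough network distance
# to the client (milliseconds) for each hypothetical data center.
data_centers = {
    "oregon":   {"price": 0.06, "latency_ms": 80},
    "virginia": {"price": 0.10, "latency_ms": 20},
    "texas":    {"price": 0.08, "latency_ms": 45},
}

def pick_data_center(centers, max_latency_ms):
    """Return the cheapest center within the latency budget, or None."""
    eligible = {name: c for name, c in centers.items()
                if c["latency_ms"] <= max_latency_ms}
    if not eligible:
        return None  # nothing close enough; fall back to default routing
    return min(eligible, key=lambda name: eligible[name]["price"])

print(pick_data_center(data_centers, max_latency_ms=50))   # "texas"
print(pick_data_center(data_centers, max_latency_ms=100))  # "oregon"
```

Notice how the answer changes with the latency budget: a strict budget rules out the cheapest region, which is exactly the performance trade-off described below.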
To take advantage of this algorithm, Internet companies will have to make sure they can switch operations from one center to another without affecting service. This might require the companies to consume more energy than normal. They may need to build out data centers to ensure redundancy. Switching may also increase the physical distance between clients and servers, which can hurt performance. Companies will have to balance the savings in energy costs against the impact switching could have on customers.
It'll require a lot of work on the back end of these systems to use the algorithm effectively. But that work could have a huge payoff. The experts estimated that Google's electric bill for its data centers was more than $38 million per year. Reducing that cost by only 3 percent would result in more than $1 million in savings annually. That could be enough to justify the effort.
Learn more about algorithms and Internet companies by following the links on the next page.
Related HowStuffWorks Articles
More Great Links
- NMS Papers by Asfandyar Qureshi
- Environmental Protection Agency. "EPA Report to Congress on Server and Data Center Energy Efficiency." Aug. 2, 2007. (Dec. 9, 2009) http://www.energystar.gov/ia/partners/prod_development/downloads/EPA_Report_Exec_Summary_Final.pdf
- Knight, Will. "Energy-Aware Internet Routing." Technology Review. Aug. 17, 2009. (Dec. 9, 2009) http://www.technologyreview.com/business/23248/?a=f
- Qureshi, Asfandyar, et al. "Cutting the Electric Bill for Internet-Scale Systems." MIT. Aug. 17, 2009. (Dec. 9, 2009) http://nms.lcs.mit.edu/papers/sigcomm372-aqureshi.pdf
- Reardon, Marguerite. "Energy-aware Internet routing coming soon." CNET. Aug. 18, 2009. (Dec. 9, 2009) http://news.cnet.com/8301-11128_3-10312408-54.html