Large Internet companies incur substantial costs. They have the same expenses as other businesses -- office space, payroll and benefits, to name a few. But they also have an expense most other companies don't: data centers.
Data centers are where Internet companies keep their computer servers. Depending on the company, a single data center might hold hundreds or even thousands of servers stacked in racks. Large companies like Google have several data centers in different geographic regions. The servers in these centers are the heart of Internet companies. They provide information and services to customers.
Data centers are expensive, too. Companies have to purchase or rent space to house servers and, of course, the machines themselves cost money. But what really packs a wallop to the corporate wallet are energy bills. Most data centers consume lots of energy.
It's not just the servers that drain power. Servers generate heat as they run, and if they get too hot, they shut down. That's why data centers often have extensive cooling systems to keep everything running smoothly. But most cooling systems require power themselves, and so they add to the overall electricity bill.
Exactly how much energy data centers consume is a matter of debate. In 2006, the Environmental Protection Agency (EPA) estimated that data centers accounted for 1.5 percent of all electricity consumption in the United States -- about $4.5 billion in energy costs [source: EPA]. That's a big electricity bill!
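Those EPA figures imply a rough average electricity price and a total for U.S. consumption. Here's a quick back-of-the-envelope check; the 61 billion kWh consumption figure is an assumption drawn from the widely cited EPA estimate, not from the text itself, so treat all of these numbers as approximations:

```python
# Back-of-the-envelope check of the EPA's 2006 data center figures.
# Assumption: ~61 billion kWh of data center consumption in 2006
# (the EPA report's commonly cited estimate); the cost and the 1.5
# percent share come from the text above.

datacenter_kwh = 61e9        # ~61 billion kWh (assumed EPA estimate)
datacenter_cost = 4.5e9      # $4.5 billion energy cost (from the text)
share_of_us_total = 0.015    # 1.5 percent of U.S. consumption (from the text)

# Implied average electricity price, in dollars per kWh
price_per_kwh = datacenter_cost / datacenter_kwh
print(f"Implied price: ${price_per_kwh:.3f}/kWh")

# Implied total U.S. electricity consumption in 2006
us_total_kwh = datacenter_kwh / share_of_us_total
print(f"Implied U.S. total: {us_total_kwh / 1e12:.1f} trillion kWh")
```

The implied price works out to roughly 7 cents per kWh and the implied national total to about 4 trillion kWh, both plausible for 2006, so the figures hang together.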
As demand increases, companies have to add capacity to data centers. That means the cost of electricity could become an even bigger concern in the future. But a team of experts from MIT, Carnegie Mellon University and Web application developer Akamai think they may have found a solution in the form of an algorithm.