How Data Centers Work

You may not think about it, but the data you access online every day travels through data centers.

There was a time when our information needs were simpler. We had TV shows broadcast into our homes at set times on just a handful of channels, we typed up memos and letters in triplicate for paper distribution and backup, and we had conversations on phones wired to the wall. Even cell phones were once used just for making calls.

But since the dawn of the Internet, high-bandwidth broadband, smartphones and other new technologies, we are constantly online and constantly demanding that data be delivered to our computers, gaming systems, TVs and our phones. While paper documents still exist, we get lots of what used to be paperwork in the form of e-mail, Web pages, PDFs and other digitized files generated by software and rendered on computer screens. Even books are going from pulp to images on our computers, mobile devices and e-readers.

Electronic exchange of data is required for just about every type of business transaction, and is becoming the norm for many of our personal interactions. Even things that used to be analog, like TV broadcasts and phone calls, are largely delivered in digital form over wires and radio waves. And at a far greater volume than ever before. Whether it's government forms or instructions for baking a tuna casserole or a streamed TV show, we want to be able to call it up online, and we want it now.

With this massive demand for near-instantaneous delivery of digital information came the need for concentrations of computer and networking equipment that can handle the requests and serve up the goods. Thus, the modern data center was born.

What is a data center?

Data centers are simply centralized locations where computing and networking equipment is concentrated for the purpose of collecting, storing, processing, distributing or allowing access to large amounts of data. They have existed in one form or another since the advent of computers.

In the days of the room-sized behemoths that were our early computers, a data center might have had one supercomputer. As equipment got smaller and cheaper, and data processing needs began to increase -- and they have increased exponentially -- we started networking multiple servers (the industrial counterparts to our home computers) together to increase processing power. We connect them to communication networks so that people can access them, or the information on them, remotely. Large numbers of these clustered servers and related equipment can be housed in a room, an entire building or groups of buildings. Today's data center is likely to have thousands of very powerful and very small servers running 24/7.

Because of their high concentrations of servers, often stacked in racks that are placed in rows, data centers are sometimes referred to as server farms. They provide important services such as data storage, backup and recovery, data management and networking. These centers can store and serve up Web sites, run e-mail and instant messaging (IM) services, provide cloud storage and applications, enable e-commerce transactions, power online gaming communities and do a host of other things that require the wholesale crunching of zeroes and ones.

Just about every business and government entity either needs its own data center or needs access to someone else's. Some build and maintain them in-house, some rent servers at co-location facilities (also called colos) and some use public cloud-based services at hosts like Amazon, Microsoft, Sony and Google.

The colos and the other huge data centers began to spring up in the late 1990s and early 2000s, sometime after Internet usage went mainstream. The data centers of some large companies are spaced all over the planet to serve the constant need for access to massive amounts of information. There are reportedly more than 3 million data centers of various shapes and sizes in the world today [source: Glanz].

Why do we need data centers?

The idea that cloud computing means data isn’t stored on computer hardware isn’t accurate. Your data may not be on your local machine, but it has to be housed on physical drives somewhere -- in a data center.

Even though hardware is constantly getting smaller, faster and more powerful, we are an increasingly data-hungry species, and the demand for processing power, storage space and information in general is growing, constantly threatening to outstrip companies' ability to deliver.

Any entity that generates or uses data has the need for data centers on some level, including government agencies, educational bodies, telecommunications companies, financial institutions, retailers of all sizes, and the purveyors of online information and social networking services such as Google and Facebook. Lack of fast and reliable access to data can mean an inability to provide vital services or loss of customer satisfaction and revenue.

A study by International Data Corporation for EMC estimated that 1.8 trillion gigabytes (GB), or around 1.8 zettabytes (ZB), of digital information was created in 2011 [sources: Glanz, EMC, Phneah]. The amount of data created in 2012 was approximately 2.8 ZB, and it is expected to rise to 40 ZB by the year 2020 [sources: Courtney, Data Science Series, EMC].
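
To put those units in perspective, here's a quick back-of-the-envelope check of the figures above in Python (it assumes the usual decimal units, where 1 ZB equals 10^21 bytes):

    # Rough sanity check of the digital-universe estimates cited above,
    # using decimal units: 1 GB = 10**9 bytes, 1 ZB = 10**21 bytes.
    GB = 10**9
    ZB = 10**21

    data_2011_bytes = 1.8e12 * GB    # 1.8 trillion gigabytes
    print(data_2011_bytes / ZB)      # -> 1.8 zettabytes

    print(40 / 2.8)                  # -> roughly 14x growth from 2012 to 2020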

All of this media has to be stored somewhere. And these days, more and more things are also moving into the cloud, meaning that rather than running or storing them on our own home or work computers, we are accessing them via the host servers of cloud providers. Many companies are also moving their professional applications to cloud services to cut back on the cost of running their own centralized computing networks and servers.

The cloud doesn't mean that the applications and data are not housed on computing hardware. It just means that someone else maintains the hardware and software at remote locations where the clients and their customers can access them via the Internet. And those locations are data centers.

Data Center Scale and Design

When we think of data centers, many of us envision huge warehouses full of racks of servers, blinking and humming away, wires running to and fro. And in some cases we'd be right. But they come in all shapes, sizes and configurations. They range from a few servers in a room to huge standalone structures measuring hundreds of thousands of square feet with tens of thousands of servers and other accompanying hardware. Their sizes and the types of equipment they contain vary depending upon the needs of the entity or entities they are supporting.

There are various types, including private cloud providers like the colos, public cloud providers like Amazon and Google, companies' own private data centers, and government data centers like those of the NSA or various scientific research facilities.

They are not staffed like offices with one person per computer, but with a smaller number of people monitoring large numbers of computers and networking devices, as well as power, cooling and other necessary building facilities. Some are so big that employees get around on scooters or bicycles. The floors have to hold more weight than a typical office building because the equipment can get heavy. They also need high ceilings to accommodate tall racks, raised floors and ceiling-hung cabling, among other things.

Many companies with heavy online presences have large data centers located all over the world, including Google, Facebook, Microsoft, AOL and Amazon. Microsoft reportedly adds 20,000 servers monthly [source: Uddin], and Google has around 50,000 servers at just one of its many sites [source: Levy].

Google has thirteen big data centers, including locations in Douglas County, Ga.; Lenoir, N.C.; Berkeley County, S.C.; Council Bluffs, Iowa; Mayes County, Okla.; The Dalles, Ore.; Quilicura, Chile; Hamina, Finland; St. Ghislain, Belgium; Dublin, Ireland; Hong Kong; Singapore; and Taiwan; as well as lots of mini data centers, some even in co-location sites. The tech giant is also prone to experimenting with design. For instance, around 2005, Google used shipping containers packed with server equipment in its data centers, and it has since moved on to other custom designs.

The configuration of servers, the network topology and the supporting equipment can vary greatly depending upon the company, purpose, location, growth rate and initial design concept of the data center. Its layout can greatly affect the efficiency of data flow and the environmental conditions within the center. Some sites might divide their servers into groups by functions, such as separating web servers, application servers and database servers, and some might have each of their servers performing multiple duties. There are no hard and fast rules, and there aren't many official standards.

Of course, some groups are trying to create guidelines. The Telecommunications Industry Association developed a data center tier classification standard in 2005 called the TIA-942 project, which identified four categories of data center, rated by metrics like redundancy and level of fault tolerance. These include:

  • Tier 1 - Basic site infrastructure with a single distribution path that has no built-in redundancy.
  • Tier 2 - Redundant site infrastructure with a single distribution path that includes redundant components.
  • Tier 3 - Concurrently maintainable site infrastructure that has multiple paths, only one of which is active at a time.
  • Tier 4 - Fault tolerant site infrastructure that has multiple active distribution paths for lots of redundancy.

[Sources: DiMinico, Uddin]

In theory, sites that fall into tier 1 and 2 categories have to shut down for maintenance occasionally, while tier 3 and 4 sites should be able to stay up during maintenance and other interruptions. A higher number translates to both a higher level of reliability (meaning less potential downtime) and a higher cost.
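
To make those distinctions a little more concrete, here is a minimal, purely illustrative Python sketch of how a site might be bucketed into one of the four tiers based on the criteria above. The function and its inputs are hypothetical simplifications of the TIA-942 criteria, not an official rating tool:

    def classify_tier(redundant_components, multiple_paths, multiple_active_paths):
        """A very simplified reading of the TIA-942 tier descriptions above."""
        if multiple_active_paths:
            return 4  # fault tolerant: multiple active distribution paths
        if multiple_paths:
            return 3  # concurrently maintainable: multiple paths, one active at a time
        if redundant_components:
            return 2  # redundant components on a single distribution path
        return 1      # basic: single path, no built-in redundancy

    # A site with redundant components and multiple paths, only one active at a time:
    print(classify_tier(True, True, False))  # -> 3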

The standard also spells out recommendations for cabling, facilities infrastructure (like environmental control and power) and other design concerns. These are aimed at the telecommunications industry but can be applied to other data centers. It is one of the few ways to rate and compare data centers by overall design and functionality.

Not all data centers follow these standards. And the data centers of today are such a new phenomenon that there aren't specific building codes for them in most areas at the moment. They are generally lumped in under some other generic building category.

Their layouts, equipment and needs are continuously evolving, but there are some common elements you will find in a lot of data centers. Read on to find out more.

Computer Hardware

Though the physical layouts vary, every data center has server clusters.

One physical commonality of data centers is clusters of interconnected servers. They might all be very similar, stacked up neatly in open racks or closed cabinets of equal height, width and depth, or there could be a bunch of different types, sizes and ages of machines coexisting, such as small flat modern servers alongside bulky old Unix boxes and giant mainframes (a fast disappearing breed, but not one that is altogether gone yet).

Each server is a high-performance computer with memory, storage space, a processor or processors and input/output capability -- kind of like a souped-up personal computer, but with a faster, more powerful processor, a lot more memory and usually no monitor, keyboard or the other peripherals you would use at home. Monitors might exist in a centralized location, nearby or in a separate control room, for monitoring groups of servers and related equipment.

A particular server or servers might be dedicated to a single task or run lots of different applications. Some servers in co-location data centers are dedicated to particular clients. Some are even virtual rather than physical (a new trend that cuts down on the necessary number of physical servers). It's also likely, when you request something via the Internet, that a number of servers are working together to deliver the content to you.

Networking, Software and Environmental Control

Networking and communication equipment is absolutely necessary in a data center to maintain a high-bandwidth network for communication with the outside world, and between the servers and other equipment within the data center. This includes components like routers, switches, the servers' network interface controllers (NICs) and potentially miles and miles of cabling. Cabling comes in various forms including twisted pair (copper), coaxial (also copper) and fiber optic (glass or plastic). The types of cable, and their various subtypes, will affect the speed at which information flows through the data center.

All that wiring also has to be organized. It's either run overhead on trays hung from the ceiling or attached to the tops of racks, or run underneath a raised floor, sometimes on under-floor trays. Color coding and meticulous labeling are used to identify the various wiring lines. Raised floors of data centers generally have panels or tiles that can be lifted for access to get to cabling and other equipment. Cooling units and power equipment are sometimes also housed below the floor.

Other important data center equipment includes storage devices (such as hard disk drives, solid state drives and robotic tape drives), uninterruptible power supplies (UPSs), backup batteries, backup generators and other power related equipment.

Data centers also have lots of equipment to handle temperature and air quality control, although the methods and types of equipment vary from site to site. They can include fans, air handlers, filters, sensors, computer room air conditioners (CRACs), chillers, water pipes and water tanks. Some sites will also put up plastic or metal barriers or use things like chimney server cabinets to control the flow of hot and cold air to keep computing equipment from overheating.

And of course, software is needed to run all this hardware, including the various operating systems and applications running on the servers; clustering framework software such as Google's MapReduce or Hadoop, which allows work to be distributed over hundreds or more machines; socket programs to handle networking; system monitoring applications; and virtualization software like VMware to help cut down on the number of physical servers.
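
To give a flavor of what clustering frameworks like MapReduce and Hadoop do (this is a toy sketch in plain Python, not their actual APIs), work is split into a "map" step that processes chunks of data independently -- in a real cluster, on many different machines -- and a "reduce" step that merges the partial results:

    from collections import Counter
    from functools import reduce

    documents = ["the cloud is data centers",
                 "data centers serve the cloud"]

    # "Map" step: each chunk of input becomes a partial word count.
    # In a real framework, each chunk would be handled by a different server.
    partial_counts = [Counter(doc.split()) for doc in documents]

    # "Reduce" step: the partial results are merged into one answer.
    totals = reduce(lambda a, b: a + b, partial_counts, Counter())
    print(totals.most_common(3))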

Some Issues Faced by Data Centers

While monitoring at a data center is vital, it’s highly unlikely that a tech is sleeping near the server clusters. Digital systems are in place to alert staff in the event of an outage or failure.

Data centers strive to provide fast, uninterrupted service. Equipment failure, communication or power outages, network congestion and other problems that keep people from accessing their data and applications have to be dealt with immediately. Due to the constant demand for instant access, data centers are expected to run 24/7, which creates a host of issues.

A data center's network needs are vastly different from those of, say, an office building full of workers. Data center networks are powerhouses. Google's fiber optic networks send data as much as 200,000 times faster than your home Internet service. But then, Google has to handle over 3 billion search engine requests daily, index many billions of Web pages, stream millions of YouTube videos and handle and store e-mail for hundreds of millions of users, among its many other services [source: Levy].

Hardly anyone has as much traffic as Google, but all data centers will likely see more and more usage. They need the ability to scale up their networks to increase bandwidth and maintain reliability. The same goes for the servers, which can be scaled up to increase the capacity of the data center. The existing network needs to be able to handle congestion by controlling flow properly, and anything that is holding up flow needs to be rooted out. A network will only be as fast as its slowest component. Service level agreements (SLAs) with customers also have to be met, and these often include targets for metrics like throughput and response time.
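
As a simple illustration of the SLA math involved (the numbers and the 100-millisecond target here are made up), an operator might track response times and check what fraction of requests meet the agreed target:

    # Hypothetical response times, in milliseconds, for a batch of requests.
    response_times_ms = [12, 35, 18, 250, 22, 40, 15, 300, 28, 19]

    sla_target_ms = 100
    within_target = sum(1 for t in response_times_ms if t <= sla_target_ms)
    compliance = within_target / len(response_times_ms)

    print(f"{compliance:.0%} of requests met the {sla_target_ms} ms target")  # -> 80%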

There are a number of points of possible failure. Servers or networking equipment can go out, cables can go bad or services coming in from the outside, like power and communication, can be disrupted. Systems need to be in place to monitor for, respond to, and notify staff of any issues that arise. Disaster recovery planning is of vital importance in case of major failures, but the minor problems have to be handled, as well.

Planning for Emergencies and Maintaining Security

The system can be set up to reroute traffic when servers or network equipment fail in one area. Traffic can also be load balanced by distributing work evenly over the network and servers to prevent congestion and bottlenecks. Things like data backups, system redundancy and adequate battery backups can also make life easier when outages do occur. Google stores every chunk of data on two or more servers, and really important data is backed up to digital tape. Data centers often have service from multiple Internet service providers (ISPs) for added load sharing and redundancy. If a company has multiple data centers, traffic can even be routed to another facility entirely in the event of a complete disaster.
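
Here is a minimal sketch of the rerouting idea, assuming a hypothetical pool of servers with health flags: requests are spread round-robin style over whichever servers are currently up, so a failed machine simply stops receiving traffic:

    from itertools import cycle

    # Hypothetical server pool with health status (True = up, False = down).
    servers = {"server-a": True, "server-b": False, "server-c": True}

    healthy = [name for name, up in servers.items() if up]

    # Round-robin over the healthy servers; server-b gets no traffic while down.
    rotation = cycle(healthy)
    for _ in range(4):
        print("route request to", next(rotation))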

To keep things running smoothly and stay up with current technology, equipment and software need to be upgraded and replaced regularly. Older systems also have to be supported until they are replaced, which hopefully happens well before they are obsolete. The data center needs an infrastructure that makes replacing old equipment and adopting new technology as easy as possible.

Data centers often deal with lots of sensitive or proprietary information, so the sites have to be both physically and digitally secure. They might have gates, security doors, alarms and security staff. Some companies are even loath to disclose the locations of their data centers, as well as any equipment and design features that might be trade secrets. When hard drives fail and have to be disposed of, they might be both erased and physically destroyed so that data doesn't fall into the wrong hands. Networks require security such as firewalls and other methods to keep electronic intruders/hackers out.

Data centers also need emergency equipment like fire alarms, sprinklers or other fire suppression systems to protect people and equipment. The servers, fans and other devices generate a lot of noise, requiring ear protection, and a lot of heat, requiring other employee and equipment safety measures.

Cooling and Power Concerns

The byproduct of all that computing power? Heat. Think about how warm your laptop gets, and then think about how many components are running in a single server room.

Data centers have to have tight environmental controls and take in or generate massive amounts of power to keep things running. And these are costly.

Since servers and other equipment do not do very well in extreme temperatures, most data centers have huge cooling and air flow systems that consume massive amounts of power, and sometimes water. Sensors have to be in place to monitor environmental conditions so that adjustments can be made.

It's not just temperature that is a problem. Factors like humidity have to be kept in check. In 2011, Facebook had an actual cloud, not the digital kind, form in one of its data centers, resulting in some servers rebooting and power supplies shorting out due to rain inside the building. As a result, they modified their building-management system and made the servers a little more weather resistant.

Racks of servers are often arranged in rows that create aisles where the servers are either all facing each other or all facing away from each other in order to control airflow and temperature more efficiently. The aisle the servers face is the cool aisle, where chilled air is supplied, and the hot air exhausted into the hot aisle behind them is funneled away accordingly.

Power consumption is another major concern. It's absolutely necessary that these facilities have constant access to adequate power -- some even have their own power substations. A metric used to judge data center energy efficiency is power usage effectiveness (PUE). It's a calculation of total energy use divided by energy use purely for computation purposes. Yahoo, Google and Facebook's PUE scores are around 1.1 or 1.2 for some of their large data centers, although 2.0 is more typical of the industry. That means half the energy goes for computing and half for other tasks or waste [sources: Mone, Levy]. Consulting firm McKinsey & Company found that the average data center was actually only using 6 to 12 percent of its power to do computation work and the rest was lost idling while waiting for the next surge of traffic, likely due to over-provisioning of resources out of fear of delays and downtime [source: Glanz].
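
The PUE arithmetic itself is straightforward; here is a quick sketch with illustrative numbers:

    def pue(total_facility_kwh, it_equipment_kwh):
        """Power usage effectiveness: total energy divided by energy used for computing."""
        return total_facility_kwh / it_equipment_kwh

    print(pue(1_200_000, 1_000_000))  # -> 1.2, like the best large data centers
    print(pue(2_000_000, 1_000_000))  # -> 2.0, where half the energy never reaches the IT gear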

Lots of things are being done to reduce data centers' power and other resource needs. Server rooms used to be kept around 60 degrees Fahrenheit (15.6 Celsius), but the trend in more energy efficient data centers is to keep them around 80 degrees Fahrenheit (26.7 Celsius), at least on the cool aisle, although not everyone has adopted this practice [sources: Mone, Levy]. The servers apparently do fine at this temperature, and it requires less cooling related power.

There's a growing trend to use open-air cooling, drawing air from the outside rather than running lots of power-hungry air conditioning units and chillers. Another trend is locating data centers near ready sources of water that can be recycled for cooling use, such as Google's data center in Finland, which uses seawater. Yet another is to build data centers in cold climates.

Changes in the actual computing gear can help, too. Many components in data centers leak energy, meaning some of the power they use never makes it to doing actual processing -- it's wasted. Replacing older servers with newer, more energy efficient models obviously helps. But equipment can also be redesigned to require less power. Most data centers use traditional off-the-shelf servers and other equipment, but Google and Facebook both use customized servers. Google's were designed to leave off unnecessary components like graphics cards and to minimize power loss at the power supply and voltage regulator. The panels that contain the manufacturer's logo are omitted to allow better airflow to and from components, and the company makes some of its own network equipment.

Processors and fans can also be made to slow down when they're not needed. More efficient servers also tend to throw off less heat, further reducing the power consumption needed for cooling. Low-powered ARM servers, originally made for mobile devices but redesigned for server use, are making their way into data centers, as well.

Application usage fluctuates depending on what is being done at what time, and different software and Web applications have different resource needs. Application resource management is important for increasing efficiency and reducing consumption. Software can be custom written to work more efficiently with the system architecture. Server virtualization can also cut power consumption by cutting down on the number of running servers.

Environmental Impact and the Future of Data Centers

These issues are not just the problem of the companies that create and run the data centers, but also of the surrounding communities and the planet as a whole.

It is estimated that data centers in the U.S. consumed 61 billion kilowatt hours of electricity in 2006, costing around $4.5 billion [source: Uddin], and 76 billion kilowatt hours in 2010 [source: Glanz]. They reportedly account for 1 to 2 percent of electricity consumption worldwide [sources: Levy, Masanet]. By some accounts, some data centers waste upwards of 90 percent of the power they consume due to running 24/7 at full capacity [source: Glanz]. This massive consumption is bound to take a toll on the environment.
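
Taken together, those two cited figures imply an average electricity price of roughly 7 cents per kilowatt-hour -- a quick check:

    us_datacenter_kwh_2006 = 61e9   # 61 billion kWh (cited estimate)
    estimated_cost_usd = 4.5e9      # about $4.5 billion (cited estimate)
    print(estimated_cost_usd / us_datacenter_kwh_2006)  # -> ~0.074 dollars per kWh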

One research firm found that the information and communication technology industry accounted for around 2 percent of CO2 emissions worldwide [source: Uddin]. And some data center generators emit air-polluting exhaust that too often fails to meet clean air regulations.

Changes in this industry are not easy to dictate as there isn't a government agency specifically tasked with tracking data centers. But a lot of the big players, including Google, Facebook, Microsoft, Apple, Yahoo and eBay, are making huge strides toward reducing the resource consumption of their centers, including creating energy efficient designs, using local resources wisely, striving for carbon neutrality and in some cases generating power using greener sources like natural gas, solar energy or hydropower.

There's constant innovation toward efficiency, environmental friendliness, cost effectiveness and ease of deployment. And these days, with Google's newfound openness about its data center designs and projects like Facebook's Open Compute, through which it shares hardware designs with the public, the data center superpowers are disclosing some of their innovations so that smaller data centers (and the rest of us) might reap the benefits.

It's hard to estimate the full impact of our online existence, since our own computers and the other networks that get our information to and from the data centers have to be added into the equation. But without attention to energy efficiency and sustainability of the largest and most obvious culprits, the cloud might keep on generating clouds of pollutants and greenhouse gases.

Despite any pitfalls, data centers are not going anywhere. Our desire for constant and instant access to information and media content, for sharing of large amounts of data, for moving things off of our own machines and onto the cloud for access from multiple devices, and for perpetual storage of e-mail, photos and other digital data will keep them around. And they will likely pave the way to an even more wired future.

Lots More Information

Author's Note: How Data Centers Work

I'm amazed at the sheer size and scope of the huge data centers that make our wired world what it is today. I'm also grateful for them, since I'm online most of the time. It was my dream 20 years ago to be able to choose what, when and where I watched shows without being stuck at home at certain times of night. I didn't even imagine the binge watching that I'm doing today, or alternate sources of entertainment like YouTube. But our modern server farms have made those possible, as well as non-entertainment related things, like massive open online courses (MOOCs) and other educational resources.

But I do worry about the consequences. I'm glad that some of the major players are putting efforts into energy efficiency and carbon neutrality in order to conserve our natural resources and prevent unnecessarily huge emissions. We don't want the tools the Internet makes available, which we can use to make the world better through communication and education, to in turn destroy us. I like habitable climates more than entertainment. I'll swear to that, right after I finish playing Minecraft.

Sources

  • Abts, Dennis and Bob Felderman. "A Guided Tour of Data-Center Networking." Communications of the ACM. June 2012, Volume 55, Issue 6, Pages 44-51. (October 4, 2013)
  • Bamford, James. "The NSA Is Building the Country's Biggest Spy Center (Watch What You Say)." Wired. March 15, 2012. (October 4, 2013) http://www.wired.com/threatlevel/2012/03/ff_nsadatacenter/all/
  • Bartels, Angela. "[INFOGRAPHIC] Data Center Evolution: 1960 to 2000." Rackspace. August 31, 2011. (October 5, 2013) http://www.rackspace.com/blog/datacenter-evolution-1960-to-2000/
  • Beaty, Donald L. "Data Center Alphabet Soup." Ashrae Journal. May 2013, Volume 55, Issue 5, Pages 88-91. (October 4, 2013)
  • Beaty, Donald L. "DNA of a Data Center." Ashrae Journal. June 2013, Volume 55, Issue 6, Pages 58-60. (October 4, 2013)
  • Beaty, Donald L. "More Data Center Dimensions." Ashrae Journal. September 2013, Volume 55, Issue 9, Pages 80-82. (October 4, 2013)
  • Berkes, Howard. "Amid Data Controversy, NSA Builds Its Biggest Data Farm." NPR. June 10, 2013. (October 4, 2013) http://www.npr.org/2013/06/10/190160772/amid-data-controversy-nsa-builds-its-biggest-data-farm
  • Chernicoff, David. "ARM in the datacenter gets another boost." ZDNet. February 14, 2013. (October 5, 2013) http://www.zdnet.com/arm-in-the-datacenter-gets-another-boost-7000011313/
  • Chernicoff, David. "RMS demonstrates the importance of the private cloud." ZDNet. September 20, 2013. (October 5, 2013) http://www.zdnet.com/rms-demonstrates-the-importance-of-the-private-cloud-7000020944/
  • Cisco. "Virtualized Multiservice Data Center (VMDC) 3.0 Design Guide." December 19, 2012. (October 19, 2013) http://www.cisco.com/en/US/docs/solutions/Enterprise/Data_Center/VMDC/3.0/DG/VMDC3_DG.pdf
  • Clark, Jeff. "ARM Versus Intel: Instant Replay of RISC Versus CISC." Data Center Journal. April 9, 2013. (October 5, 2013) http://www.datacenterjournal.com/it/arm-intel-instant-replay-risc-cisc/
  • Clark, Jack. "Facebook's first data center DRENCHED by ACTUAL CLOUD." Register. June 8, 2013. (October 5, 2013) http://www.theregister.co.uk/2013/06/08/facebook_cloud_versus_cloud/
  • Courtney, Martin, Chris Edwards, James Hayes and Philip Hunter. "Don't Blame the Data Centre." Engineering & Technology. August 2013, Volume 8, Issue 7, Pages 64-67. (October 4, 2013)
  • Daly, Jimmy. "The History of Federal Data Centers [#Infographic]." FedTech. May 16, 2013. (October 5, 2013) http://www.fedtechmagazine.com/article/2013/05/history-federal-data-centers-infographic
  • Data Science Series. "Digital universe will grow to 40ZB in 2020, with a 62% share for emerging markets." December 13, 2012. (October 20, 2013) http://datascienceseries.com/blog/digital-universe-will-grow-to-40zb-in-2020-with-a-62-share-for-emerging-markets
  • DiMinico, Chris. "Telecommunications Infrastructure Standard for Data Centers - ANSI/TIA-942." IEEE 802.3 HSSG. (October 19, 2013) http://www.ieee802.org/3/hssg/public/nov06/diminico_01_1106.pdf
  • EMC. "Digital Universe." (October 20, 2013) http://www.emc.com/leadership/programs/digital-universe.htm
  • EMC. "New Digital Universe Study Reveals Big Data Gap: Less Than 1% of World's Data is Analyzed; Less Than 20% is Protected." December 2012. (October 20, 2013) http://www.emc.com/about/news/press/2012/20121211-01.htm
  • Fehrenbacher, Katie. "NYT's data center power reports like taking a time machine back to 2006." Gigaom. September 24, 2012. (October 5, 2013) http://gigaom.com/2012/09/24/nyts-data-center-power-article-reports-from-a-time-machine-back-to-2006/
  • Glanz, James. "The Cloud Factories - Power, Pollution and the Internet." New York Times. September 22, 2012. (October 4, 2013) http://www.nytimes.com/2012/09/23/technology/data-centers-waste-vast-amounts-of-energy-belying-industry-image.html
  • Google. "Data center locations." (October 20, 2013) http://www.google.com/about/datacenters/inside/locations/
  • Google. "Data Centers." (October 4, 2013) http://www.google.com/about/datacenters/
  • Google. "Efficiency: How we do it." (October 16, 2013) http://www.google.com/about/datacenters/efficiency/internal/#servers
  • Google. "Explore a Google data center with Street View." (October 16, 2013) http://www.google.com/about/datacenters/inside/streetview/
  • Google. "Gallery." (October 16, 2013) http://www.google.com/about/datacenters/gallery/#/
  • Google. "Google container data center tour." April 7, 2009. (October 22, 2013) http://www.youtube.com/watch?v=zRwPSFpLX8I&feature=youtu.be
  • Hayslett, Michele. "Got Data?" Reference & User Services Quarterly. Spring 2007, Volume 46, Issue 3, Pages 20-22. (October 4, 2013)
  • Heath, Nick. "How Facebook ended up with baked potato inside its servers." ZDNet. September 19, 2013. (October 5, 2013) http://www.zdnet.com/how-facebook-ended-up-with-baked-potato-inside-its-servers-7000020896/
  • Higginbotham, Stacey. "AMD executive: The data center is changing and ARM will be the compute." Gigaom. June 19, 2013. (October 5, 2013) http://gigaom.com/2013/06/19/amd-executive-the-data-center-is-changing-and-arm-will-be-the-compute/
  • Hill, Kashmir. "Blueprints of NSA's Ridiculously Expensive Data Center In Utah Suggest It Holds Less Info Than Thought." July 24, 2013. (October 10, 2013) http://www.forbes.com/sites/kashmirhill/2013/07/24/blueprints-of-nsa-data-center-in-utah-suggest-its-storage-capacity-is-less-impressive-than-thought/
  • Kontzer, Tony. "Data Center Operators Flock To Cold Climates." Network Computing. September 30, 2013. (October 5, 2013) http://www.networkcomputing.com/next-generation-data-center/servers/data-center-operators-flock-to-cold-clim/240162015
  • Lesser, Adam. "4 types of data centers." Gigaom Pro. October 15, 2012. (October 5, 2013) http://pro.gigaom.com/blog/4-types-of-data-centers/
  • Levy, Steven. "Google Throws Open Doors to Its Top-Secret Data Center." Wired. October 17, 2012. (October 4, 2013) http://www.wired.com/wiredenterprise/2012/10/ff-inside-google-data-center/
  • Lohman, Tim. "Datacentres of the 21st century." ZDNet. September 30, 2013. (October 5, 2013) http://www.zdnet.com/datacentres-of-the-21st-century-7000021134/
  • Manca, Pete. "Software-Defined Data Centers: What's the Buzz All About?" Wired. May 29, 2013. (October 4, 2013) http://www.wired.com/insights/2013/05/software-defined-data-centers-whats-the-buzz-all-about/
  • Masanet, E., A. Shehabi, L. Ramakrishnan, J. Liang, X. Ma, B. Walker, V. Hendrix, and P. Mantha. "The Energy Efficiency Potential of Cloud-Based Software: A U.S. Case Study." Lawrence Berkeley National Laboratory. June 2013. (October 16, 2013) http://crd.lbl.gov/assets/pubs_presos/ACS/cloud_efficiency_study.pdf
  • Metz, Cade. "The Real Reason ARM Will Menace Intel in the Data Center." Wired. May 2, 2013. (October 5, 2013) http://www.wired.com/wiredenterprise/2013/05/hp-arm-memcached-chip-paper/
  • Mone, Gregory. "Redesigning the Data Center." Communications of the ACM. October 2012, Volume 55, Issue 10, Pages 14-16. (October 4, 2013)
  • Ohara, Dave. "Google opens up on seven years of its data center history." Gigaom. November 13, 2012. (October 4, 2013) http://gigaom.com/2012/11/13/google-opens-up-on-seven-years-of-its-data-center-history/
  • Open Compute Project. (October 22, 2013) http://www.opencompute.org/
  • Phneah, Ellyne. "Data volume to hit 1.8ZB in 2011." ZDNet. July 7, 2011. (October 20, 2013) http://www.zdnet.com/data-volume-to-hit-1-8zb-in-2011-2062301103/
  • Priest, Dana. "NSA growth fueled by need to target terrorists." Washington Post. July 21, 2013. (October 4, 2013) http://www.washingtonpost.com/world/national-security/nsa-growth-fueled-by-need-to-target-terrorists/2013/07/21/24c93cf4-f0b1-11e2-bed3-b9b6fe264871_story.html
  • Sloan, Jeff. "Data Center Dilemma." Ashrae Journal. March 2013, Volume 55, Issue 3, Pages 62-67. (October 4, 2013)
  • Uddin, Mueen, Azizah Abdul Rahman, Suhail Kazi and Raed Alsaquor. "Classification of Data Center to Maximize Energy Utilization and Save Total Cost of Ownership." International Review on Computers and Software. September 2012, Volume 7, N. 5. (October 4, 2013)
  • Volpe, Joseph. "eBay's new Utah data center goes green so you never have to stop bidding." Engadget. September 26, 2013. (October 5, 2013) http://www.engadget.com/2013/09/26/ebay-new-green-energy-utah-data-center-fuel-cells/
