The system can be set up to reroute traffic in case servers or network equipment fail in one area. Traffic can also be load balanced, with work distributed evenly across the network and servers to prevent congestion and bottlenecks. Data backups, system redundancy and adequate battery backups can also make life easier when outages do occur. Google, for example, stores every chunk of data on two or more servers, and backs up especially important data to digital tape. Data centers often buy service from multiple Internet service providers (ISPs) for added load sharing and redundancy. And if a company has multiple data centers, traffic can even be routed to another facility entirely in the event of a complete disaster.
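To make the rerouting and load-balancing idea concrete, here is a minimal sketch of a round-robin balancer that skips failed servers. Everything in it is illustrative: the server names are made up, and the mark_down call stands in for a real health check (in practice, dedicated load balancers probe servers continuously and pull failed ones out of rotation automatically).

```python
import itertools

class LoadBalancer:
    """A toy round-robin load balancer with failover.

    Requests are handed to servers in rotation, so work is spread
    evenly; servers marked as down are skipped until they recover.
    """

    def __init__(self, servers):
        self.servers = list(servers)
        self.down = set()                      # servers that failed a health check
        self._cycle = itertools.cycle(self.servers)

    def mark_down(self, server):
        """Stand-in for a failed health check (ping or HTTP probe)."""
        self.down.add(server)

    def mark_up(self, server):
        """Server recovered; return it to the rotation."""
        self.down.discard(server)

    def next_server(self):
        """Return the next healthy server, skipping any that are down."""
        for _ in range(len(self.servers)):
            server = next(self._cycle)
            if server not in self.down:
                return server
        # No healthy servers left in this pool: this is where traffic
        # would be rerouted to another data center entirely.
        raise RuntimeError("no healthy servers; fail over to another facility")
```

Used like this, requests alternate between healthy servers, and a failed one is transparently skipped:

```python
lb = LoadBalancer(["web-1", "web-2", "web-3"])
lb.next_server()        # web-1
lb.next_server()        # web-2
lb.mark_down("web-3")
lb.next_server()        # web-1 (web-3 is skipped)
```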
To keep things running smoothly and keep up with current technology, equipment and software need to be upgraded and replaced regularly. Older systems also have to be supported until they are replaced, which ideally happens well before they are obsolete. The data center needs an infrastructure that makes replacing old equipment and adopting new technology as easy as possible.
Data centers often deal with lots of sensitive or proprietary information, so the sites have to be both physically and digitally secure. They might have gates, security doors, alarms and security staff. Some companies are even loath to disclose the locations of their data centers, as well as any equipment and design features that might be trade secrets. When hard drives fail and have to be disposed of, they might be both erased and physically destroyed so that data doesn't fall into the wrong hands. Networks require digital defenses, such as firewalls, to keep electronic intruders and hackers out.
Data centers also need emergency equipment like fire alarms, sprinklers or other fire suppression systems to protect people and equipment. The servers, fans and other devices generate a lot of noise, requiring ear protection, and a lot of heat, requiring additional safety measures for both employees and equipment.