Behind the cloud
The server farms that store and process our online data are growing at a rapid clip.
What exactly is the cloud?
At the dawn of the digital era, computing was local: To save or retrieve a document, you stored it on your computer’s hard drive or on a physical disk. Now, your files can be saved to “the cloud”—remote servers reached over the internet—so that photos, documents, and videos can be fetched from anywhere, no matter what device you are using. When you draft an email at home and send it later from your work computer, you are using the cloud. The same goes for streaming movies or songs, or pulling up months-old photos on your phone. The advantage of this kind of computing is that you rarely have to worry about running out of storage space on your own device. It also offers speed and flexibility by shifting the burden of processing information to powerful computers located elsewhere.
Where are these computers?
Though the cloud evokes floating white puffs, our data is actually very much earthbound, stored in enormous warehouses containing rows upon rows of servers. These data centers, some of which are as big as football fields, require incredible amounts of energy to operate—and to cool, since the servers throw off huge amounts of heat. As a result, the warehouses must constantly run superpowered air-conditioning systems to keep the computers from overheating. All of that energy adds up: U.S. data centers consumed about 70 billion kilowatt-hours of electricity in 2014, roughly 2 percent of the country’s total electricity consumption. That’s equivalent to the power consumed by about 6.4 million average American homes. “It’s staggering for most people, even people in the industry, to understand the numbers, the sheer size of these systems,” said Peter Gross, who has helped design hundreds of data centers.
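The homes comparison follows from simple division. A quick sketch, assuming an average U.S. household uses roughly 10,900 kilowatt-hours per year (a figure not stated in the article):

```python
# Back-of-envelope check of the article's "6.4 million homes" figure.
DATA_CENTER_KWH = 70e9    # U.S. data-center electricity use, 2014 (from the article)
KWH_PER_HOME = 10_900     # assumed annual use of an average U.S. home

homes_equivalent = DATA_CENTER_KWH / KWH_PER_HOME
print(f"{homes_equivalent / 1e6:.1f} million homes")  # prints "6.4 million homes"
```

The result lines up with the article's estimate, which suggests the comparison was built on a similar household-consumption figure.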
Where are these centers located?
There are thousands of data centers all over the world, though the powerhouses of the $250 billion cloud industry—companies such as Amazon, Microsoft, and Google, which rent out computing power to companies around the globe—run a few hundred truly massive centers themselves. These centers, which can contain tens of thousands of servers each, account for a majority of cloud storage and traffic. Historically, companies have placed these massive data farms near tech-intensive areas, such as Silicon Valley and the suburbs surrounding Washington, D.C. But given growing energy and space demands, many cloud firms are seeking out cheap, undeveloped land where electricity costs are low. Google, for instance, is building facilities in Alabama and Tennessee, and Apple is building a $1.3 billion complex in Waukee, Iowa, with two data centers totaling 400,000 square feet.
So these data centers are expanding fast?
Incredibly so. The soaring demand for streaming video, the growth of the “internet of things”—including connected cars, refrigerators, and health devices—and the rise of smart assistants such as Amazon’s Alexa and Apple’s Siri are driving breakneck demand for new data centers. Last year, Google opened cloud data centers in Mumbai, Singapore, São Paulo, and Frankfurt, Germany, and Amazon has recently announced plans to open or expand “hyperscale” centers in Ohio, Virginia, and Ningxia, China. Over the next two years, Oracle plans to build 12 giant data centers in the U.S., China, India, and Saudi Arabia, a huge leap beyond the three it currently operates. Some estimates project data-center electricity demand could reach 13 percent of global electricity consumption by 2030.
What does this mean for the environment?
To reduce the burden of carbon emissions, cloud companies have begun taking steps to “green” their operations. Microsoft uses wind power and biogas to run data centers in Ireland and Wyoming, respectively, while Apple and Google either use 100 percent renewable energy to power their data centers, or buy enough renewables to offset what they use. Cloud operators have also become far more efficient at managing their power needs, designing servers that automatically switch to low-power mode when not in use. Firms are also expanding to cold-weather locales—Facebook operates an enormous data center in Sweden just 70 miles south of the Arctic Circle—because the frosty environment sharply reduces the need for mechanical cooling. Microsoft is even experimenting with placing data centers on the ocean floor, where cold water would keep servers cool, and proximity to undersea fiber optic cables could speed services.
The need for speed
One major challenge for cloud operators is known as latency, or the delay involved in sending and receiving data between users’ devices and the cloud. No matter how rapidly a computer processes data, that data still has to physically travel between off-site computers and your own device, and data centers located farther away can add milliseconds to a user’s experience. For now, a difference of a few milliseconds might not matter much when you are streaming a movie or downloading a photo, but it will matter more as numerous connected cars and health-care devices come online. The solution may be smaller, localized data centers seeded throughout the country, which will be able to offer speed through proximity. So instead of 100 milliseconds, your self-driving car will only have to wait 10 milliseconds before it decides to speed up or brake—welcome news when every millisecond will count.
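The physics behind those milliseconds can be sketched with one assumption not in the article: light in optical fiber travels at roughly two-thirds the speed of light in vacuum, or about 200 kilometers per millisecond.

```python
# Rough propagation-delay sketch (illustrative; ignores routing,
# queuing, and processing time, which add further delay).
FIBER_KM_PER_MS = 200.0  # assumed effective signal speed in fiber

def round_trip_ms(distance_km: float) -> float:
    """Minimum round-trip propagation delay to a data center
    distance_km away."""
    return 2 * distance_km / FIBER_KM_PER_MS

print(round_trip_ms(1000))  # 10.0 ms to a center 1,000 km away
print(round_trip_ms(100))   # 1.0 ms to a nearby localized center
```

The point of the sketch: no amount of server horsepower beats the floor set by distance, which is why proximity, not just processing speed, drives the push for localized data centers.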