As the Internet of Things expands, with everything from thermostats to cameras to cars plugging into the 'net, so does the need for machines that can handle those connections. The data traveling to and from all those thermostats, cameras, and cars, you see, must flow through the massive data centers operated by the likes of Google, Apple, and Facebook.
The worry is that powering all this extra hardware will require exponentially larger amounts of electricity—not to mention all the money and space spent on the hardware itself. But Urs Hölzle says this won’t be the problem it may seem. Hölzle is the man who oversees Google’s worldwide network of data centers, and he believes that efficiencies brought by devices such as Internet-connected thermostats, lighting systems, and self-driving cars will balance out the extra power needed to drive our computing centers.
“I’m pretty confident that the Internet of Things is going to have net negative power consumption,” Hölzle said during a briefing with reporters on Tuesday. “If you control lights, heat, and cooling in a smarter way, that’s really substantial.” Even self-driving cars, he says, will push us towards lower power consumption. “You’ll have fewer cars on the road, fewer parking lots, less congestion, because every car is a potential carpool.” In other words, he believes we’ll use self-driving cars in much the same way we use Uber today, calling one whenever we need one.
His outlook may be optimistic. And it’s worth noting that Google makes both online thermostats and self-driving cars. But as Hölzle points out, so many Internet of Things devices—such as thermostats—move relatively little data over the network. And as time goes on, our data centers are juggling data with greater efficiency. Over the past decade, Google has streamlined its own operation, designing its own data centers, its own computer servers, and its own networking gear—and the results are enormous.
Compared with five years ago, Hölzle says, Google’s data center servers can now generate three and a half times more computing power with the same amount of money and the same amount of electrical power. These gains can be credited in part to Moore’s Law, the tech tenet that says the computing power in a computer chip will double every 18 months. But it’s also the result, he says, of far more efficient machines, cooling systems, and electrical systems.
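The 18-month doubling cadence compounds quickly, which is worth seeing in plain numbers. A minimal sketch of that arithmetic (my own illustration, not Google's figures; the 3.5x gain cited above is a different metric, measured per dollar and per watt, which is why efficiency work matters on top of chip improvements):

```python
# Back-of-the-envelope compounding for Moore's Law's 18-month cadence.
# Illustrative only: real per-dollar, per-watt gains track a different curve.

def growth_factor(years, doubling_period_years=1.5):
    """Multiplicative gain if capability doubles every `doubling_period_years`."""
    return 2 ** (years / doubling_period_years)

# Five years of pure 18-month doubling would be roughly a tenfold gain.
print(round(growth_factor(5), 1))  # ~10.1
```

Raw transistor doubling alone would thus outpace the 3.5x per-dollar, per-watt figure; the gap is what cooling, electrical, and machine design improvements have to close.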
Indeed, all of the leading internet companies—from Google to Facebook, Microsoft, and Apple—are pushing towards a far more efficient breed of data center, and with many of these companies sharing their practices with the world at large, many other operations are following their lead. With the rise of the Internet of Things, the demands placed on our data centers may be growing, but the technology needed to handle those demands is improving as well.
Hölzle acknowledges that his prediction comes with a caveat: the proliferation of online cameras—which send so much data across the network—may cause a steep rise in power consumption across the world’s data centers. “Video is the one exception,” he said on Tuesday. And Jason Mars, a professor of computer science at the University of Michigan who specializes in data center technologies, knocks Hölzle’s thesis down another peg.
Internet of Things devices may improve power efficiency in theory, Mars says, but it’s unclear how well they will actually work—or whether we humans will use them as effectively as we should. “I’m not going to hold my breath on how effective future products will be at reducing the power consumption of society to the point of being ‘net negative’ relative to data centers,” he says.
But Mars also says that, thanks to the likes of Google, data center efficiencies are improving—and that there’s room for even greater improvements in the future. Google is leading this revolution, he says, followed by companies like Facebook and Microsoft. And he has more insight than most, having worked alongside Google data center engineers to analyze the performance of the company’s facilities.
Efficiency through Openness
Facebook’s role in this shift is particularly notable. For years, Google kept its data center technologies under wraps, seeing them as a competitive advantage. But in 2011, through its Open Compute Project, or OCP, Facebook open-sourced many of its streamlined data center designs, sharing them with the world at large. This has sparked a dramatic shift in the way the industry builds hardware. “OCP is a great idea,” Hölzle acknowledged. And though he didn’t say so, one big result of the project is that Google has opened up its practices as well.
In recent years, Google started using machine learning algorithms to analyze and refine the operation of its data centers, improving efficiency in some facilities by 15 to 25 percent. About a year ago, Google published a paper showing others how this could be done. Google isn’t sharing its particular algorithms with others, but according to Joe Kava, the company’s head of data center operations, the paper has directly sparked similar efforts at other companies.
Still, there are limits to how much Google is willing to share. Unlike Apple and Microsoft, Google has not joined Facebook’s Open Compute Project. “Our applications,” Hölzle says, “aren’t identical to everyone else’s.” And Mars says the company still closely guards what goes on inside its data centers. “Google likes to enjoy their datacenter innovations that result in reducing the cost per query for a year or two before they are willing to publish their designs in academic communities,” he says.
But Google is sharing more than it has in the past. And like Facebook and Microsoft, the company is leading by example. As a result, the demands of the Internet of Things don’t look nearly as daunting.