Wednesday, January 16, 2013

Google Throws Open Doors to Its Top-Secret Data Center

While catching up on my holiday reading, which included Wired magazine, I came across this piece about Google data centers. In an unprecedented move, Google granted Wired writer Steven Levy access to tour a few facilities, including the data center in Lenoir, NC, which is close to where I used to reside.

Typically you would think of these giant centers as the ultimate consumers of energy. While they do consume a lot, Google has been able to re-engineer its data centers to use less power while adding capacity. These data centers definitely help give Google a competitive advantage.

The full Wired article can be viewed here, and the video below gives good insight as well. A few highlights from the article:
  • This is what makes Google Google: its physical network, its thousands of fiber miles, and those many thousands of servers that, in aggregate, add up to the mother of all clouds. This multibillion-dollar infrastructure allows the company to index 20 billion web pages a day. To handle more than 3 billion daily search queries. To conduct millions of ad auctions in real time. To offer free email storage to 425 million Gmail users. To zip millions of YouTube videos to users every day. To deliver search results before the user has finished typing the query. In the near future, when Google releases the wearable computing platform called Glass, this infrastructure will power its visual search results.
  • Hölzle and his team designed the $600 million facility in light of a radical insight: Server rooms did not have to be kept so cold. The machines throw off prodigious amounts of heat. Traditionally, data centers cool them off with giant computer room air conditioners, or CRACs, typically jammed under raised floors and cranked up to arctic levels. That requires massive amounts of energy; data centers consume up to 1.5 percent of all the electricity in the world.
  • Google’s breakthroughs extend well beyond energy. Indeed, while Google is still thought of as an Internet company, it has also grown into one of the world’s largest hardware manufacturers, thanks to the fact that it builds much of its own equipment. In 1999, Hölzle bought parts for 2,000 stripped-down “breadboards” from “three guys who had an electronics shop.” By going homebrew and eliminating unneeded components, Google built a batch of servers for about $1,500 apiece, instead of the then-standard $5,000. Hölzle, Page, and a third engineer designed the rigs themselves. “It wasn’t really ‘designed,’” Hölzle says, gesturing with air quotes.
  • All of these innovations helped Google achieve unprecedented energy savings. The standard measurement of data center efficiency is called power usage effectiveness, or PUE. A perfect number is 1.0, meaning all the power drawn by the facility is put to use. Experts considered 2.0—indicating half the power is wasted—to be a reasonable number for a data center. Google was getting an unprecedented 1.2.
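To put that last point in perspective, here is a minimal sketch in Python (my own illustration, not anything from the article) of how PUE is calculated. The power figures are hypothetical round numbers chosen only to reproduce the ratios Levy cites:

    def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
        """Power Usage Effectiveness: total power drawn by the facility
        divided by power delivered to the IT equipment itself.
        1.0 is perfect (every watt reaches the servers); 2.0 means
        half the power goes to overhead such as cooling."""
        return total_facility_kw / it_equipment_kw

    # Hypothetical figures for illustration only (not Google's actual numbers):
    # a facility drawing 20 MW overall to run 10 MW of servers, versus 12 MW.
    print(pue(20_000, 10_000))  # 2.0 -- the industry's "reasonable" baseline
    print(pue(12_000, 10_000))  # 1.2 -- the efficiency Google was reporting

Put another way, at a PUE of 1.2 only about 17 percent of the facility's power is overhead, compared with 50 percent at the industry baseline of 2.0.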
