Energy & Environment

How Bit Met Watt

Santiago Miret

With the miniaturization of electronics, the world’s computing capabilities exploded while the energy required to produce a bit of data continued to drop. The world was creating more and more bits of data, which required fewer and fewer watts to operate. As computing devices became smaller, we went from the room-sized computers used for the Apollo program to the smartphones and tablets of today.

The computers of the Apollo program could store only a couple of megabytes of data, less than a single picture taken on today’s smartphones. Moreover, the energy required to power those giant machines was significantly higher than the 5.45 Wh battery used by today’s iPhone. The “Moore’s Law” equivalent for the energy efficiency of computing is also quite impressive: the energy efficiency of computing doubles about every 1.5 years, improving roughly 100x every decade. Yet, as the devices grew smaller and smaller, the computing infrastructure required to operate them quietly exploded in the shadows.
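Those two rates are really the same claim stated two ways; a quick back-of-the-envelope check (my own sketch, not from the original article) shows that doubling every ~1.5 years compounds to roughly 100x per decade:

```python
# Sanity check: efficiency that doubles every ~1.5 years
# compounds to roughly 100x over a decade.
doubling_period_years = 1.5
doublings_per_decade = 10 / doubling_period_years   # ~6.67 doublings
gain_per_decade = 2 ** doublings_per_decade         # ~102, i.e. ~100x
print(f"Efficiency gain per decade: ~{gain_per_decade:.0f}x")
```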

The advent of smartphones and tablets, the digitization of information, and the cloud catalyzed unprecedented growth in computing requirements, leading to the creation of immense data centers. While it takes only 2-3 kWh a year to charge an iPhone, the phone’s data consumption can drive ~60 kWh a year of energy use. The continued installation of data centers increases the energy footprint of computing, with data centers currently consuming 1.7% of the global power supply and ~2% of power in the United States. As smart grids and big-data analytics continue to spread in the energy industry, the relationship between bits and watts will become even more important. Fortunately for data centers, the solution to reducing the energy footprint of computing was hidden in plain sight: data analytics.

The key metric for data-center energy efficiency is Power Usage Effectiveness (PUE), which measures how much of the energy coming into the data center is actually used for computing. An ideal PUE is 1.0, meaning all of the energy input is used for computing. A PUE of 2.0 indicates that one additional watt is required to maintain the data center’s infrastructure for each watt used for actual computing. A recent study by Stanford Professor Jonathan Koomey reveals where data centers stand on energy efficiency: the PUEs of typical data centers fall between 1.83 and 1.92.
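The metric is a simple ratio, which a minimal sketch makes concrete (the function name and sample figures below are hypothetical, chosen to land near Koomey’s typical range):

```python
def pue(total_facility_kwh, it_equipment_kwh):
    """Power Usage Effectiveness: total energy in vs. energy reaching IT gear."""
    return total_facility_kwh / it_equipment_kwh

# A facility drawing 1,900 kWh to deliver 1,000 kWh of computing:
print(pue(1900, 1000))                    # 1.9
# Infrastructure overhead per watt of actual computing:
print(f"{pue(1900, 1000) - 1:.1f}")       # 0.9
```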

The following graph shows the energy distribution for a typical data center:


Data Center Energy Use – Source: Rocky Mountain Institute

The figure shows that ~65% of the energy going into a typical data center is wasted. The largest consumer of energy in a data center is the cooling system, so designing more efficient cooling systems will have the biggest impact on the energy distribution.

Knowing this energy consumption pattern, the Rocky Mountain Institute (RMI) investigated methods to achieve energy savings in data center cooling. One significant, yet simple, insight was to raise the operating temperature of data centers: many data centers are cooled to unnecessarily low temperatures with no performance gain, leading to energy losses. The wider temperature range gained by raising the operating temperature allows more flexibility in designing new temperature and humidity control systems for the data center.

As the case study of the Colorado-based National Snow and Ice Data Center shows, redesigning the cooling system can lead to 90% cooling-energy savings, which translates into roughly 40% overall energy savings for a typical data center.
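A quick back-of-the-envelope check (my own arithmetic, not a figure from the case study) shows what those two numbers jointly imply about cooling’s share of total energy use:

```python
# If a 90% cut in cooling energy yields 40% overall savings, cooling must
# account for roughly 44% of the data center's total energy use.
cooling_reduction = 0.90   # savings achieved by the redesigned cooling system
overall_savings = 0.40     # resulting savings on the whole facility
implied_cooling_share = overall_savings / cooling_reduction
print(f"Implied cooling share of total energy: {implied_cooling_share:.1%}")
```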

To show what is realistically possible for data-center efficiency, it is worth looking at the PUE of one of the world’s largest data producers and consumers: Google. Google’s ultra-efficient data centers, with a PUE of 1.12, beat the PUE curve by miles. Google achieved this by optimizing the energy design of its data centers from all angles, including cooling, electrical losses, and advanced computing techniques to improve server power use. Here is a graph outlining Google’s data center infrastructure:

Google’s Data Center Energy Infrastructure – Source: Google Inc

Cross-posted from BERC Blog, published online by the Berkeley Energy & Resources Exchange, a network of UC Berkeley scholars and industry professionals.

Comments on "How Bit Met Watt":
    • Vivek Nerlikar

      Hi, Santiago: A wonderful article. For starters, the connection between bit and watt actually explains for the layman how energy consumption per person is increasing.

      I manage a data center and can tell you that managing PUE like Google is almost impossible, especially if you are located in the tropical belt. As indicated in Koomey’s research, a PUE of ~1.9 is more realistic.

