
Understanding Data Center Energy Consumption [2021]


Think of data centers as the brains of the internet—they process, store, and communicate all of the data behind the services we rely on every day, from social media to scientific computing. 

To do this, data centers rely on a range of IT devices, all of which consume significant energy. For example, servers perform computations and provide the logic to respond to information requests, storage drives hold the data and files needed to meet each request, and network devices connect the entire data center to the internet, enabling the incoming and outgoing flows of data.

The electricity that these IT devices use ultimately turns into heat, which the data center must remove by using cooling equipment that also consumes energy.

Related: What Data Centers Do and Why

Electricity Use in Data Centers

On average, cooling systems and servers account for most of the energy consumption in data centers, followed by network devices and storage drives. 

Per the US Department of Energy, the largest data centers, with tens of thousands of devices, require over 100 MW of power, which is enough to supply approximately 80,000 households.
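As a quick sanity check on that figure, here is the arithmetic, assuming an average household draws roughly 1.2 kW continuously (a commonly cited ballpark, not a number from the source):

```python
# Back-of-the-envelope check on the 100 MW figure (illustrative only).
# Assumption: an average household draws roughly 1.2 kW continuously
# (~10,500 kWh/year), a commonly cited ballpark value.

data_center_power_mw = 100       # large data center, per the DOE figure above
avg_household_draw_kw = 1.2      # assumed average continuous household load

households_powered = (data_center_power_mw * 1000) / avg_household_draw_kw
print(f"{households_powered:,.0f} households")
```

The result lands near 83,000 households, consistent with the approximately 80,000 quoted above.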

As internet usage increases, so does the demand for data center services, raising concerns about growing energy use. According to this study, from 2010 to 2018, global IP traffic increased by ten times, while data center storage capacity increased by 25 times. Over that same period, the total compute instances running on servers across the globe increased by more than six times.

We expect these growth trends to continue as the world keeps consuming more and more data. Moreover, with new forms of services like AI, which are computationally intensive, demand for IT services may grow even faster. That’s why it’s crucial to be able to quantify and project data center energy consumption.

Two Methods of Estimating Data Center Energy Consumption

Because no official statistics are compiled on national or global data center energy consumption, we have to rely on mathematical models to estimate energy usage. "Bottom-up" models tally the IT devices installed in data centers and the energy use characteristics of each device class to estimate total consumption. This approach is less common because it is very data- and time-intensive. The most recent authoritative bottom-up study, published in 2011, estimated that data centers were responsible for just over 1% of global electricity use in 2010.
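A minimal sketch of the bottom-up method looks like the following. All device counts, power draws, and the PUE value here are illustrative assumptions, not figures from any real survey:

```python
# Minimal sketch of a bottom-up estimate: total energy = sum over device
# classes of (installed stock x average power draw x hours), scaled up by
# PUE to account for cooling and power-delivery overhead.
# All numbers below are illustrative, not real survey data.

HOURS_PER_YEAR = 8760
pue = 1.6  # assumed power usage effectiveness (total energy / IT energy)

# device class: (installed stock, average power draw in watts)
device_stocks = {
    "servers": (10_000_000, 250),
    "storage": (30_000_000, 10),
    "network": (5_000_000, 30),
}

it_energy_kwh = sum(count * watts / 1000 * HOURS_PER_YEAR
                    for count, watts in device_stocks.values())
total_energy_twh = it_energy_kwh * pue / 1e9
print(f"{total_energy_twh:.0f} TWh")
```

The data-intensive part is not the arithmetic but filling in accurate stocks and power draws for every device class, which is why such studies appear only sporadically.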

Extrapolation-based models, on the other hand, estimate total energy use by scaling bottom-up values with market growth indicators such as data center investment and global IP traffic. Because these models are much simpler, they are typically used to fill the gaps between the more sporadic bottom-up studies.
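The extrapolation method can be sketched in a few lines. The baseline and the efficiency-gain rate below are illustrative assumptions, not values from any specific published model:

```python
# Illustrative sketch of an extrapolation-based estimate: scale a
# bottom-up baseline by a market growth indicator. The baseline and the
# efficiency-gain rate are assumptions for illustration only.

baseline_energy_twh = 194    # assumed 2010 bottom-up baseline (illustrative)
traffic_growth_factor = 10   # global IP traffic grew ~10x from 2010 to 2018

# A naive extrapolation ties energy use directly to traffic growth:
naive_estimate_twh = baseline_energy_twh * traffic_growth_factor

# A less naive variant discounts for assumed annual efficiency gains:
years = 8
annual_efficiency_gain = 0.20  # assumed 20%/year improvement (illustrative)
adjusted_estimate_twh = naive_estimate_twh * (1 - annual_efficiency_gain) ** years

print(f"naive: {naive_estimate_twh:.0f} TWh, adjusted: {adjusted_estimate_twh:.0f} TWh")
```

Notice how sensitive the output is to the assumed efficiency rate: that sensitivity is exactly why extrapolation-based estimates can diverge sharply from bottom-up results, as the next section shows.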


Bottom-Up and Extrapolation-Based Models: The Results

Because market indicators grow rapidly, extrapolation models predict significant increases in data center energy consumption. For example, an often-cited extrapolation study suggests that global data center energy use has doubled since 2010 and will continue to rise swiftly. These estimates reinforced the common belief that increased demand for data leads to equally rapid growth in data center energy consumption.

However, results from the bottom-up perspective show otherwise. According to the study by Eric Masanet mentioned in the previous section, global data center energy consumption most likely rose only about 6% from 2010 to 2018. That figure is based on the integration of recent datasets that better characterize the installed stocks, operating characteristics, and overall energy use of IT devices in data centers. These results stand in stark contrast to the extrapolation-based estimates.

There are three factors that primarily affect this change in energy use:

  1. IT device energy efficiency has improved substantially because of steady technological progress by manufacturers.
  2. Server virtualization software reduces the energy intensity of hosted applications by allowing multiple applications to run on a single server.
  3. Cloud and hyperscale-class data centers utilize more efficient cooling systems to minimize their energy consumption.

Those three factors are hard to capture with an extrapolation-based approach, which is why those methods estimate more energy consumption than data centers actually use.
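The combined effect of these factors can be seen by dividing the two growth figures cited earlier, a rough ratio rather than a formal result:

```python
# Illustrative: why extrapolating from demand overestimates energy use.
# From the figures cited above: compute instances grew more than 6x from
# 2010 to 2018, while bottom-up energy estimates grew only about 6%.

compute_growth = 6.0   # compute instances, 2018 vs 2010 (more than 6x)
energy_growth = 1.06   # total energy, 2018 vs 2010 (~6% rise)

# Energy per compute instance in 2018 relative to its 2010 level:
relative_intensity = energy_growth / compute_growth
print(f"energy per compute instance fell to ~{relative_intensity:.0%} of its 2010 level")
```

In other words, the energy needed per unit of compute dropped to well under a fifth of its 2010 level, which is precisely the kind of efficiency gain a demand-driven extrapolation misses.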

Data Center Energy Use and CO2 Emissions

Energy consumption in data centers has also given rise to concerns about CO2 emissions. However, it is not yet realistic to estimate total emissions, because we lack data on both the global distribution of data centers and the emission intensities of their electricity sources. Companies like Apple, Google, and Switch are beginning to report this data publicly, though, which points to a growing trend among the largest data center operators toward renewable energy procurement.

But, since we know a lot about the global electricity use in data centers, we have a useful benchmark to test claims about CO2 emissions against. Let’s look at an example.

One often-repeated claim is that global data centers emit the same amount of CO2 as the aviation industry: about 915 million tonnes. Global data centers recently consumed about 205 billion kWh. So, for this claim to be true, their average emission intensity would have to be roughly 4.5 kg CO2/kWh. Now consider coal-fired power plants, the most carbon-intensive generators. Their emission intensity is about 1 kg CO2/kWh, less than a quarter of that implied value. Because data centers obviously don't run entirely on coal, especially in light of trends toward renewable energy use, it's unrealistic to believe that CO2 emissions from data centers are that significant.
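The sanity check above can be reproduced directly from the two figures in the text:

```python
# Reproducing the sanity check from the text: what emission intensity
# would the aviation-parity claim imply for data centers?

claimed_emissions_mt = 915     # aviation industry CO2, million tonnes
dc_energy_billion_kwh = 205    # recent global data center consumption

# million tonnes -> kg, billion kWh -> kWh, then divide:
implied_intensity = (claimed_emissions_mt * 1e9) / (dc_energy_billion_kwh * 1e9)
coal_intensity = 1.0           # ~1 kg CO2/kWh for coal-fired generation

print(f"implied: {implied_intensity:.2f} kg CO2/kWh vs coal: {coal_intensity} kg CO2/kWh")
```

The implied intensity comes out above 4 kg CO2/kWh, several times dirtier than even pure coal generation, which is why the claim fails the benchmark test.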

Related: Understanding Data Center Infrastructure

Moving Forward: Enhancing Data Center Energy Efficiency


Because demand for IT services and compute-intensive applications keeps growing, there is a risk that demand will outpace our energy efficiency gains. Investments in energy-efficient technologies like next-generation heat removal, computing, and storage will therefore be necessary to avoid steep energy consumption growth in the future. Analysts should consider a few key priorities:

  1. Developing and sharing reliable data on configurations, stocks, and energy use characteristics of cooling and power systems and IT devices.
  2. Sharing and comparing models to develop best practices to increase the confidence in model outputs.
  3. Working together to develop better practices for emerging trends like AI, 5G, and increased computing intensities.

Looking to enhance your data center’s energy efficiency? Get in touch with our data center power management experts today!
