Datacentre Energy Crisis - Are We Doomed?

Data centres are undeniably the driving force behind our digital lives today. But it would be foolish to turn our backs on the issue of how they will affect the future availability of energy. Data centres around the world consume an enormous amount of electricity to sustain our digital lifestyle, and the problem is only going to get worse.

What Is a Data Centre?

Data centre (British English) or data center (American English) is a dedicated facility that houses computers and storage systems, all connected through networking components. The facility also includes environmental and protective systems such as cooling, monitoring, security and fire suppression to keep the equipment and data safe. All data centres deploy some form of redundancy, such as redundant computing resources through clustering, backup power through generators and multiple utility power feeds.

Function Of A Data Centre

It is easy to understand what a data centre is, but what does it actually do? Think about your Facebook account, for example. You just shared your latest vacation pictures with your friends and family. Those pictures and captions are whisked off at super speed to a big-box facility known as a Facebook data centre. Once there, your data takes up permanent residence on one or more of thousands of servers and even more hard drives. Every time someone views your content, they are actually retrieving it from one of these giant data centres.

But this is just one of your many online accounts. What about the other parts of your digital life, such as Instagram, Twitter, Pinterest, YouTube, WordPress, online banking, virtual servers, cloud storage, accounting software, etc.? You guessed it: all of your digital service providers operate some form of data centre somewhere in the world that serves your content on demand. In a way, each data centre is a storehouse of digital knowledge.

Feeding The Beast

It's no surprise that without electricity, data centres are useless. No power means no servers running and no hard drives spinning. If that were to happen, the next time you went to post pictures of an amazing dinner at your favourite restaurant, you might hit a brick wall with a “Page Not Found” error, because the Facebook service would simply no longer exist. Simply put, data centres are among the biggest consumers of electricity today. It takes a tremendous amount of power to keep the servers and all the supporting components in a data centre running. Hard to believe? Let's look at some numbers.

Electricity consumption is measured in kWh. A kilowatt hour (kWh) is a unit of how much energy is used over a period of time, and it is what utility companies use to bill their customers each month. The formula to calculate the daily kWh of a device is:

( device watts * number of hours per day ) / 1000 = kWh per day

Device watts is the power rating of the device in watts; almost every electrical device has this value printed on its label. As long as we know the wattage and how many hours the device operates per day, we can easily calculate its kWh using the formula above.

A typical server consumes between 600 and 1,200 watts of power, so let's say a server draws 900 watts on average. If the server runs 24/7, its monthly usage will be 648 kWh based on the formula:

( 900 * 24 ) / 1000 = 21.6 kWh per day; 21.6 kWh * 30 days = 648 kWh per month

So, a data centre with 30,000 servers will gobble a whopping 30,000 * 648 = 19,440,000 kWh each month! And that does not even include the supporting components that keep a data centre running, such as the cooling system and networking gear.
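The worked example above can be sketched in a few lines of code. The figures (a 900 W server running 24/7, a 30-day month, 30,000 servers) come straight from the text; the function name is just for illustration.

```python
# Monthly energy use, following the article's formula:
# (watts * hours per day * days) / 1000 = kWh.
def monthly_kwh(device_watts, hours_per_day, days=30):
    """Energy used over a month, in kWh."""
    return device_watts * hours_per_day * days / 1000

server_kwh = monthly_kwh(900, 24)       # 648.0 kWh per server per month
datacentre_kwh = server_kwh * 30_000    # 19,440,000 kWh for 30,000 servers
print(server_kwh, datacentre_kwh)
```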

To put that in perspective, a typical North American household uses about 950 kWh per month (check your electric bill to see how many kWh yours uses). The power a 30,000-server data centre consumes could supply more than 20,000 homes. The output of an electrical power plant, on the other hand, is measured in megawatts (MW):

watts / 1000 / 1000 = MW

A rating of X MW means the plant can supply that amount of power continuously. So, using our 30,000-server example and the megawatt formula, we can easily work out how much plant output must be dedicated to keeping the data centre running at all times:

( 30,000 * 900 ) / 1000 / 1000 = 27 MW

To keep our example data centre running, the power plant needs to supply 27 MW of power at all times. These figures are actually conservative, as many data centres hold far more than 30,000 servers. There are about 450 large-scale data centres around the world and many hundreds of smaller ones. As of this writing, the largest data centre, “The Switch”, being built at Tahoe Reno, Nevada, will use 850 MW of power when fully occupied. That is feeding the beast on a massive scale indeed.
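The conversion above is simple enough to check directly: 30,000 servers at 900 W each, divided down from watts to megawatts, per the article's formula.

```python
# Continuous draw of the example data centre, converted to megawatts.
servers = 30_000
watts_each = 900

total_watts = servers * watts_each      # 27,000,000 W
megawatts = total_watts / 1000 / 1000   # 27.0 MW, matching the article
print(megawatts)
```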

Global Internet Revolution

A few things are revolutionizing the Internet as we know it today. They are also the catalysts for accelerated data centre growth, which will, in turn, increase energy consumption.

Internet-connected Devices

The recent explosive growth of Internet-connected mobile devices is going to test the limits of our beloved Internet. In the blink of an eye, we can send our content anywhere in the world. Over the last 30 years, the amount of traffic traversing the Internet has grown at a staggering rate. Just take a look at the following figures:

In 1987, only 2 terabytes of data traversed the Internet, smaller than a typical desktop hard drive today. Just ten years later, that traffic had grown to 60 petabytes, or 60,000 terabytes. By 2007 it had swelled to 50 exabytes (50,000,000 terabytes), and in 2017 it passed 1 zettabyte (1,000,000,000 terabytes). Based on this growth trend, it is not hard to predict that by 2027 traffic could surpass 1 yottabyte.
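To make the thousand-fold jumps between those units easier to see, here are the article's traffic figures normalised to terabytes using decimal (SI) prefixes:

```python
# Decimal (SI) data units, expressed in terabytes.
TB = 1
PB = 1_000 * TB
EB = 1_000 * PB
ZB = 1_000 * EB

# Internet traffic per the article, normalised to TB.
traffic_tb = {
    1987: 2 * TB,      # 2 TB
    1997: 60 * PB,     # 60,000 TB
    2007: 50 * EB,     # 50,000,000 TB
    2017: 1 * ZB,      # 1,000,000,000 TB
}
```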

We can only imagine where it will be in another decade. We take billions of pictures a day, post millions of videos and dump everything onto the so-called cloud. No, we are not talking about the clouds floating through the sky; going to the digital cloud has become the norm today. Every cloud service is really backed by the data centres that collectively make up the global Internet.

Besides mobile devices, the Internet of Things (IoT) is set to become the second-largest consumer of the Internet. Simply put, IoT devices are everyday objects, sensors or other embedded electronic devices with Internet connectivity. They can talk to each other, collect data and perform monitoring tasks, all through the Internet. Smart homes are a prime candidate for mass IoT implementation; a typical house can have at the very least 15 IoT devices. Globally there are already more than 10 billion IoT devices, and that number is set to double by 2021. The infrastructure to support IoT devices is already falling into place, and IoT will soon become a household buzzword.

Plugging In Developing Countries

The birth and continuous evolution of the Internet has brought the world closer than ever before. More and more developing countries are getting plugged in as mobile devices get cheaper and their telecommunications infrastructure is upgraded. A report published by the International Telecommunication Union shows that 43.9% of the world’s population is still without Internet access. That percentage has already started to shrink much faster than expected as developing countries take steps to make the Internet available to their citizens.

Digital Currency Adoption

Although the hype around blockchain and cryptocurrency may have died down, crypto is here to stay. Among the 1,000+ types of crypto coins available right now, Bitcoin is still considered the king of them all. There are several ways to generate these digital currencies, and mining is one of them. A Bitcoin mining machine requires a tremendous amount of power to generate bitcoins. During the hype, when many people started mining coins for profit, the unprecedented spike in energy consumption caught the energy sector by surprise.

To support many thousands of crypto mining operations, massive data centres have been built or are in the process of being built. If the crypto industry continues to grow, we can expect the energy crisis to come knocking at our door much sooner than anticipated.

Cool Me Down

No data centre talk is complete without the topic of cooling. About 40% of a data centre's energy is consumed by some form of cooling system keeping thousands of servers cool. Heat is a by-product of almost all electrical devices, and servers are no different. Excessive heat shortens the life expectancy of server hardware and reduces performance. It is true that newer servers are built more efficiently and produce much less heat, but they produce it nonetheless.

Most data centres still use good old air to cool their servers, and an HVAC system gobbles a large quantity of energy just to keep itself running. Usually, cool air is pushed up through the raised floor in front of the server racks, and the servers draw it in to cool themselves. Hot air exits through the back of the servers as exhaust, where it gets pulled out of the rack enclosures. This air-based approach cools not just the critical components but basically everything. Do we really need to cool down the rack frame itself?

The Solution

Now that we have a wide picture of the energy issue, it is our responsibility to do something about it. It would be wrong to say that no one is working on this energy crisis: new innovations are taking place, upgrades are being made, data centre builders are getting educated and governments have started to talk about it. We may not have a bullet-proof solution, but there are a few things we can do to minimize the impact of the data centre energy crisis.

Liquid Cooling

As mentioned already, the old air-based cooling system is no longer viable for the future of the data centre. Liquid cooling may just provide a way to cool servers while cutting energy consumption significantly. Its biggest advantage is that it can be targeted at the specific areas inside the server that need cooling, the CPU and memory, for example. On top of that, liquids have a much better heat transfer capacity than air, which makes them extremely efficient.

Liquid cooling is not really a new technology; it has been used in the computer industry for a while now. We have all heard of, or seen, those liquid-cooled gaming rigs. Cooling fluid is carried to the hottest components through a tube system, and the heated fluid is then pumped to a radiator to cool down. A special block made of alloy metal replaces the standard aluminium heat sink and fan, and it is this block that transfers heat to the fluid. The pump cycles the fluid through the system continuously, cooling the components.

There are several options when it comes to cooling fluid. Water is the best in terms of heat transfer, but it is also problematic should the loop ever leak: water is highly conductive and can easily short out electronic components. Dielectric fluids such as mineral oil are the next best option.

White mineral oil is a non-conductive fluid with really good heat transfer ability. Other than getting a little messy, there is nothing to worry about if it leaks. Because mineral oil is non-conductive, it is even possible to submerge an entire server into a tank of it for cooling. The oil is pumped out of the tank, cooled with a radiator or evaporative cooling tower, then returned to the tank. Green Revolution Cooling seems to have perfected this type of immersion cooling system.

Novec Engineered Fluid from 3M is another choice for immersion cooling. In this system, servers are also submerged in the fluid; the difference is that when the fluid comes in contact with high heat, it boils into a gas and rises through the tank. When the gas hits the water-cooled coils at the top of the tank, it condenses back into liquid droplets that fall back into the tank. The biggest benefit of this fluid is that even though it comes into direct contact with components, it leaves absolutely no residue. A server pulled out of a Novec-filled tank looks as if it was never submerged.

Regardless of which fluid or type of liquid cooling system is used, the benefits are the same. Servers run much cooler and trouble-free for longer; in a submerged system especially, which cuts off dust and reduces oxygen exposure, servers can last at least twice as long. And since a liquid cooling system needs no fans, a data centre can run in pin-drop silence.

Liquid cooling also increases server density per rack significantly, which reduces the size of the data centre required for operations. It is extremely difficult, if not impossible, to achieve maximum rack density with a traditional air cooling system.

Renewable Energy

Energy derived from natural sources is known as renewable energy; a few examples are solar, wind and tidal power. The talk of renewable energy is as old as the topic of global warming, and some politicians, environmentalists and activists have made it a priority to push it forward. But it would appear that even with technological breakthroughs, we are still far behind on alternative energy. Call it an oil conspiracy or governments turning a blind eye, the fact remains that an all-out switch to renewable energy is not happening anytime soon.

Progress is taking place, for sure. People are a lot more open to discussing and switching to renewable energy, and more and more businesses are adopting some form of it. Energy producers are also slowly but surely building more solar and wind farms and offering their customers alternative options.

The biggest obstacle to alternative energy is the initial cost. Although in the long run the system can pay for itself, the initial investment to deploy a fully sustainable system still comes with a high price tag. With all the technological advances of the last several years, it is hard to believe the price of alternative energy has not dropped proportionally.

If data centre owners do not start moving towards renewable energy to power the cloud, we will indeed be doomed in the coming future. Newer energy-efficient servers combined with the lower energy requirements of liquid cooling make a perfect starting point for mass adoption of renewable energy in data centres. The Switch in Nevada is projected to run on 100% renewable energy, all 850 MW of it.

Hyperscale Data Centre

In layman's terms, hyperscale data centres (HDCs) are a new breed of data centre able to scale at any time. HDCs focus not on the size of the facility but on the resources tenants use. For example, a standard data centre thinks about how much rack space a tenant occupies, whereas a hyperscale operator counts how many kilowatts of power the tenant draws. In this scenario, a tenant may be using three racks of server equipment but will be charged by electricity usage, not rack count.

Each rack in an HDC has a certain amount of electricity allocated to it; the Switch in Tahoe Reno, Nevada mentioned earlier will allocate 55 kW per rack. Everything in an HDC is based on a modular design, which is what allows ease of scalability at massive scale. It is also more cost- and energy-efficient to run one massive data centre than many smaller ones. Automation is heavily leveraged; surprisingly, an HDC employs more on-site security staff than technical personnel, since any part of the facility can be monitored and managed remotely. Most HDCs are colocation facilities, allowing tenants to host their IT equipment while shedding the overhead of running their own data centre.
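The power-based billing model can be sketched as follows. The 55 kW-per-rack allocation comes from the article's Switch example; the tariff and function name are hypothetical placeholders, not any real operator's pricing.

```python
# Sketch of hyperscale billing: tenants pay for power drawn, not racks.
RACK_ALLOCATION_KW = 55   # allocated power per rack (from the article)
RATE_PER_KWH = 0.08       # hypothetical tariff in $/kWh

def monthly_charge(avg_kw_drawn, hours=24 * 30):
    """Charge for energy actually drawn over a 30-day month."""
    return avg_kw_drawn * hours * RATE_PER_KWH

# A tenant spread across 3 racks but averaging 40 kW pays for 40 kW,
# no matter how many racks that hardware occupies.
racks = 3
assert 40 <= racks * RACK_ALLOCATION_KW   # draw stays within allocation
bill = monthly_charge(40)                 # 40 kW * 720 h * $0.08
```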


So, are we doomed? Only if we do not start doing something about it. There are several options on the table right now to reduce the energy footprint of data centres on our planet. Some solutions require a large initial investment, while others just need a forward-thinking decision maker to act on the low-hanging fruit.

We already have enough problems in this world, with hunger, poverty and political conflict to name a few. An energy crisis is certainly not going to make things easier. But human beings are extremely resourceful creatures, a marvel of creation. Someone somewhere will come up with an ingenious idea that solves the data centre energy crisis for a very long time. As long as we keep talking about the issue and taking steps in the right direction, we can rise above this potential crisis.