Making the right choices for data centre cooling


29 August 2024

By Richard Creber, Regional Head of Data Centre Operations, Pulsant.

 

It’s hard to overstate modern society’s reliance upon data centres. The remote working revolution, the international supply chain, and literally the entire internet all depend upon a healthy network of data centres.

 

Of course, this reliance and the sheer volume of use together make data centres a huge producer of emissions. The IEA has estimated that 1-1.5% of global electricity goes to data centres alone, and the UK uses around 2.5% of its electricity output on data centres according to the National Grid ESO – with up to 90% of that potentially wasted as equipment idles while awaiting commands.

 

That makes data centres an incredibly important target for sustainability measures. A data centre’s performance is quantified through its Power Usage Effectiveness (PUE) rating: the total amount of power the facility draws, divided by the amount of power consumed by its IT equipment alone.

 

The ‘holy grail’ of sustainability is a PUE of 1.0: if a data centre needed no energy for anything except IT, it would be as lean as possible, and would therefore waste the least energy.
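To make the ratio concrete, here’s a minimal sketch in Python – the facility figures are hypothetical, purely for illustration:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power divided by IT power."""
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# A hypothetical facility drawing 1,400 kW in total to run a 1,000 kW IT load:
print(round(pue(1400, 1000), 2))  # 1.4 -- 400 kW of overhead per 1,000 kW of useful IT load
```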

 

The question thus becomes how close to that perfect score a data centre can get. The Climate Neutral Data Centre Pact aims to deliver a PUE of 1.4, while some countries are more stringent; Germany’s Energy Efficiency Act, for example, will require a PUE of 1.2 for new data centres from 2026.

 

Given that cooling accounts for around 40% of a data centre’s total energy requirements, according to Deloitte, reducing energy use means cooling more efficiently. Temperature management is fundamental both to keeping machinery operating at optimal temperatures and to protecting the wider environment from all the heat that such an energy-intensive operation produces.
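To see how heavily cooling weighs on that figure, here’s a back-of-the-envelope sketch; it takes the 40% cooling share quoted above and optimistically assumes every remaining watt reaches the IT equipment:

```python
# If cooling consumes a given fraction of total facility power, and we
# (optimistically) assume all remaining power reaches IT equipment, the
# best achievable PUE is 1 / (1 - cooling_fraction).
def best_case_pue(cooling_fraction: float) -> float:
    return 1 / (1 - cooling_fraction)

print(round(best_case_pue(0.40), 2))  # 1.67 -- already above the Pact's 1.4 target
print(round(best_case_pue(0.15), 2))  # 1.18 -- within reach of Germany's 1.2 limit
```

In other words, a site where cooling takes 40% of the power cannot meet the 1.4 target even if nothing else draws a watt – the cooling share itself has to come down.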

 

It’s at this point that we turn to the importance of air containment – a fundamental part of treating air in the data centre, both in maximising the efficiency of the air used for cooling and in avoiding the recirculation of hot air.

 


 

Blowing hot and cold

 

Just as it sounds, air containment in a data centre means isolating air between the inlets and outlets of a server rack. Ensuring that hot and cold air doesn’t mix in the data centre is extremely important – you can imagine the reaction a client might have if you’re accidentally firing warm air into the front of their expensive server rack!

 

There are two approaches to this:

 

  1. Hot air containment. This method involves funnelling hot air from the racks into cooling heat exchangers, so that a much larger volume of cold air can be drawn in by the rack at its own pace. The hot air travels through purpose-made trunking, before being cooled using ‘free cooling’ to its fullest extent, rather than employing more energy-demanding methods.

 

  2. Cold air containment. This approach involves trunking cold air from those same cooling heat exchangers into the inlet side of the server rack or data cabinet. The equipment is thus isolated from the hot air being exhausted from the rack, with that heat funnelled away.

 

Generally speaking, hot air containment is considered the more effective option. If hot air can be contained, the remainder of the space around the racks and servers provides an abundance of conditioned air for those racks to pull in as required, protecting them against any localised ‘starvation’ of cool air.

 

But that’s in a perfect world – and data centre management is very often about working with what you have, rather than what you would like.

 

So, what are the factors we need to consider when choosing how to cool our data centres?

 

Raise the roof?

 


Well, for one, most data centres are in buildings that already exist. Of all the buildings that will be in use between now and 2050 – when most net zero targets are supposed to have been met – the vast majority have already been built. Given how carbon-intensive construction can be, there’s a pragmatic and moral obligation to see how far we can stretch what we’ve got.

 

Depending on the circumstances, that can necessitate hot or cold containment purely based on the structure of a building itself.

 

For example, it was standard practice for decades to build data centres with a raised floor. These floors were originally included to accommodate cabling rather than air distribution – yet using that underfloor void to deliver conditioned air to the inlet side of servers and racks is now commonplace. An accident of infrastructure suddenly looks purpose-built for cold aisle containment, representing both a cost-saving opportunity and a means of sidestepping intensive construction.

 

On the other hand, a raised floor inevitably supports less weight than solid concrete. Large companies can expect to simply wheel an entirely pre-populated server rack weighing in excess of a ton straight off the truck. A hot aisle containment data centre with a floor of solid concrete would have no problem – but if the load-bearing capacity of a server room with a raised floor is maxed out, the limitations of a cold containment strategy could be incredibly expensive!

 

So the choice between hot and cold aisle containment is often dictated by construction constraints. From there, it’s about finding ways to tighten up the chosen method in order to deliver the most effective results.
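As a deliberately simplistic illustration of those heuristics – the inputs and thresholds below are assumptions, and no sketch replaces a structural survey:

```python
# Toy decision helper reflecting the trade-offs above. Thresholds are
# illustrative assumptions only, not engineering guidance.
def suggest_containment(has_raised_floor: bool,
                        floor_load_limit_kg: float,
                        heaviest_rack_kg: float) -> str:
    if has_raised_floor and heaviest_rack_kg <= floor_load_limit_kg:
        # The underfloor void can deliver conditioned air to rack inlets.
        return "cold aisle containment"
    if not has_raised_floor:
        # A solid slab takes pre-populated racks; contain the hot exhaust instead.
        return "hot aisle containment"
    return "structural review needed before committing"

print(suggest_containment(has_raised_floor=True, floor_load_limit_kg=1500, heaviest_rack_kg=900))
print(suggest_containment(has_raised_floor=False, floor_load_limit_kg=5000, heaviest_rack_kg=1200))
```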

 

Consider all the variables

 

There are plenty of variables in that process. While the building itself might necessitate some choices, data centre providers are much more likely to have agency over the equipment they choose to use.

 

Fans are imperative to both hot and cold air containment, but modern solutions bring more control and versatility – for instance, fans whose speed can ramp up or down depending on the temperature of the equipment they’re cooling, balancing the need to cool that equipment right now with the wider need to save energy.
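Here’s a minimal sketch of that temperature-proportional behaviour; the setpoints are hypothetical, and real variable-speed (EC) fan controllers implement far more sophisticated control loops:

```python
# Ramp fan speed linearly from a quiet baseline at the target inlet
# temperature up to 100% at the maximum allowed temperature.
# All setpoints are illustrative assumptions.
def fan_speed_pct(inlet_temp_c: float,
                  target_c: float = 24.0,
                  max_c: float = 32.0,
                  min_speed: float = 20.0) -> float:
    if inlet_temp_c <= target_c:
        return min_speed  # equipment is cool enough; save energy
    if inlet_temp_c >= max_c:
        return 100.0      # flat out until the inlet temperature recovers
    span = (inlet_temp_c - target_c) / (max_c - target_c)
    return min_speed + span * (100.0 - min_speed)

for temp in (22, 26, 30, 33):
    print(f"{temp}°C -> {fan_speed_pct(temp):.0f}% fan speed")
```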

 

That’s not just true of the equipment around the racks being cooled, but right across the facility. Imagine a 1,000sqm hall – and then fill it with strip lights every 4 feet or so. Those 800 strip lights, all of them radiating heat onto a set spot and burning energy to boot, will have an impact on how cool and how sustainable a data centre can be. Deploying passive infrared (PIR) sensors – or the more recent and more effective microwave-controlled lights – in these spots, so they’re only on when they detect movement within a certain timeframe, could drastically rein in both the overall energy cost and the heat from each individual light.
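A rough, purely illustrative estimate of what occupancy-controlled lighting could save in that hypothetical hall – the wattage and occupancy figures are assumptions, not measured data:

```python
# Back-of-the-envelope saving from occupancy-controlled lighting.
LIGHTS = 800               # strip lights in the hypothetical 1,000 sqm hall
WATTS_EACH = 50            # assumed draw per fitting
HOURS_PER_YEAR = 24 * 365

always_on_kwh = LIGHTS * WATTS_EACH * HOURS_PER_YEAR / 1000
occupied_fraction = 0.05   # assume someone is near any given light ~5% of the time
sensor_kwh = always_on_kwh * occupied_fraction

print(f"Always on:     {always_on_kwh:,.0f} kWh/year")
print(f"Sensor-driven: {sensor_kwh:,.0f} kWh/year")
print(f"Saving:        {always_on_kwh - sensor_kwh:,.0f} kWh/year, plus the avoided heat")
```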

 

The same applies to the materials and the fuels that data centres use. Insulation is the name of the game when it comes to data centre design, so operators are looking for sustainable materials that also achieve the level of insulation required. AWS is using lower-carbon steel to construct new data centres, while here at Pulsant, we’ve committed to transitioning to Hydrotreated Vegetable Oil (HVO) and phasing out diesel. These can be small changes – but they’re also a corporate responsibility we must own.

 

It’s equally important to consider the physical location of the data centre, and what that means for cooling. A data centre in Iceland, for example, is blessed with both cold ambient temperatures and access to a wealth of renewable energy, particularly hydropower. Compare that to data centres in Southern Europe, where climate change is pushing temperatures beyond 40°C with alarming regularity – data centres not designed to function in that climate may need greater volumes of less environmentally friendly energy to keep running.
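As a crude illustration of why geography matters for ‘free cooling’, this sketch compares how often outside air alone could do the job; the sample temperatures and the 18°C threshold are rough assumptions, not real weather data:

```python
# Fraction of hours in which outside air alone could cool the facility.
FREE_COOLING_BELOW_C = 18  # assumed ambient ceiling for full free cooling

def free_cooling_share(hourly_ambient_c: list) -> float:
    usable = sum(1 for t in hourly_ambient_c if t < FREE_COOLING_BELOW_C)
    return usable / len(hourly_ambient_c)

# Crude stand-ins for a cold and a hot climate (not real weather data):
cold_climate_sample = [5, 7, 9, 11, 12, 10, 8, 6]
hot_climate_sample = [24, 28, 33, 38, 41, 36, 30, 26]

print(f"Cold climate: free cooling available {free_cooling_share(cold_climate_sample):.0%} of the time")
print(f"Hot climate:  free cooling available {free_cooling_share(hot_climate_sample):.0%} of the time")
```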

 

Hotter periods and more frequent winds and storms place increased demands on building fabric and on network and power resilience, so cooling systems must be designed to avoid regularly falling back on emergency measures.

 

Lower the impact

 

Individual site action plans to achieve the required improvements are crucial. These should cover the work required on infrastructure as well as the strengthening of site teams’ skills and processes – people are powerful in embedding sustainability into the way sites operate. Meanwhile, introducing cooling procurement standards can be an effective way to ensure early adoption of low global warming potential (GWP) systems when they are appropriate and available.

 

The gist of all this is that data centre operators have to take a huge amount into account when choosing between hot and cold air containment. From the fuel of choice to the country itself, many of the deciding factors are out of their immediate control.

 

That reflects the challenge we face as an industry when it comes to sustainability. With data centres only becoming more important as we move forward, we have to be able to maximise the effectiveness of what’s available to us – both in the data centre, and with the planet at large.

 

https://www.pulsant.com/