I Know What You Did Last Summer
Has Thames Water’s latest announcement on water restrictions sent a chill down the spine of London data centre operators?
A recent announcement by water utility Thames Water has potentially given London data centre operators a headache: the utility is giving serious consideration to restricting their use of water at peak times.
Wet weather in the UK in July has done little to dampen the discourse around data centre water use, with extreme heat waves ravaging much of Europe and droughts prevalent in the US. Water is a resource under increasing stress, with global fresh water demand set to outstrip supply by 40% by 2030.
Thames Water is concerned that any UK heatwave could see a repeat of last summer, when data centres across the capital were forced to supplement their cooling systems with additional water spray to keep equipment operational. Depending on cooling methods, large data centres could potentially use between 1 million and 5 million US gallons of water a day (between 3.78 million and 18.92 million litres), according to estimates.
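For context, the conversion behind those figures is simple arithmetic (a minimal sketch; the 1 to 5 million gallon range is just the estimate quoted above):

```python
# Convert the quoted daily water-use estimates from US gallons to litres.
LITRES_PER_US_GALLON = 3.78541

for gallons in (1_000_000, 5_000_000):
    litres = gallons * LITRES_PER_US_GALLON
    print(f"{gallons:,} US gal/day ≈ {litres / 1e6:.1f} million litres/day")
```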
All this comes as the European Commission will, from March 2024, require operators to publicly report wide-ranging data about their energy and water use, putting further pressure on data centres to think carefully about their cooling systems.
So what can operators tap into in order to stay ahead of increasing water pressure? Airedale’s Global Product Manager for chillers, Patrick Cotton, has three pieces of advice:
1. Increase the Design Ambient Temperature of Chillers
Chillers are the workhorse of any data centre cooling system, operating outdoors all year round. For much of the last ten years, chillers have been specified to a 35°C design ambient, meaning they are rated to operate in outdoor temperatures up to 35°C. Unfortunately, this design condition is becoming outdated, as summer peaks beyond that threshold become more commonplace. Data centre operators are left with no option but to try to reduce the air temperature reaching the condenser coils with additional water spray.
At Airedale, chillers heading from our plants to data centre clients are almost always designed and specified to above 38°C ambient, with operation up to 45°C, and sometimes 50°C, even in Europe. Increasing these thresholds is achieved by improving the design of the chiller, with consideration given to mechanical, electrical and controls upgrades, such as:
- Provision of enhanced control panel cooling.
- Uprated electrical componentry.
- Larger-diameter fans specified to increase condenser airflow.
- Increased pressure ratings to widen operational envelope.
- Intelligent head pressure management in normal operation.
- Cooling System Optimiser controls platform to utilise all available chillers, even those providing redundancy, reducing the maximum load on individual chillers during times of peak ambient (a simplified sketch of this idea follows below).
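To illustrate that last point, here is a minimal sketch of how sharing load across an N+1 fleet reduces the duty each chiller carries at peak ambient. The figures and the `per_chiller_load` helper are hypothetical, for illustration only, and are not Airedale’s actual Optimiser logic:

```python
def per_chiller_load(total_load_kw: float, duty_chillers: int,
                     redundant_chillers: int = 0) -> float:
    """Evenly share the facility cooling load across all running chillers."""
    running = duty_chillers + redundant_chillers
    return total_load_kw / running

TOTAL_LOAD_KW = 4_000  # hypothetical facility cooling load

# Conventional approach: the redundant (N+1) unit sits idle.
print(per_chiller_load(TOTAL_LOAD_KW, duty_chillers=4))  # 1000.0 kW per chiller

# Optimiser approach: run the redundant unit too during peak ambient.
print(per_chiller_load(TOTAL_LOAD_KW, duty_chillers=4,
                       redundant_chillers=1))  # 800.0 kW per chiller
```

Running every chiller at lower part load keeps head pressures down on each unit, which is precisely what matters when outdoor temperatures spike.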
2. Intelligent Facility Design
Data centre facilities are laid out primarily with the white space in mind, but care also has to be taken over the size and layout of the chiller compound. Chillers themselves reject heat, and they are often positioned too close to one another, reducing natural air circulation and creating a microclimate in the compound. This can result in a significant difference between the ambient temperature and the onto-coil temperature: on typical data centre sites there is a 2°C uplift due to recirculation, but we have seen uplifts of 6 or 7°C where space between chillers is tight. This increases the number of hours the chiller is exposed to temperatures outside its comfortable operating envelope, and therefore its need for supplementary water. As well as providing sufficient gaps between chillers, blanking plates can be used to reduce air recirculation onto condenser coils.
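As a rough illustration of why uplift matters, the sketch below counts how many hours a year onto-coil temperatures would exceed a 35°C design ambient for different recirculation uplifts. The hourly temperatures are randomly generated stand-in data; a real assessment would use a site-specific weather file:

```python
import random

random.seed(1)
# Stand-in for a year of hourly ambient temperatures (°C); a real
# assessment would use recorded site weather data instead.
ambient_hours = [random.gauss(12, 8) for _ in range(8760)]

DESIGN_AMBIENT_C = 35.0

for uplift in (0.0, 2.0, 7.0):  # open site, typical site, tightly packed compound
    hours_over = sum(1 for t in ambient_hours if t + uplift > DESIGN_AMBIENT_C)
    print(f"Uplift {uplift:.0f}°C: {hours_over} h/yr above the {DESIGN_AMBIENT_C}°C design ambient")
```

Even a few degrees of recirculation uplift multiplies the hours a chiller spends outside its design envelope, and every one of those hours is a potential call on supplementary water.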
3. Intelligent Adiabatic Systems
Where additional water is required, perhaps on legacy sites where chillers were installed before designing to higher ambient temperatures became the norm, it is important that the adiabatic systems installed are as water-conservative as possible, with intelligent controls used to optimise their use.
The key is a strategy that only activates adiabatic cooling when absolutely necessary, in place of one that runs the adiabatic system to boost overall efficiency. After all, water is as valuable a resource as power, if not more so.
In London, this could mean the number of hours a year that adiabatic cooling is deployed runs into the tens, compared to hundreds of hours if it were being used as an energy-efficiency boost all year round.
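A minimal sketch of the gap between the two strategies, using hypothetical activation thresholds and randomly generated stand-in weather data:

```python
import random

random.seed(2)
# Stand-in for a year of hourly ambient temperatures (°C) in a
# London-like climate; real data would come from a weather file.
ambient_hours = [random.gauss(12, 7) for _ in range(8760)]

# Strategy A: spray only when the chiller would otherwise exceed its envelope.
LAST_RESORT_THRESHOLD_C = 35.0
# Strategy B: spray whenever it improves chiller efficiency.
EFFICIENCY_THRESHOLD_C = 22.0

last_resort_hours = sum(t > LAST_RESORT_THRESHOLD_C for t in ambient_hours)
efficiency_hours = sum(t > EFFICIENCY_THRESHOLD_C for t in ambient_hours)

print(f"Spray as a last resort:       {last_resort_hours} h/yr")
print(f"Spray as an efficiency boost: {efficiency_hours} h/yr")
```

With numbers like these, the two counts differ by orders of magnitude, and so does the water bill.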
Finally, the amount of temperature suppression being targeted also matters. Targeting high levels of suppression requires more water and typically also leads to “over-spray”, where proportionally more water has to be sprayed than is actually absorbed into the air, with the excess going to waste.
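A first-order estimate shows why. Assuming all sprayed water evaporates (i.e. no over-spray at all) and standard properties for air and water, the water required scales linearly with the suppression target:

```python
# Energy balance: sensible heat removed from the air equals the latent
# heat absorbed by the evaporating water, so m_water / m_air = cp * dT / h_fg.
CP_AIR = 1.006  # specific heat of air, kJ/(kg·K)
H_FG = 2450.0   # latent heat of vaporisation of water, kJ/kg (around 20°C)

def water_per_kg_air(suppression_k: float) -> float:
    """kg of evaporated water per kg of air for a given suppression (K)."""
    return CP_AIR * suppression_k / H_FG

for dt in (3, 6, 10):
    print(f"{dt} K of suppression ≈ {water_per_kg_air(dt) * 1000:.1f} g of water per kg of air")
```

And that is the ideal case: in practice, over-spray pushes real consumption above these figures, disproportionately so at aggressive suppression targets.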
Airedale’s stance on cooling systems has always been that intelligent system design can deliver the energy efficiencies data centre operators are striving for, without the need for supplementary water use. Airedale have installed modern closed-loop cooling systems that recirculate water rather than waste it, built on optimised chillers, CRAHs and software, and these systems are delivering fantastic PUEs in projects across the globe.
However, we cannot ignore that the world is changing. Extreme summer weather events are becoming more commonplace and are on the data centre industry’s doorstep, affecting growth epicentres in Europe, the US and beyond. Consideration can be given to chiller design on new projects, but the vast installed base needs attention to keep facilities from falling over during hot weather. Adiabatic cooling, deployed as a peak-lopping method, is in some cases necessary to take the strain. If it is deployed intelligently, as a last resort, then there is no reason why utilities like Thames Water cannot operate in harmony with the data centre industry.