AI Water Usage: AI Water Footprint Sustainability, Management & Future Directions
Increasing AI use is big news. Everywhere you look, someone is offering an AI solution for almost every imaginable aspect of life, work and play. To meet demand, ever-bigger data centres are being constructed to handle the billions of computations every second that fuel our need for computing power.
Of course, AI computation is also competing with your everyday, bog-standard computing needs, like cloud storage, searches and so on. And this growing need requires space: more land for bigger data centres, more servers, more facilities to make the magic happen. It also requires huge amounts of energy to power it all – from AI diagnostics for smart sewers to medical analysis, from space exploration to turning your friend into a cartoon Tolkien character, there seems to be no end to our demands.

AI and machine learning are widely used in the water industry – for example, powering digital twins, flood prediction and prevention, and plant and resource condition monitoring and analysis. And despite the concerns over water and energy consumption, AI has the potential to significantly enhance water conservation efforts.
Water use in AI data centres is also a very hot topic. We know data centres have traditionally been huge water guzzlers, with AI set to increase that demand, but there are widely contrasting views as to just how much water is consumed and whether the volumes mentioned are dangerously high or completely acceptable.
Understanding AI’s water consumption
Let’s take a look in more detail at AI’s water consumption to help us understand more about its water footprint.
How does AI use water?
According to the OECD, AI primarily consumes water in two ways:
- For on-site server cooling: this is a major consumer of water, especially when cooling towers are used to transfer heat away from servers and into the environment, as the method relies on evaporation to remove heat. Some estimates suggest a single tower can work through 19,000 litres every minute. Other on-site cooling methods include on-chip liquid cooling and closed-loop systems, which, although they reuse water a number of times, traditionally require pure water that then needs re-treating once it is expelled from the system.
- For off-site electricity generation: AI data centres consume huge amounts of energy to run, and while some newer centres include green energy sources, many require electricity to power their servers, which also requires large volumes of water. Furthermore, the manufacture of microchips that drive AI requires the use of water on a vast scale.
How much water does AI use?
AI’s water footprint will vary according to where a data centre is located. For example, AI consumes 1.8 to 12 litres of water for each kWh of energy usage across Microsoft’s global data centres, with Ireland and the state of Washington being the most and least water-efficient locations, respectively.
The on-site water consumption may be higher or lower than off-site water consumption, depending on the data centre cooling technique and how electricity is generated in the local grid. For example, if a cooling tower is used for data centre cooling and the local grid primarily uses solar and wind energy, the scope-1 onsite water consumption can dominate.
According to the World Economic Forum, a 1-megawatt (MW) data centre can use up to 25.5 million litres of water annually just for cooling – this is equivalent to the daily water consumption of approximately 300,000 people. This water consumption exacerbates water stress, especially in vulnerable regions already facing shortages.
GPT-3, an AI model, is estimated to consume 500ml of water per 10-50 responses. When factoring in the reported 100 million users having multiple conversations a day, the total water footprint of AI becomes enormous. Later iterations of the model are predicted to require even more water.
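As a rough, back-of-the-envelope illustration of how quickly those per-response figures scale, the short Python sketch below combines the reported 500ml per 10-50 responses with the 100 million-user figure; the number of responses per user per day is an illustrative assumption, not a reported statistic.

```python
# Back-of-the-envelope estimate of daily water use from AI chat responses.
USERS = 100_000_000               # reported user base
RESPONSES_PER_USER_PER_DAY = 20   # illustrative assumption only
LITRES_PER_BATCH = 0.5            # reported 500ml of water...
BATCH_RANGE = (10, 50)            # ...per 10-50 responses

total_responses = USERS * RESPONSES_PER_USER_PER_DAY

# Worst case: 500ml covers only 10 responses; best case: it stretches to 50.
high = total_responses / BATCH_RANGE[0] * LITRES_PER_BATCH
low = total_responses / BATCH_RANGE[1] * LITRES_PER_BATCH

print(f"Estimated daily water use: {low / 1e6:.0f}-{high / 1e6:.0f} million litres")
```

Under these assumptions the estimate lands in the tens of millions of litres per day, which is why per-prompt figures that look trivially small become significant at scale.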
Even a traditionally average-sized data centre (15MW) will use as much water as the yearly consumption of three average-sized hospitals or more than two 18-hole golf courses. But this is small-scale compared to the so-called hyperscalers being built to manage future AI systems, which can reach 150MW and beyond.
Researchers from the University of California, Riverside and the University of Texas at Arlington go further. Using public data sources, they estimate that training GPT-3 in Microsoft’s state-of-the-art US data centres can directly consume 700,000 litres of clean freshwater (the equivalent, they state, could be used to produce 370 BMW cars or 320 Tesla electric vehicles).
According to the OECD report cited above, each AI chip takes approximately 2,200 gallons of Ultra-Pure Water (UPW) to produce. That figure rises further once the resources needed to produce the UPW itself are included.
In the UK, data centres are estimated to use nearly 10bn litres of water every year. This has led to some concerns among water utilities that new data centres will be too water-thirsty and that supply will not be able to keep up with AI’s water consumption demands. Anglian Water has recently objected to a proposed data centre in North Lincolnshire, which it describes as ‘one of the driest parts of the country’.
However, a survey of 73 data centres in England, conducted by TechUK, found that these fears may be overstated. It found that:
- 51 per cent of surveyed sites use waterless cooling systems.
- 64 per cent use less than 10,000 m³ of water per year – less than a typical leisure centre.
- 89 per cent of sites either measure water use or deploy systems that do not require water for cooling.
- Only 4 per cent of sites report using over 100,000 m³ annually.
Clearly, sustainability will be a key factor in the growth of AI data centres of the future.
Key metrics for measuring AI water footprint
The way that water is used can be broken down in terms of withdrawal and consumption:
- Water withdrawal: freshwater taken from the ground or surface water sources, either temporarily or permanently.
- Water consumption: the amount of water evaporated, transpired, incorporated into products or crops, or otherwise removed from the immediate water environment. This is often used as a water footprint value.
The OECD uses the following formula for calculating AI’s water footprint in data centres:
- Water footprint = (Server energy × WUE onsite) + (Server energy × PUE × WUE offsite)
Server energy in this formula can be measured using built-in sensors. Onsite WUE (water usage effectiveness) measures the water efficiency of the cooling systems, while PUE (power usage effectiveness) captures non-IT energy overheads, such as cooling energy and power distribution losses. Offsite WUE measures the water efficiency of electricity generation.
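To make the calculation concrete, here is a minimal Python sketch of the OECD-style formula; the input values in the example are illustrative assumptions drawn from the ranges quoted above, not measurements from any particular facility.

```python
def water_footprint_litres(server_energy_kwh: float,
                           wue_onsite: float,    # litres per kWh, onsite cooling
                           pue: float,           # power usage effectiveness (>= 1)
                           wue_offsite: float) -> float:  # litres per kWh generated
    """Water footprint = onsite cooling water plus the water embedded in the
    electricity generated offsite for the whole facility (server energy x PUE)."""
    onsite = server_energy_kwh * wue_onsite
    offsite = server_energy_kwh * pue * wue_offsite
    return onsite + offsite

# Illustrative example (assumed values, not a real facility):
# 1 MWh of server energy, 0.5 L/kWh onsite, a PUE of 1.2 and 3 L/kWh offsite.
print(water_footprint_litres(1_000, wue_onsite=0.5, pue=1.2, wue_offsite=3.0))  # 4100.0
```

The split between the two terms also shows why the onsite or offsite share can dominate: a cooling-tower site on a largely solar and wind grid pushes the first term up, while an air-cooled site on a thermoelectric grid pushes the second.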
AI & water consumption beyond prompts
Water needed for data centres hosting AI systems
Annually, the global data centre sector consumes more than 560 billion litres of water. According to the International Energy Agency (IEA), projections indicate this figure could rise dramatically, reaching as high as 1,200 billion litres by 2030. The impact of individual hyperscale data centres is even more pronounced, with a 100MW facility capable of consuming around 2.5 billion litres of water annually, equivalent to the needs of approximately 80,000 people.
A report by the UK government states that leading technology and hyperscale data centre providers, at the forefront of AI development, have reported substantial year-on-year increases in their data centre water consumption. Microsoft's global water use, primarily for its cloud data centres, increased by 34 per cent in 2022, reaching 6.4 million cubic metres. Google's data centres consumed 19.5 million cubic metres of water in 2022, marking a 20 per cent increase. This highlights the escalating demand for water driven by AI workloads.
Impact of AI advancement on water withdrawal rates
While there have been changes in terms of the type of water used in data centres, most still use potable water. In areas suffering from water stress, this creates a rivalry for resources between data centres and human consumption.
This competition is likely to become a pressure point while data centres remain huge consumers of water, with global AI demand projected to account for somewhere in the region of 4.2 to 6.6 billion cubic metres of water withdrawal by 2027 – more than Denmark withdraws in a typical year.
In the US, the Texas Water Development Board expects that data centres in the state will consume 49bn gallons of water in 2025, and estimates that this will rise to nearly 400bn by 2030. By that date, the state’s data centres will be using seven per cent of all water in the state.
Competition for resources has already led to one high-profile data centre project being cancelled in the US. Project Blue, which has been linked to Amazon Web Services, was set to cover 290 acres south of Tucson. However, the plan was opposed by city leaders and residents on environmental grounds, including the projected use of 622 million gallons every year, with the first two years using only potable water.
Despite a plan to use reclaimed water after the initial two-year period, and the promise of jobs and billions in additional revenue for the city, residents felt that building a thirsty data centre in a desert environment did not make sense. Local resident Danny Garcia, speaking at a meeting of residents and city leaders, epitomised the opposition: “Our summer temperatures are hitting record highs, and our water levels are hitting record lows. We no longer have the monsoon rains that had the Santa Cruz River run from bank to bank.”
The World Resources Institute estimates that by 2030, AI infrastructure will consume 1.1 trillion to 1.7 trillion gallons of freshwater annually, roughly equivalent to the yearly water use of all households in California.
AI water consumption: sustainable cooling
More sustainable methods do exist that use less or no water, but their use depends on local contexts. One such method is the closed-loop system; another is direct-to-chip cooling technology.
Microsoft has set itself the goal of building data centres that lose no water. The company’s sustainability goals for data centres are:
“Although our current fleet will still use a mix of air-cooled and water-cooled systems, new projects in Phoenix, Arizona, and Mt. Pleasant, Wisconsin, will pilot zero-water evaporated designs in 2026. Starting August 2024, all new Microsoft data centre designs began using this next-generation cooling technology, as we work to make zero-water evaporation the primary cooling method across our owned portfolio. These new sites will begin coming online in late 2027.”
Microsoft is using closed-loop systems that repeatedly use the same water to transfer heat away from the heat-generating chips. Rack and server designs are being developed to accommodate new methods of thermal management, as well as power management. These methods include:
- Cold plates: According to Microsoft, this is a direct-to-chip cooling technology that provides heat exchange in a closed-loop system. They dissipate heat more effectively than traditional air cooling, directly chilling the silicon and then recirculating the cooling fluid, like a car radiator. This solution significantly improves cooling efficiency and enables more precise temperature control compared to traditional methods.
- Sidekicks: This is a liquid cooling system that can be used in existing data centres. It draws heat away from cold plates attached to the surface of the chips.
- Microfluidics: The company is taking cold plate technology further by integrating tiny fluid channels into chip design. By doing so, the coolant is brought right next to the processors.
Wastewater treatment innovations for AI operations
Elon Musk’s huge xAI gigafactory in Memphis has not been short of controversy, including air pollution caused by unpermitted gas turbines used to power it. However, fears that the factory would drain the local aquifer to satisfy its water needs have so far proved unfounded. Instead, Musk is building a €69m wastewater plant, which will treat wastewater and repurpose it for xAI’s uses. Cooling systems typically need ultra-pure water, which has the potential to drive advances in wastewater treatment technologies, such as ceramic membranes.
Recycling and using treated wastewater are both becoming more common in data centres. Bridge Data Centres, for example, has announced that its Malaysian facilities will use treated effluent as reclaimed water in their closed-loop cooling systems.
Amazon Web Services announced in 2024 that it was expanding its use of recycled water to 100 data centres in the US. This, the company stated, would preserve 530m gallons of drinking water in the communities in which it operates. The choice of technology is driven by the local environment and available resources. By using direct evaporative cooling – where warm outside air is pushed through water-soaked cooling pads and the evaporating water cools the air before it is sent to the server rooms – the company states it has reduced its annual water use in data centres by 85 per cent.
Where conditions allow, such as in Ireland and Sweden, this method is combined with free-air cooling, with evaporation only used when needed. In some locations, this has meant no water being used for cooling for 95 per cent of the year.
However, the Black & Veatch 2025 Water Report revealed that water companies in the USA are under-prepared for the demands that the AI data centre sector is likely to place on them. The report states that:
- More than half (54 per cent) of respondents said “no” when asked whether their organisation had factored the water needs of data centres and technology manufacturers into its short- and long-term resource planning.
Water conservation strategies for AI companies
Many of the big tech companies are involved in data centres, and each has sustainability frameworks, goals and targets that involve water conservation. Amazon, Google, Meta and Microsoft all monitor water use and work on improving efficiencies within data centres. They also work on water conservation in the immediate surroundings, for example through rainwater harvesting and recycling, and undertake water stewardship projects in the water basins in which they operate.
Within data centres, the use of closed-loop systems removes the need for cooling towers, which lose water through evaporation. By keeping the same water in the system for upwards of 15 years, the need for constant consumption of fresh water is removed.
Environmental impact assessment of AI water usage
Relationship between AI water usage and carbon footprint
In 2021, 0.5 per cent of all greenhouse gas emissions were attributed to data centres; however, this figure is estimated to increase tenfold by 2040.
Amazon’s carbon emissions rose for the first time in three years in 2024, driven in part by data centre construction.
Effects on local water resources and ecosystems
Water extraction from groundwater sources has the potential to cause environmental harm, such as aquifer depletion. Research conducted by SourceMaterial and The Guardian newspaper revealed that 38 active data centres owned by three firms are operating in parts of the world facing water scarcity, with another 24 under development.
Environmental threats also come from water discharged from cooling towers, which can contain high mineral levels and pH levels that might be dangerous to wildlife.
Climate change implications of AI water consumption
Increased water consumption and extraction have the potential to add to the climate crisis. Opponents of Amazon’s data centres in the Aragon region of Spain, including campaign group Tu Nube Seca Mi Río – Spanish for “Your cloud is drying my river” – have called for a moratorium on new data centres due to water scarcity.
The centres are licensed to use 755,720 m³ of water a year, enough to irrigate 233 hectares (576 acres) of corn, one of the region’s main crops. Amazon has recently applied to increase consumption by 48 per cent, citing rising temperatures as the reason it needs more water.
However, The Guardian notes that 75 per cent of the country is at risk of desertification, with Lorena Jaume-Palasí, founder of The Ethical Tech Society, claiming the centres were “bringing Spain to the verge of ecological collapse”.
Added to the risk of increased water scarcity, the report suggests that the data centres are predicted to use more electricity than the entire region currently consumes.
Freshwater supply challenges related to AI expansions
The SourceMaterial report highlighted that Amazon, Microsoft and Google, between them, plan to increase the number of data centres they own by almost 80 per cent as demand grows for cloud and AI services. Some of this expansion will come in areas that are facing water stress, including Arizona. Once again, this expansion sets up a competition for resources should freshwater be required for data centre cooling.
A drought in Arizona has not stopped Meta from opening an €859m data centre in the state. The region is becoming a data centre hub, and while state officials revoked a permit to construct new homes in Maricopa County, citing a lack of groundwater, Microsoft operates two data centres in the same area. Google’s data centre has a permit to use 5.5 million m³ of water a year, roughly the equivalent of the water used by 23,000 residents.
AI water footprint policy and governance
Regulatory frameworks addressing AI water usage
In the US, there are no federal regulations for AI and no legal framework requiring tech companies to disclose their energy and water consumption. Individual states are left to decide whether to implement regulations. In New York, legislation requires data centre operators to submit annual reports on their water and energy use, as well as their sustainability efforts. It also imposes limitations on the construction of new sites, with operators mandated to report projected water and energy use.
The European Commission wants data centres to be highly energy-efficient and sustainable by no later than 2030. The commission is preparing a report for the European Parliament assessing the feasibility of a transition towards a net-zero emissions data centre sector. This will be based on data reported by data centre operators under the Common Union Rating Scheme for Data Centres Regulation.
However, some believe net zero will be impossible to achieve with more advanced AI models. Benjamin Lee, a professor of electrical and systems engineering at the University of Pennsylvania, told NPR: “I think before generative A.I. came along in late 2022, there was hope among these data centre operators that they could go to net zero. I don't see how you can, under current infrastructure investment plans, you could possibly achieve those net zero goals.”
Future directions of AI and water consumption
Taking AI data centres to the oceans
A report in Scientific American states that China is piloting a project to place data centres on the seabed. To avoid using water that might otherwise be used for human consumption and agriculture, the country is constructing a wind-powered underwater data centre located six miles off the coast of Shanghai, which the report notes is one of China’s AI hubs.
Inspiration seems to have come from a mothballed Microsoft project located off the coast of Scotland. The project began in 2015 and had already ended by the time the Data Centre Dynamics website profiled it in 2021.
Project Natick involved 855 servers being submerged in a sealed data centre filled with inert nitrogen gas and left unattended for just over two years. At the same time, a comparison deployment on land ran 135 servers in normal data centre conditions on Microsoft’s Azure cloud software. The results made interesting reading: when the submerged servers were brought to the surface, only six of the 855 had failed, compared to eight of the 135 land-based servers.
This success was attributed to steady external temperatures, and Microsoft has said that although it has no immediate plans to submerge future data centres, it is feeding data and findings from the project into potential future applications.
Submerging data centres is currently being explored elsewhere; for example, South Korea is working on a project that will result in an eco-friendly technology for constructing an underwater data centre complex that could house up to 100,000 servers on the seabed. The first test module is scheduled to be deployed 30 meters underwater off the coast of Sinri Port in Seosaeng-myeon, Ulju-gun, Ulsan, and is unique among current projects in that it will include living quarters for three researchers and engineers.
Floating data centres
Yokohama in Japan is looking to green-energy-powered floating data centres to meet its data centre needs. A collaboration between several companies and the city’s administrators will develop a 25m-long by 80m-wide floating data centre located off Osanbashi Pier. Future plans include locating such floating data centres near offshore wind farms to meet energy needs. While no mention is made of using seawater to cool the data centres, or of how such a ‘green’ project might reduce water use in AI data centres, it does hold the potential for moving them away from water-stressed land areas and for making use of seawater to cool servers with some element of submersion.
The idea of floating data centres is not new, but it is still in its infancy compared to the huge land-based constructions being built. American company Nautilus began a floating data centre pilot in 2015, using water taken from directly beneath the barge to cool the server racks via heat exchange before returning it to the ocean.
Balancing AI advancement with water conservation
Many new data centre technologies are using less water than in previous generations, with some using no water at all. However, increasing AI demand will increase the need for innovative cooling strategies and wider water conservation measures. These might include:
- The use of closed-loop cooling systems to significantly reduce usage
- Recycling and reuse technologies, including greywater treatment
- Real-time water monitoring to detect losses and optimise flow (see the sketch after this list)
- Local water resilience strategies, especially in drought-prone areas
- Considering water availability within the catchment when determining the most appropriate location for data centres
- Sharing tools, technologies and best practices to drive industry-wide improvements in water efficiency and sustainability.
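As a minimal illustration of the real-time monitoring point above, the Python sketch below computes WUE from metered readings and flags intervals that drift above an assumed baseline; the reading structure, baseline value and tolerance are all illustrative assumptions rather than any operator's actual scheme.

```python
from dataclasses import dataclass

@dataclass
class Reading:
    """One metering interval: cooling water consumed and IT energy used."""
    water_litres: float
    it_energy_kwh: float

def wue(reading: Reading) -> float:
    """Water usage effectiveness: litres of cooling water per kWh of IT energy."""
    return reading.water_litres / reading.it_energy_kwh

def is_anomalous(current: Reading, baseline_wue: float, tolerance: float = 0.2) -> bool:
    """Flag intervals whose WUE drifts more than `tolerance` (20% by default)
    above the baseline - a possible sign of leaks or inefficient cooling."""
    return wue(current) > baseline_wue * (1 + tolerance)

# Example: against an assumed baseline of 0.4 L/kWh, an interval running at
# 0.55 L/kWh (550 litres for 1,000 kWh of IT load) is flagged for review.
print(is_anomalous(Reading(water_litres=550, it_energy_kwh=1_000), baseline_wue=0.4))  # True
```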
Conclusion
Demand for faster, more powerful AI applications is driving widespread construction of ever-bigger data centres. Water consumption in data centres is problematic for many reasons, such as using potable water, at times in direct competition with the needs of people who live near the facilities. However, technological improvements are reducing water consumption in modern data centres, with new water sources and more targeted cooling techniques helping to reduce demand. At the same time, AI models have huge potential in solving many of the world’s water problems, including reducing water consumption in AI data centres.
AI use, and advances in the technology enabling it, look set to increase exponentially in the coming years, but AI’s demand on stressed water resources may not necessarily increase at the same rate; it may even create the means to significantly reduce consumption and the impact data centres have on the local environments in which they operate. Increased use of reclaimed water will also present challenges and opportunities for water tech innovators and utilities to meet the demand for local or on-site resources for cooling energy-intensive servers and chips.
