An AI data center can consume up to 5 million gallons of water per day, as much as a town of 50,000 people (EESI). As facilities scale to meet surging demand for GPU (graphics processing unit) compute, water usage has become one of the most pressing yet overlooked environmental costs of AI.
This week’s newsletter breaks down how AI data centers use water, why it matters, and what’s being done to stop the leak.
Aren’t they just computers? Why do data centers need water in the first place?
GPUs running at full load generate enormous amounts of heat. To keep them from failing, data centers must continuously cool down their facilities and equipment. The cheapest, most common solution is water.
Most facilities rely on evaporative cooling. Water is pumped through chillers and cooling towers, where it absorbs heat from the equipment; most of that water evaporates, and the remainder is discharged back into the local water system at a higher temperature. Only about 20% of the water used in the process is recovered, while the other 80% evaporates into the air, putting real strain on local water supplies.
(Diagram of the evaporative cooling process. Source: EIDA)
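To make that split concrete, here is a minimal back-of-the-envelope sketch in Python. It simply applies the 80/20 split above to the 5-million-gallon headline withdrawal; the numbers are illustrative, not a model of any real facility.

```python
# Rough water balance for an evaporative cooling loop, applying the
# newsletter's figures: ~80% of withdrawn water evaporates, ~20% returns.
WITHDRAWAL_GAL_PER_DAY = 5_000_000   # headline large-facility withdrawal
EVAPORATED_FRACTION = 0.80           # share lost to the air

evaporated = WITHDRAWAL_GAL_PER_DAY * EVAPORATED_FRACTION
returned = WITHDRAWAL_GAL_PER_DAY - evaporated

print(f"Evaporated (consumed): {evaporated:,.0f} gal/day")  # 4,000,000
print(f"Returned (warmer):     {returned:,.0f} gal/day")    # 1,000,000
```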
How much water are we talking?
The numbers are staggering:
The average 100‑megawatt data center can use up to 500,000 gallons of water per day, enough for approximately 6,500 households (SeattlePi); see the sanity-check sketch after this list
One Google facility in Iowa consumed 1 billion gallons in 2024, making it the highest-use site in Google’s network (SeattlePi)
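One way to sanity-check the 100 MW figure is the industry’s water usage effectiveness (WUE) metric, liters of cooling water per kWh of energy. The sketch below is a rough estimate under an assumed WUE of 0.8 L/kWh (an illustrative value, not one reported by the sources); it lands close to the 500,000-gallon claim.

```python
# Back-of-the-envelope check of the 100 MW / 500,000 gal/day claim
# using Water Usage Effectiveness (WUE), in liters per kWh.
FACILITY_POWER_MW = 100
ASSUMED_WUE_L_PER_KWH = 0.8       # assumption for illustration only
LITERS_PER_GALLON = 3.785

kwh_per_day = FACILITY_POWER_MW * 1_000 * 24            # 2,400,000 kWh
gallons_per_day = kwh_per_day * ASSUMED_WUE_L_PER_KWH / LITERS_PER_GALLON

print(f"~{gallons_per_day:,.0f} gal/day")               # ~507,000
print(f"~{gallons_per_day / 6_500:.0f} gal/day per household implied")
```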
Unlike power, which can be generated and transmitted across regions, water is fundamentally local. That makes siting decisions far more complex. In Chile, Arizona, and the Netherlands, new AI data centers have already sparked public backlash over water allocation during drought years, and some projects have been scrapped or banned outright over water availability concerns (Fox).
Why it matters
Water has become the second most critical input for AI infrastructure, after electricity. And unlike grid power, water rights are often tied up in legal frameworks designed for agriculture and municipalities.
Cities are facing tough choices on whether water should support farms, residents, or data centers.
Regulators are adding water usage reviews to environmental permitting, delaying projects.
Developers now need to consider water availability as much as they already consider land and grid access.
The issue will only get bigger: global data center water consumption is projected to reach more than a trillion gallons per year by 2030 if current growth continues (EESI).
What’s being done?
The industry is testing alternatives, but adoption is slow (a rough savings comparison follows this list):
Closed-loop cooling reduces evaporative losses significantly by recycling water (DOE).
Immersion cooling submerges chips in dielectric fluid, cutting water use by over 90% (Intel).
Air-based or water-free cooling is being tested but carries higher upfront costs (Meta).
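To see why these designs matter at scale, here is a comparison sketch for a hypothetical 100 MW facility. All WUE values below are assumptions chosen to echo the rough magnitudes cited above (closed-loop "significantly" lower, immersion ">90%" lower), not vendor-reported figures.

```python
# Illustrative annual water draw for a hypothetical 100 MW facility
# under different cooling designs. All WUE values (L/kWh) are assumed.
LITERS_PER_GALLON = 3.785
KWH_PER_YEAR = 100 * 1_000 * 24 * 365   # assumed flat 100 MW load

assumed_wue = {
    "evaporative (baseline)": 0.8,
    "closed-loop":            0.2,   # recycles water; far less makeup
    "immersion":              0.05,  # >90% cut vs. baseline
}

baseline = assumed_wue["evaporative (baseline)"]
for design, wue in assumed_wue.items():
    gal_per_year = KWH_PER_YEAR * wue / LITERS_PER_GALLON
    saving = (1 - wue / baseline) * 100
    print(f"{design:>24}: {gal_per_year:>13,.0f} gal/yr ({saving:.0f}% less)")
```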
The challenge is scaling these solutions fast enough to keep pace with data center construction. Most facilities today still depend on traditional evaporative cooling, and many are still being built in regions with high water stress because of cheap land and tax incentives. Rapid adoption of water-saving cooling will be crucial as capacity builds out.
What does this mean for you?
Cities and utilities: Expect rising conflict over water allocation as data centers compete with agriculture and residents.
Regulators and policymakers: Treat water as critical infrastructure in permitting, just like grid power.
Developers and EPCs: Cooling design will become a differentiator; expect more scrutiny from communities and regulators.
Manufacturers: Huge market opportunity for water-efficient cooling systems and alternative technologies.
Sources
Data centers consume massive amounts of water – companies rarely tell the public exactly how much
How liquid cooling is transforming data center construction | EIDA
Tucson votes down $250M Amazon data center citing water and energy use | Fox Business
Cooling Water Efficiency Opportunities for Federal Data Centers | Department of Energy
Intel and Shell Advance Immersion Cooling in Xeon-Based Data Centers | Intel Newsroom
Simulator-based reinforcement learning for data center cooling optimization - Engineering at Meta