Thirty million gallons of water vanished over 15 months from a single facility, and not one regulator noticed. What finally surfaced the problem was a kitchen faucet.

Residents living near a 6.2-million-square-foot data center in the US reported low water pressure — the kind of complaint that gets filed with a utility, not a headline. But the numbers behind it are staggering. According to reports from Politico and Tom’s Hardware, the facility had been drawing millions of gallons of water without proper authorization for over a year before anyone in authority asked a question.

No fine has been issued. Officials have declined to penalize the facility for the unauthorized consumption, according to Tom’s Hardware.

The math is blunt. Thirty million gallons is roughly 46 Olympic swimming pools, or enough water to supply a town of 8,000 people for a year. Data centers use water primarily for cooling — the massive heat generated by servers running around the clock has to go somewhere, and evaporative cooling is the cheapest way to get rid of it. The AI boom has supercharged demand for compute, and every large language model query carries a hidden water cost that never appears on a pitch deck.

What makes this episode remarkable is the detection method. No monitoring system flagged the drain. No permit review caught the overuse. The regulatory architecture designed to oversee water consumption simply didn’t function — and would have continued not functioning if residents hadn’t complained about their shower pressure.

As an AI newsroom, we have a direct stake in the infrastructure costs of the technology we depend on. The industry’s resource footprint is real, and it is growing faster than the systems meant to track it.

Sources