Generative AI programmes are consuming considerable amounts of fresh water, according to a recent study. Researchers at the University of California, Riverside found that roughly 17 ounces (half a litre) of water is used for every 20 to 50 prompts.
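To put that in per-prompt terms, here is a rough back-of-the-envelope sketch based only on the half-litre-per-20-to-50-prompts figure above (an illustration, not a calculation from the study itself):

```python
# Rough per-prompt estimate based on the figures quoted above:
# roughly 0.5 litres of water per 20 to 50 prompts (illustrative only).
WATER_PER_BATCH_LITRES = 0.5
PROMPTS_PER_BATCH_LOW, PROMPTS_PER_BATCH_HIGH = 20, 50

# More prompts per half-litre means less water per individual prompt.
per_prompt_max_ml = WATER_PER_BATCH_LITRES / PROMPTS_PER_BATCH_LOW * 1000   # ~25 ml
per_prompt_min_ml = WATER_PER_BATCH_LITRES / PROMPTS_PER_BATCH_HIGH * 1000  # ~10 ml

print(f"Estimated water per prompt: {per_prompt_min_ml:.0f}-{per_prompt_max_ml:.0f} ml")
```

In other words, each individual prompt costs somewhere in the region of 10 to 25 millilitres of water on these estimates, which only becomes alarming at the scale of billions of queries.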
Data centres consume huge amounts of water in two ways: indirectly, through the electricity they draw from steam-generating power plants, and directly, through the on-site chillers that keep their servers cool.
Water pumped into data centres often evaporates while cooling the facilities, or ends up as ‘blowdown’: thick, briny wastewater that must be treated before disposal, adding further to the cost of cooling.
This is the first time that researchers have assessed the potential water footprint of the technology. They estimate that in 2021 Google used around 12.7 billion litres of fresh water to keep its US servers cool.
This is concerning at a time when water resources are becoming increasingly scarce and valuable, particularly in parts of the world prone to drought and desertification. The scientists say it is important to address AI's water use now, because it is a fast-growing segment of computing demand.
Satellite images shared by NASA show how vulnerable the south-western USA is to drought; pictures of reservoirs shrinking almost before our eyes underline the seriousness of the problem.
One of the authors of the research, Shaolei Ren, an associate professor of electrical and computer engineering, says that big tech should take responsibility and lead by example in reducing its water use.
At the World Economic Forum's annual meeting in Davos, Switzerland, last month, Sam Altman, the CEO of OpenAI, warned that the next wave of generative AI systems will consume vastly more power than expected and that energy systems will struggle to cope.
Usage spikes
And people are beginning to notice at the local level, too. In West Des Moines, Iowa, a giant data centre cluster serves OpenAI's GPT-4. Residents filed a lawsuit after discovering that the data centre had used 6% of the district's water in July 2022, while the model was being trained. The other inconvenient fact is that this happened in the middle of a three-year drought in the area.
Microsoft and Google reported similarly massive surges in water use (increases of 34% and 20% respectively in a single year), according to their own environmental reports. Researchers estimate that by 2027, global AI demand for water could amount to as much as half of the UK's total annual water consumption. If this is true, we are heading down a very slippery slope.
Cooler hours
However, the researchers do have some ideas for solving these issues. One solution suggested in the paper is to schedule much of AI's processing work during cooler hours, such as at night, when less water is needed for cooling.
“AI training is like a big, very big lawn and needs lots of water for cooling,” Ren says. “We don’t want to water our lawns during the noon, so let’s not water our AI (at) noon either.”
It’s a nice idea. However, I doubt that in our ‘faster and now’ society people will be willing to wait for their AI-generated images or ChatGPT texts. But if we aren’t careful and keep using up water for AI, we may end up having to turn to insects as a viable food source.
[via ucriverside news]