AI’s craving for data is matched only by a runaway thirst for water and energy

One of the most pernicious myths about digital technology is that it is somehow weightless or immaterial. Remember all that early talk about the “paperless” office and “frictionless” transactions? And of course our personal electronic devices do use some electricity, but the amount is trivial compared with a washing machine or a dishwasher.

Belief in this comforting story, however, might not survive an encounter with Kate Crawford’s seminal book, Atlas of AI, or the striking Anatomy of an AI System graphic she composed with Vladan Joler. And it certainly wouldn’t survive a visit to a datacentre – one of those enormous metallic sheds housing tens or even hundreds of thousands of servers humming away, consuming massive amounts of electricity and needing lots of water for their cooling systems.

On the energy front, consider Ireland, a small country with an awful lot of datacentres. Its Central Statistics Office reports that in 2022 those sheds consumed 18% of the country’s electricity – more than all of Ireland’s rural dwellings combined, and as much as all of its urban ones. As far as water consumption is concerned, a study by Imperial College London in 2021 estimated that one medium-sized datacentre used as much water as three average-sized hospitals. Which is a useful reminder that while these industrial sheds are the material embodiment of the metaphor of “cloud computing”, there is nothing misty or fleecy about them. And if you were ever tempted to see one for yourself, forget it: it’d be easier to get into Fort Knox.


There are now between 9,000 and 11,000 of these datacentres in the world. Many of them are beginning to look a bit dated, because they’re old-style server farms with racks of cheap PCs storing all the data – photographs, documents, videos, audio recordings and so on – that a smartphone-enabled world generates in such casual abundance.

But that’s about to change, because the industrial feeding frenzy around AI (aka machine learning) means that the materiality of the computing “cloud” is going to become harder to ignore. How come? Well, machine learning requires a different kind of processor – the graphics processing unit (GPU) – which is considerably more complex (and expensive) than a conventional processor. More importantly, GPUs also run hotter and need significantly more energy.

On the cooling front, Kate Crawford notes in an article published in Nature last week that a giant datacentre cluster serving OpenAI’s most advanced model, GPT-4, is based in the state of Iowa. “A lawsuit by local residents,” writes Crawford, “revealed that in July 2022, the month before OpenAI finished training the model, the cluster used about 6% of the district’s water. As Google and Microsoft prepared their Bard and Bing large language models, both had major spikes in water use – increases of 20% and 34%, respectively, in one year, according to the companies’ environmental reports.”

Within the tech industry, it has been widely known that AI faces an energy crisis, but it was only at the World Economic Forum in Davos in January that one of its leaders finally came clean about it. OpenAI’s boss Sam Altman warned that the next wave of generative AI systems will consume vastly more power than expected, and that energy systems will struggle to cope. “There’s no way to get there without a breakthrough,” he said.

What kind of “breakthrough”? Why, nuclear fusion, of course. In which, coincidentally, Mr Altman has a stake, having invested in Helion Energy way back in 2021. Smart lad, that Altman; never misses a trick.

As far as cooling is concerned, it looks as though runaway AI also faces a challenge. At any rate, a paper recently published on the arXiv preprint server by scientists at the University of California, Riverside, estimates that “operational water withdrawal” – water taken from surface or groundwater sources – of global AI “may reach [between] 4.2 [and] 6.6bn cubic meters in 2027, which is more than the total annual water withdrawal of … half of the United Kingdom”.

Given all that, you can see why the AI industry is, er, reluctant to come clean about its probable energy and cooling requirements. After all, there’s a bubble on, and awkward facts can cause punctures. So it’s nice to be able to report that the industry may soon be obliged to open up. Over in the US, a group of senators and representatives has introduced a bill to require the federal government to assess AI’s current environmental footprint and develop a standardised system for reporting future impacts. And in Europe, the EU’s AI Act is about to become law. Among other things, it requires “high-risk AI systems” (which include the powerful “foundation models” that underpin ChatGPT and similar AIs) to report their energy consumption, resource use and other impacts throughout their lifespan.

It would be nice if this induced some investors to do proper due diligence before jumping on the AI bandwagon.

What I’ve been reading

Ragged philanthropist
Read Deborah Doane’s sharp review for Alliance magazine of Tim Schwab’s critical book, The Bill Gates Problem.

Last writes
Veteran commentator Jeff Jarvis ponders giving up “on old journalism and its legacy industry” in a blog post at BuzzMachine.

Slim pickings
On his blog No Mercy/No Malice, Scott Galloway suggests AI and weight-loss drugs have much in common.
