The artificial intelligence (AI) industry could consume as much energy as a country the size of the Netherlands by 2027, a new study warns.

Big tech firms have scrambled to add AI-powered services since ChatGPT burst onto the scene last year.

These AI services use far more power than conventional applications, making going online much more energy-intensive.

However, the study also said AI's environmental impact could be less than feared if its current growth slowed.

Many experts, including the report author, say such research is speculative as tech firms do not disclose enough data for an accurate prediction to be made.

There is no question, though, that AI requires more powerful hardware than traditional computing tasks.

The study, by Alex De Vries, PhD candidate at the VU Amsterdam School of Business and Economics, is based on some parameters remaining unchanged – such as the rate at which AI is growing, the availability of AI chips, and servers continuing to work at full pelt all the time.

Mr De Vries noted that the chip designer Nvidia is estimated to supply about 95% of the AI processing kit required by the sector.

By looking at the number of these computers it is expected to deliver by 2027, he was able to estimate a range for AI's energy consumption of 85-134 terawatt-hours (TWh) of electricity each year.

At the top end, that is roughly the amount of electricity used annually by a small country.

"You would be talking about the size of a country like the Netherlands in terms of electricity consumption. You're talking about half a per cent of our total global electricity consumption," he told BBC News.
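The shape of the estimate can be sketched as a back-of-envelope calculation: number of AI servers, multiplied by their power draw, multiplied by hours of continuous operation. The server count and per-server power figures below are illustrative assumptions chosen to land in the study's published range, not numbers quoted from the paper itself.

```python
# Back-of-envelope sketch of a 2027 AI electricity estimate.
# AI_SERVERS and the kW figures are illustrative assumptions.
AI_SERVERS = 1.5e6                         # assumed AI servers shipped by 2027
POWER_KW_LOW, POWER_KW_HIGH = 6.5, 10.2    # assumed kW drawn per server
HOURS_PER_YEAR = 8760                      # servers assumed to run flat out

def annual_twh(servers: float, power_kw: float) -> float:
    """Annual electricity use in terawatt-hours (1 TWh = 1e9 kWh)."""
    return servers * power_kw * HOURS_PER_YEAR / 1e9

low = annual_twh(AI_SERVERS, POWER_KW_LOW)    # ~85 TWh/year
high = annual_twh(AI_SERVERS, POWER_KW_HIGH)  # ~134 TWh/year
print(f"{low:.0f}-{high:.0f} TWh/year")
# Roughly half a per cent of ~27,000 TWh of global electricity use:
print(f"~{high / 27_000:.1%} of global electricity consumption")
```

Note how sensitive the result is to the assumptions: the study's own caveat is that growth rate, chip availability and utilisation could all change.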

Nvidia declined to comment.

Mr De Vries said his findings showed that AI should be used only where it is really needed.

His peer-reviewed study has been published in the journal Joule.

How much energy – and water – does AI use?

AI systems such as the large language models that power popular chatbots, like OpenAI's ChatGPT and Google's Bard, require warehouses full of specialist computers – called data centres – to work.

That equipment is more power-hungry than traditional kit and, like it, also needs to be kept cool, using water-intensive systems.


The research did not include the energy required for cooling. Many of the big tech firms do not quantify this specific energy consumption or water use, and Mr De Vries is among those calling for the sector to be more transparent about it.

But there is no doubt demand for the computers that power AI is mushrooming – and with it the amount of energy needed to keep those servers cool.

Danny Quinn, boss of the Scottish data centre firm DataVita, said his company has gone from receiving "one or two enquiries a week" at the start of 2023 about using his facility to house AI kit, to receiving hundreds.

He also described the difference in energy use between a rack containing standard servers, and one containing AI processors.

"A standard rack full of normal kit is about 4kW of power, which is equivalent to a family house. Whereas an AI kit rack would be about 20 times that, so about 80kW of power. And you could have hundreds, if not thousands, of these within a single data centre."
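Taken at face value, and assuming each rack draws that power continuously, the gap compounds over a year. This is a rough illustration of the comparison, not DataVita's own figures:

```python
# Annual energy for a rack drawing constant power, in MWh.
# The kW figures come from the quote above; continuous draw is assumed.
HOURS_PER_YEAR = 8760

standard_rack_kw = 4    # rack of conventional servers
ai_rack_kw = 80         # roughly 20 times that, for AI processors

standard_mwh = standard_rack_kw * HOURS_PER_YEAR / 1000   # ~35 MWh/year
ai_mwh = ai_rack_kw * HOURS_PER_YEAR / 1000               # ~701 MWh/year
print(f"Standard rack: ~{standard_mwh:.0f} MWh/yr; AI rack: ~{ai_mwh:.0f} MWh/yr")
```

A single AI rack on these assumptions uses as much electricity in a year as roughly 20 standard racks, which is why enquiries in the hundreds add up quickly.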

[Image: Inside DataVita's Fortis data centre in Scotland's central belt. Source: DataVita]

He added that Scotland's colder, wetter climate gives its data centres a natural advantage in keeping equipment cool, but it remains a huge task.

In its latest sustainability report, Microsoft, which is investing heavily in AI development, revealed that its water consumption had jumped by 34% between 2021 and 2022, to 6.4 million cubic metres, equivalent to about 2,500 Olympic swimming pools.
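The Olympic-pool comparison is easy to sanity-check, assuming the nominal pool volume of 2,500 cubic metres (50m x 25m x 2m deep):

```python
# Sanity-check Microsoft's reported water use against Olympic pools.
water_m3 = 6.4e6   # Microsoft's reported 2022 water consumption, cubic metres
pool_m3 = 2_500    # nominal Olympic pool: 50m x 25m x 2m deep (an assumption)

pools = water_m3 / pool_m3
print(f"~{pools:.0f} Olympic pools")
```

The exact figure depends on pool depth, but the order of magnitude, a few thousand pools, holds.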

Prof Kate Crawford, who wrote a book about AI and its impact on the environment, said the issue kept her awake at night.

Speaking to the BBC in July, she said: "These energy-intensive systems take enormous amounts of electricity and energy, but also enormous amounts of water to cool these gigantic AI supercomputers. So we are really looking at an enormous extractive industry for the 21st Century."


But there are also hopes that AI could help solve some of the environmental challenges facing the planet.

[Image: Aeroplane contrails glowing in a sunset over rooftops. AI tools are being used to try to reduce the number of vapour trails left across our skies by aeroplanes. Source: Getty Images]

Google and American Airlines recently found pilots could halve the amount of contrails (vapour trails) created by aircraft by using an experimental AI tool to select altitude. Contrails are known to contribute to global warming.

And the US government is among those spending millions of dollars on trying to recreate nuclear fusion – the way the Sun gets its energy.

Success here would be a real game changer, in the form of a limitless, green power supply. AI could speed up the research, which has been going on since the 1960s with very slow progress.

In February this year, university academic Brian Spears said he had used AI to predict an outcome in an experiment which resulted in a breakthrough.

"For 100 trillionths of a second, we produced ten petawatts of power. It was the brightest thing in the solar system," he wrote.
