(Image Credit: geralt/pixabay)
I thought crypto mining was bad…
Now that AI is advancing in fields as varied as coding and driving, many people have expressed concern about how much power it could end up consuming. In a recent peer-reviewed analysis, Alex de Vries explains how AI applications, if widely adopted, could consume more electricity than entire countries. The analysis estimates that by 2027, AI servers could use 85 to 134 TWh per year, roughly what a country like Sweden, the Netherlands, or Argentina uses annually.
We’ve seen how generative AI tools like ChatGPT exploded in popularity over the past year. Training these programs on enormous amounts of data makes them power-hungry from the start. For example, Hugging Face’s text-generating model BLOOM consumed 433 MWh during training. That’s enough to power about 40 average homes in the US for one year. Once deployed, the tools also demand significant computing power for every request, which consumes energy as well. De Vries estimates that ChatGPT could need 564 MWh per day just to keep answering queries.
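If you want to sanity-check those comparisons, here’s a quick back-of-the-envelope calculation in Python. The roughly 10,800 kWh per year for an average US household is my assumption (a commonly cited average), not a figure from de Vries’ paper:

```python
# Back-of-the-envelope check on the training and inference figures above.
# The ~10,800 kWh/year for an average US home is an assumption, not a
# number taken from de Vries' analysis.

bloom_training_mwh = 433           # reported training consumption
avg_us_home_kwh_per_year = 10_800  # assumed average household usage

homes_powered_for_a_year = bloom_training_mwh * 1_000 / avg_us_home_kwh_per_year
print(f"~{homes_powered_for_a_year:.0f} homes for a year")  # ~40 homes

chatgpt_mwh_per_day = 564
print(f"~{chatgpt_mwh_per_day * 365 / 1_000:.0f} GWh per year")  # ~206 GWh annually
```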
There’s now a worldwide push to make AI tools more efficient and bring their energy costs down. But the Jevons paradox, which holds that making a technology more efficient tends to increase total demand for it rather than reduce it, may get in the way. Google is a prime example, as it incorporates generative AI tools into its email service and its search engine.
Consider that 9 billion Google searches take place every day. According to de Vries’ estimates, running AI behind every one of those searches would require around 29.2 TWh of electricity per year. That’s roughly the amount Ireland consumes annually.
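To put that in per-query terms, here’s a short sketch using only the numbers above, plus one outside reference point: the roughly 0.3 Wh Google has historically cited for a conventional search, which is my comparison and not part of de Vries’ analysis:

```python
# What the 29.2 TWh/year estimate implies per query.
searches_per_day = 9e9
annual_twh = 29.2

wh_per_search = annual_twh * 1e12 / 365 / searches_per_day
print(f"~{wh_per_search:.1f} Wh per AI-assisted search")  # ~8.9 Wh

# ~0.3 Wh is the figure Google has historically cited for a standard search
# (an outside reference point, not from de Vries' paper).
print(f"~{wh_per_search / 0.3:.0f}x a conventional search")  # ~30x
```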
Last year, the data centers behind services like Google Search and Amazon’s cloud consumed an estimated 1% to 1.3% of the world’s electricity, and that figure excludes cryptocurrency mining, which accounted for an additional 0.4%. AI’s energy use can’t be measured precisely, since companies generally reveal little more than how many specialized chips run their software. So de Vries turned to sales projections for NVIDIA’s A100-class AI servers to estimate the energy consumption.
He references a projection that NVIDIA could ship 1.5 million of these AI servers per year by 2027, and multiplied that figure by the servers’ annual energy consumption to arrive at his estimate. The result comes with cautionary notes: it assumes customers run the servers at 100% capacity, which they may not, and it leaves out the extra energy needed to cool them, which would push consumption higher.
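Here’s roughly how that multiplication reproduces the headline 85 to 134 TWh range. The per-server power draws of 6.5 kW and 10.2 kW are my assumptions based on NVIDIA’s published specs for its DGX A100 and DGX H100 systems, not figures quoted in this article:

```python
# Rough reconstruction of the 85-134 TWh range.
# 6.5 kW (DGX A100) and 10.2 kW (DGX H100) per server are assumptions
# based on NVIDIA's published system specs.
servers = 1_500_000
hours_per_year = 8_760

for kw in (6.5, 10.2):
    twh = servers * kw * hours_per_year / 1e9  # kWh -> TWh
    print(f"{kw} kW per server -> ~{twh:.0f} TWh per year")
# prints ~85 TWh and ~134 TWh, matching the range cited above
```

Note that round-the-clock, full-capacity operation is baked into that sketch, which is exactly the assumption de Vries cautions about.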
Have a story tip? Message me at: http://twitter.com/Cabe_Atwell