AI Is Exploding Data Center Energy Use. A Google-Created Technique May Help
26.02.2024 - 02:48
/ tech.hindustantimes.com
Tech giants are racing to ward off a carbon time bomb caused by the massive data centers they're building around the world.
A technique pioneered by Google is gaining currency as more power-hungry artificial intelligence comes online: Using software to hunt for clean electricity in parts of the world with excess sun and wind on the grid, then ramping up data center operations there. Doing so could cut carbon and costs.
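The core of the technique is simple: sample grid carbon intensity across candidate regions and route flexible workloads to the cleanest one. A minimal sketch of that selection step is below; the region names and intensity figures are illustrative placeholders, not real-time data, and real systems (such as Google's carbon-intelligent computing platform) also weigh latency, cost, and capacity.

```python
# Hypothetical sketch of carbon-aware load shifting: given a snapshot of
# per-region grid carbon intensity (gCO2 per kWh), route a deferrable
# batch job to the region with the cleanest power at that moment.
# All names and numbers here are illustrative assumptions.

def pick_cleanest_region(intensities: dict) -> str:
    """Return the region whose grid has the lowest carbon intensity."""
    return min(intensities, key=intensities.get)

if __name__ == "__main__":
    snapshot = {
        "us-central": 420.0,    # fossil-heavy hour on this grid
        "europe-north": 95.0,   # wind surplus
        "asia-south": 610.0,
    }
    print(pick_cleanest_region(snapshot))  # cleanest region in the snapshot
```

In practice the intensity snapshot would come from a live grid-data feed, and the scheduler would re-evaluate periodically, since the "cleanest" region changes with local sun and wind conditions.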
There's an urgent need to figure out how to run data centers in ways that maximize renewable energy usage, said Chris Noble, co-founder and chief executive officer of Cirrus Nexus, a cloud-computing manager tapping data centers owned by Google, Microsoft and Amazon.
The climate risks sparked by AI-driven computing are far-reaching — and will worsen without a big shift from fossil fuel-based electricity to clean power. Nvidia Corp. Chief Executive Officer Jensen Huang has said AI has hit a “tipping point.” He has also said that the cost of data centers will double within five years to power the rise of new software.
Already, data centers and transmission networks each account for up to 1.5% of global electricity consumption, according to the International Energy Agency. Together, they're responsible for emitting about as much carbon dioxide as Brazil does annually.
Hyperscalers — as the biggest data center owners like Google, Microsoft and Amazon are known — have all set climate goals and are facing internal and external pressure to deliver on them. Those lofty targets include decarbonizing their operations.
But the rise of AI is already wreaking havoc on those goals. Graphics processing units have been key to the rise of large language models and use more electricity than the central processing units used in other forms of computing. Training an AI model uses more power than 100 households consume in a year, according to IEA estimates.
“The growth in AI is far outstripping the ability to produce clean power for it,” Noble said.
Moreover, AI's energy consumption is volatile, more akin to a sawtooth graph than the smooth line most data center operators are used to. That makes decarbonization a challenge, to say nothing of ensuring grid stability.
AI's growth is being driven by North American companies, keeping computing power — and energy usage — concentrated there, said Dave Sterlace, account director for global data centers at Hitachi Energy. That's a trend he didn't expect two years ago.
To lower data center CO2 emissions, hyperscalers and other data center providers have financed massive amounts of solar and wind capacity and used credits to offset emissions. (In the case of credits, some have failed to have a meaningful impact on emissions.)
But that alone won't be enough, especially as AI use ticks up. That's why