A criticism of the rise of artificial intelligence is the amount of energy needed to power the technology, but the solution may lie within the technology itself.
The International Energy Agency predicts that data centers’ electricity use will double by 2026, up from 2% of global electricity use in 2022. But research shows that AI adoption is driving energy efficiency gains in other industries that could more than offset AI’s power-hungry nature.
According to a Lisbon Council study, “even if the prediction that data centers will account for 4% of global energy consumption in the near future comes true, AI will have a significant impact on reducing the remaining 96% of energy consumption.”
And parallel processing on Nvidia (NASDAQ:NVDA) GPUs gets more work done in less time, using less energy, than a CPU built to handle tasks one at a time.
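The serial-versus-parallel tradeoff can be sketched in a few lines of Python. This is a toy illustration, not Nvidia's stack: the same arithmetic is performed one element at a time, as in a serial CPU loop, and then in a single vectorized call, the data-parallel style that GPU hardware exploits at much larger scale.

```python
import math
import time

import numpy as np

# Toy sketch (not Nvidia's software): compare a serial loop against a
# vectorized, data-parallel computation of the same sum of squares.
data = np.arange(1_000_000, dtype=np.float64)

t0 = time.perf_counter()
serial_sum = 0.0
for x in data:              # one multiply-add per iteration
    serial_sum += x * x
serial_time = time.perf_counter() - t0

t0 = time.perf_counter()
vector_sum = float(np.dot(data, data))  # whole array in one operation
vector_time = time.perf_counter() - t0

# Both paths compute the same answer; the vectorized path simply spends
# far less wall-clock time (and hence energy) doing it.
assert math.isclose(serial_sum, vector_sum, rel_tol=1e-9)
print(f"serial loop: {serial_time:.3f}s  vectorized: {vector_time:.3f}s")
```

The vectorized call typically runs orders of magnitude faster than the Python loop on the same machine, which is the same "more work per joule" argument the study makes for accelerated computing.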
“While the energy consumption of training large language models has increased substantially, the pace of growth in overall energy consumption has been much slower than the pace of growth in computing requirements and performance due to rapid innovations in hardware and software such as accelerated computing,” the study noted.
According to an Nvidia blog post, “By shifting from CPU-only operation to GPU-accelerated systems, HPC and AI workloads could save more than 40 terawatt-hours of energy per year, equivalent to the electricity needs of approximately 5 million U.S. homes.”
Another study, by the Center for Data Innovation, found that some of the arguments about AI's excessive energy use were unfounded, citing a 1990s Forbes magazine article that predicted the internet would consume half of all power grid capacity within a decade.
“Recently, interest in artificial intelligence has led people to again ask questions about the emerging technology’s energy use,” the study states. “However, as with past technologies, many of the early claims about AI energy consumption turn out to be exaggerated and misleading.”
In fact, the rise of AI has led many large technology companies to pursue carbon-neutral corporate goals. This list includes tech giants such as Apple (NASDAQ:AAPL), Microsoft (NASDAQ:MSFT), IBM (NYSE:IBM), Dell Technologies (NYSE:DELL), Google (NASDAQ:GOOG) (NASDAQ:GOOGL), Meta (NASDAQ:META) and Intel (NASDAQ:INTC).
Wind and solar accounted for 14.1% of U.S. electricity production last year, according to the Energy Information Administration, but that's not enough for energy-hungry tech companies to meet carbon-neutral goals, leading some to turn to nuclear power.
Earlier this year, Amazon Web Services (NASDAQ:AMZN) purchased a data center operating at Talen Energy’s nuclear plant in Salem Township, Pennsylvania. NextEra Energy (NYSE:NEE) is also considering restarting a decommissioned nuclear plant in Iowa, citing increased power demand from AI data centers.
And of course, tech companies themselves are finding ways to make their products more efficient: Nvidia said its GPUs have made large language models 45,000 times more efficient over the past eight years.
“If cars became as efficient as NVIDIA has made AI more efficient on its accelerated computing platform, cars would achieve 280,000 miles per gallon,” Nvidia said.
Super Micro Computer (NASDAQ:SMCI) is also working with GPU makers Nvidia, AMD (NASDAQ:AMD) and Intel to develop liquid-cooled AI superclusters, which could reduce data center energy bills by up to 40%.