AI and accelerated computing are the twin engines that NVIDIA is continually improving, helping to drive energy efficiency across many industries.
This is progress that the wider community is beginning to recognize.
“Even if predictions that data centers will account for 4% of global energy consumption in the near future come true, AI will have a significant impact on reducing the remaining 96% of energy consumption,” said a report from the Lisbon Council, a nonprofit research group founded in 2003 that studies economic and social issues.
The paper, from the Brussels-based group, is one of the first big-picture studies of AI policy to be published. It uses the Italian supercomputer Leonardo, accelerated by about 14,000 NVIDIA GPUs, as an example of a system advancing research in fields ranging from car design and drug discovery to weather forecasting.
The energy efficiency of the most efficient supercomputers on the TOP500 list improves over time. Source: TOP500.org
Why Accelerated Computing is Sustainable Computing
Accelerated computing uses the parallel processing of NVIDIA GPUs to complete more work in less time, so it consumes less energy than general-purpose servers built around CPUs, which process tasks serially.
That is why accelerated computing is sustainable computing: accelerated systems get more work done in less time and with less energy than CPU-only systems.
The benefits are even greater when accelerated systems apply AI, an inherently parallel form of computing that is the most transformative technology of our time.
“For cutting-edge applications like machine learning and deep learning, GPU performance is orders of magnitude better than CPUs,” the report states.
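To make that difference concrete, here is a minimal sketch, not an NVIDIA benchmark, that times the same matrix multiplication on a CPU with NumPy and on a GPU with the open-source CuPy library. The matrix size is an arbitrary assumption, and any measured speedup depends entirely on the hardware used.

```python
# Minimal sketch: time the same matrix multiplication on CPU (NumPy) and GPU (CuPy).
# Requires an NVIDIA GPU with CUDA and the cupy package installed; results vary by hardware.
import time

import numpy as np
import cupy as cp

N = 8192  # assumed matrix size, chosen only to make the parallelism visible

# CPU: the host processor works through the multiply largely serially
a_cpu = np.random.rand(N, N).astype(np.float32)
b_cpu = np.random.rand(N, N).astype(np.float32)
start = time.perf_counter()
np.matmul(a_cpu, b_cpu)
cpu_seconds = time.perf_counter() - start

# GPU: thousands of parallel threads execute the same multiply
a_gpu = cp.asarray(a_cpu)
b_gpu = cp.asarray(b_cpu)
cp.matmul(a_gpu, b_gpu)            # warm-up call to exclude one-time setup cost
cp.cuda.Device().synchronize()
start = time.perf_counter()
cp.matmul(a_gpu, b_gpu)
cp.cuda.Device().synchronize()     # wait for the GPU to finish before stopping the clock
gpu_seconds = time.perf_counter() - start

print(f"CPU: {cpu_seconds:.3f} s  GPU: {gpu_seconds:.3f} s")
```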
NVIDIA offers a portfolio of GPUs, CPUs, and DPUs designed to maximize the energy efficiency of accelerated computing.
Users' Experiences With Accelerated AI
Users around the world are documenting the energy efficiency gains that come with AI and faster computing.
In financial services, Paris-based Murex, whose trading and risk management platform is used daily by more than 60,000 people, tested the NVIDIA Grace Hopper Superchip, which combines a CPU and a GPU, and found that its workloads consumed one-fourth the energy and finished seven times faster than on a CPU-only system (see graph below).
“Grace is not only the fastest processor for risk calculations, it is also much more power efficient, making green IT a reality in the trading world,” said Pierre Spatz, head of quantitative research at Murex.
In manufacturing, Taiwan-based Wistron built a digital twin of a room where it thermally stress-tests NVIDIA DGX systems, helping it improve on-site operations. The company used NVIDIA Omniverse, an industrial digitization platform, together with a surrogate model, an AI model that emulates the results of a full simulation.
The digital twin, linked to thousands of networked sensors, enabled Wistron to improve energy efficiency across the facility by up to 10%, reducing annual electricity consumption by 120,000 kWh and carbon emissions by 60,000 kilograms.
Cutting carbon dioxide emissions by up to 80%
According to recent benchmarks, the RAPIDS Accelerator for Apache Spark cuts the carbon footprint of data analytics, a foundational step in many machine learning pipelines, by up to 80% while delivering an average 5x speedup and a 4x reduction in compute costs.
Thousands of companies, including nearly 80% of the Fortune 500, use Apache Spark to analyze ever-growing mountains of data. Companies using the accelerator include Adobe, AT&T, and the U.S. Internal Revenue Service.
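For readers curious what enabling the accelerator looks like in practice, the following is a minimal PySpark sketch. It assumes the RAPIDS Accelerator plugin jar is already on the Spark classpath and a CUDA-capable GPU is available; the resource settings are illustrative rather than tuned values.

```python
# Minimal sketch: enable the RAPIDS Accelerator for Apache Spark in a PySpark session.
# Assumes the rapids-4-spark plugin jar is on the Spark classpath and a GPU is present.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("rapids-accelerated-analytics")
    # Load the RAPIDS SQL plugin so eligible query operators run on the GPU
    .config("spark.plugins", "com.nvidia.spark.SQLPlugin")
    .config("spark.rapids.sql.enabled", "true")
    # Illustrative resource hints; real values depend on the cluster
    .config("spark.executor.resource.gpu.amount", "1")
    .config("spark.task.resource.gpu.amount", "0.25")
    .getOrCreate()
)

# An ordinary Spark SQL query; supported operators run transparently on the GPU
df = spark.range(0, 100_000_000).selectExpr("id", "id % 100 AS bucket")
df.groupBy("bucket").count().show()
```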
In healthcare, Insilico Medicine is using an NVIDIA-powered AI platform to discover potential treatments for rare respiratory diseases and has begun Phase 2 clinical trials.
Using traditional methods, this work would have cost over $400 million and taken up to six years, but using generative AI, Insilico achieved this milestone at one-tenth the cost and in one-third the time.
“This is an important milestone not only for our company but for everyone in the field of AI-accelerated drug discovery,” said Alex Zhavoronkov, CEO of Insilico Medicine.
This is just a sample of the outcomes being pursued by users of accelerated computing and AI at companies like Amgen, BMW, Foxconn, and PayPal.
Accelerated science with accelerated AI
In a benchmark study, the National Energy Research Scientific Computing Center (NERSC), the U.S. Department of Energy's lead facility for open science, compared a server equipped with four NVIDIA A100 Tensor Core GPUs against a dual-socket x86 CPU server on four key high-performance computing and AI applications.
Researchers found that apps accelerated by NVIDIA A100 GPUs were, on average, five times more energy efficient (see below), with a weather forecast application recording an almost tenfold improvement.
Scientists and researchers around the world rely on AI and accelerated computing to achieve both high performance and high efficiency.
In a recent ranking of the world’s most energy-efficient supercomputers, known as the Green500, NVIDIA-powered systems took the top six spots and 40 of the top 50 positions.
Underestimated Energy Savings
Many of these gains across industry and science are missed by projections that count only the energy consumed in training the largest AI models. Such projections ignore that, over most of their lifetime, models consume relatively little energy while delivering the efficiencies described above.
A recent study that analyzed dozens of sources found predictions based solely on training costs to be misleading and exaggerated.
“Just as early projections about the energy footprints of e-commerce and video streaming ultimately turned out to be exaggerated, estimates about AI are likely wrong,” said the report from the Information Technology and Innovation Foundation, a Washington-based think tank.
The report noted that roughly 90% of an AI model's lifetime cost comes after training, when the model is deployed in applications, and that this is where its efficiency gains are realized.
“Given the enormous opportunities to leverage AI to benefit our economy and society, including in the transition to a low-carbon future, it is essential that policymakers and the media better scrutinize claims about AI’s environmental impacts,” the report’s authors said, explaining their findings in a recent podcast.
Analysts cite the energy benefits of AI
Policy analysts at the R Street Institute in Washington, D.C., agreed.
“Rather than pause, policymakers should help realize the potential benefits from AI,” the group wrote in a 1,200-word article.
“Increasingly faster computing and the rise of AI hold great promise for the future, with significant societal benefits in terms of economic growth and social welfare,” the report said, citing proven benefits of AI in drug discovery, banking, stock trading and insurance.
The report added that AI can improve efficiency in the power grid, manufacturing, and transportation sectors.
AI supports sustainability efforts
The report also notes the potential for accelerated AI to combat climate change and promote sustainability.
“AI can improve the accuracy of weather models to improve public safety and also generate more accurate forecasts of crop yields. The power of AI will also contribute to the development of more accurate climate models,” R Street said.
The Lisbon report added that AI “will play a key role in the innovation needed to address climate change” in tasks such as finding more efficient battery materials.
How AI can help the environment
The ITIF called on governments to adopt AI as part of their efforts to decarbonize operations.
Public and private organizations are already applying NVIDIA AI to protect coral reefs, improve tracking of wildfires and extreme weather, and enhance sustainable agriculture.
NVIDIA is working with hundreds of startups that are employing AI to address climate challenges, and the company also announced plans for Earth-2, which it hopes will be the world’s most powerful AI supercomputer dedicated to climate science.
Improved energy efficiency across the stack
Since NVIDIA’s founding in 1993, we have been committed to improving energy efficiency across all of our products: GPUs, CPUs, DPUs, networks, systems, software and platforms like Omniverse.
Inference, the stage where trained AI models are put to work, is what delivers the insights that help users achieve new efficiencies. The NVIDIA GB200 Grace Blackwell Superchip has demonstrated 25x greater energy efficiency for AI inference than the previous NVIDIA Hopper GPU generation.
Over the past eight years, NVIDIA GPUs have become 45,000 times more energy efficient when running large language models (see graph below).
Recent innovations in software include TensorRT-LLM, which enables GPUs to reduce energy consumption for LLM inference by 3x.
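As a rough illustration of how that software is typically invoked, here is a minimal sketch based on TensorRT-LLM's high-level Python LLM API. The model name is a placeholder, and the exact API surface may vary between releases.

```python
# Minimal sketch: run LLM inference through TensorRT-LLM's high-level Python API.
# Assumes a recent tensorrt_llm release and an NVIDIA GPU; the model name is a placeholder.
from tensorrt_llm import LLM, SamplingParams

# Building the engine applies optimizations (e.g., kernel fusion) that contribute
# to the inference efficiency gains described above.
llm = LLM(model="TinyLlama/TinyLlama-1.1B-Chat-v1.0")

prompts = ["Summarize why accelerated computing can be more energy efficient."]
params = SamplingParams(max_tokens=128, temperature=0.2)

for output in llm.generate(prompts, params):
    print(output.outputs[0].text)
```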
Here’s a startling statistic: if cars had improved their fuel efficiency at the same rate that NVIDIA's accelerated computing platforms have improved AI efficiency, they would get 280,000 miles per gallon of gas, enough to drive to the moon on less than a gallon.
That comparison reflects the roughly 10,000x improvement in the energy efficiency of AI training and inference that NVIDIA delivered between 2016 and 2025 (see graph below).
The chart compares the gains in AI efficiency from the NVIDIA P100 GPU to NVIDIA Grace Blackwell with the fuel-economy gains of automobiles over the same period.
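The arithmetic behind the analogy is straightforward; the sketch below assumes the 280,000 mpg figure comes from scaling a conventional car's fuel economy, roughly 28 mpg as implied by the numbers cited, by the same 10,000x factor.

```python
# Back-of-the-envelope check on the fuel-economy analogy (assumed ~28 mpg baseline)
ai_efficiency_gain = 10_000      # cited 2016-2025 improvement factor for AI training and inference
baseline_mpg = 28                # typical car, implied by 280,000 / 10,000
equivalent_mpg = baseline_mpg * ai_efficiency_gain
miles_to_moon = 238_900          # approximate Earth-moon distance in miles

print(equivalent_mpg)                    # 280000
print(miles_to_moon / equivalent_mpg)    # well under one gallon
```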
Promoting data center efficiency
NVIDIA delivers many optimizations through system-level innovations, including the NVIDIA BlueField-3 DPU, which can reduce power consumption by up to 30% by offloading critical data center network and infrastructure functions from less efficient CPUs.
Last year, NVIDIA received a $5 million grant from the U.S. Department of Energy (the largest of 15 grants selected from more than 100 applications) to design a new liquid-cooling technology for data centers that runs 20% more efficiently and produces lower carbon dioxide emissions than today's air-cooling methods.
These are just a few of the ways NVIDIA is helping to improve data center energy efficiency.
Data centers are among the most efficient users of energy and among the largest consumers of renewable energy.
According to the ITIF report, between 2010 and 2018, data centers worldwide saw a 550% increase in computing instances and a 2,400% increase in storage capacity, yet energy use only increased by 6% due to improvements in hardware and software across the board.
NVIDIA continues to improve the energy efficiency of accelerated AI, helping users in science, government and industry accelerate their efforts toward sustainable computing.
Try our energy efficiency calculator to find ways to reduce your power consumption, and visit our Sustainable Computing site and corporate sustainability report for more information.