The global demand for AI computing is causing data centers to guzzle electricity like dorm rooms guzzling beer, but researchers at the University of Minnesota may have a solution to curb AI's growing power demands: a new device that promises far greater energy efficiency.
Researchers have designed a new “computational random access memory” (CRAM) prototype chip that could reduce the energy needs of AI applications by a factor of 1,000 or more compared with current methods; one simulation showed a staggering 2,500-fold energy savings with the CRAM technology.
Traditional computing relies on the decades-old von Neumann architecture of separate processors and memory units, which require data to be constantly passed back and forth in an energy-intensive process. The University of Minnesota team’s CRAM completely upends this model by using a spintronics device called a magnetic tunnel junction (MTJ) to perform calculations directly in the memory itself.
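To see why performing calculations inside memory matters, consider a back-of-the-envelope model of where the energy goes. The sketch below is purely illustrative (not the researchers' model), and the per-operation energy figures are made-up placeholders; the point is only that shuttling operands across the memory/processor boundary, rather than the arithmetic itself, dominates the energy budget.

```python
# Illustrative toy model: energy cost of a workload when operands must be
# moved between memory and a separate processor (von Neumann) versus being
# computed in place (CRAM-style in-memory computing).
# The picojoule figures below are assumed placeholders, not measured values.

TRANSFER_ENERGY_PJ = 100.0  # assumed cost to move one operand over the memory bus
COMPUTE_ENERGY_PJ = 1.0     # assumed cost of the arithmetic operation itself

def von_neumann_energy(num_ops: int) -> float:
    """Each operation fetches two operands and writes one result back."""
    moves_per_op = 3  # two reads + one write across the memory/processor boundary
    return num_ops * (moves_per_op * TRANSFER_ENERGY_PJ + COMPUTE_ENERGY_PJ)

def in_memory_energy(num_ops: int) -> float:
    """Computation happens inside the memory array, so no bus transfers."""
    return num_ops * COMPUTE_ENERGY_PJ

ops = 1_000_000
ratio = von_neumann_energy(ops) / in_memory_energy(ops)
print(f"energy ratio (von Neumann / in-memory): {ratio:.0f}x")  # prints 301x
```

With these placeholder numbers the in-place approach comes out ~300x cheaper; the actual savings reported for CRAM depend on the workload and the MTJ device physics, not on a simple bus-transfer count like this.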
Spintronic devices harness the spin of electrons rather than relying on electric charge to store data, offering a more efficient alternative to traditional transistor-based chips.
“CRAM is an extremely energy-efficient, digital-based, in-memory computing substrate that is extremely flexible in that computations can be performed anywhere in the memory array, so CRAM can be reconfigured to meet the performance needs of different AI algorithms,” said Ulya Karpuzcu, co-author of the Nature paper, who added that CRAM is more energy efficient than traditional building blocks for today’s AI systems.
By eliminating power-hungry data transfers between logic and memory, CRAM technology like this prototype could be crucial to dramatically improving AI energy efficiency at a time when the field's energy demands are exploding.
The International Energy Agency predicted in March that global electricity consumption for AI training and applications could double from 460 terawatt-hours in 2022 to more than 1,000 terawatt-hours by 2026, roughly the amount consumed by the entire country of Japan.
The foundation for this breakthrough has been laid over more than 20 years, dating back to the pioneering work of Engineering professor Jian-Ping Wang on the use of MTJ nanodevices for computing, the researchers said in a press release.
Wang acknowledged that the original proposal to abandon the von Neumann model “was considered foolhardy” 20 years ago, but the University of Minnesota team persisted, building on Wang’s patented MTJ research that enabled the magnetic random access memory (MRAM) now used in smartwatches and other embedded systems.
Of course, as with any breakthrough of this nature, the researchers must still address challenges such as scalability, manufacturing, and integration with existing silicon. They are already planning demonstration collaborations with semiconductor industry leaders to make CRAM commercially viable.