Demand for computing power—fueled largely by artificial intelligence (AI)—shows no signs of abating. Yet, as AI becomes a potent force in the world, a basic truth emerges: CPUs and GPUs are butting up against the limits of Moore’s Law, and the amount of energy they consume is unsustainable.
One promising method for cutting wasted energy is thermodynamic computing. The approach, which harnesses the natural stochastic behavior of physical systems—typically heat or electrical noise—taps into the fundamental laws of thermodynamics to decrease power consumption. Instead of trying to suppress thermal interference, these systems exploit it as a computing resource.
“You harness stochastic thermodynamics, also known as ‘out of equilibrium thermodynamics,’ to execute certain algorithms,” explained Guillaume Verdon, founder and CEO of Extropic, a company developing chips that utilize thermodynamic fluctuations. According to Verdon, the approach yields practical gains of about 100x for niche applications, and up to 10,000x in specific instances. “It’s a way to densify intelligence without adding more compute power,” he added.
Despite enormous promise, thermodynamic computing remains in its infancy. “The main problem is how to create a complete and practical solution that applies to the real world,” said David Atienza Alonso, a professor of Electrical and Computer Engineering, head of the Embedded Systems Laboratory (ESL), and associate vice president for Research Centers and Technology Platforms at the Swiss Federal Institute of Technology in Lausanne (EPFL).
Out of Energy
Thermodynamic computing operates on a straightforward principle. Conventional digital computation systems “cost energy” because they are designed to suppress stochasticity (the quality of lacking any predictable order or plan), said Gavin Crooks, staff research scientist at New York City-based Normal Computing. On the other hand, Crooks said, “Thermodynamic computing harnesses these inherent fluctuations. It reduces energy demand by embracing probabilistic rather than deterministic operations.”
Thermodynamics can take several forms in the computing space. Some systems are inspired by the brain (neuromorphic), others focus on conserving energy through slow, reversible changes (adiabatic quantum computing and quantum annealing), while others draw from biological processes. Among the more promising approaches: silicon chips that use natural electrical noise to power probabilistic bits. These circuits—stochastic oscillator networks—use random motion to solve math problems more efficiently.
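The probabilistic bits mentioned above can be illustrated with a minimal software sketch. The following Python snippet is purely illustrative: it substitutes a pseudo-random generator for the physical noise a real device would use, and the sigmoid rule and temperature parameter are assumptions of this sketch, not any vendor's design.

```python
import numpy as np

rng = np.random.default_rng(0)

def p_bit(input_current: float, temperature: float = 1.0) -> int:
    """A probabilistic bit: outputs 1 with sigmoid probability of its input.

    In hardware, the randomness would come from thermal or electrical
    noise; here it is simulated with a pseudo-random generator.
    """
    p_one = 1.0 / (1.0 + np.exp(-input_current / temperature))
    return int(rng.random() < p_one)

# With zero input the bit is unbiased: roughly half the samples are 1.
# A positive input biases it toward 1, a negative input toward 0.
samples = [p_bit(0.0) for _ in range(10_000)]
print(sum(samples) / len(samples))
```

Networks of such bits, coupled so that each bit's input depends on its neighbors' states, are the software analogue of the stochastic oscillator networks described above.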
For example, Extropic’s focus is on electron-level stochastic physics in AI inference and Monte Carlo algorithms. It has engineered a CMOS chip, currently a prototype, that integrates with standard CPUs and GPUs. Designed to fit on a conventional computer board, the device is especially suited for robotics, edge computing applications, and computational biology, where probabilistic processing is needed.
“AI is naturally amenable to being executed on a stochastic device, because AI is all about shaping probability distributions to mimic either data or some sort of target distribution,” Verdon explained.
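Monte Carlo workloads of the kind Verdon describes follow a simple pattern: draw random samples from a distribution, then average some quantity over them. The toy Python example below shows only that pattern; the Gaussian target and the quantity being averaged are chosen for illustration and do not model Extropic's hardware.

```python
import numpy as np

rng = np.random.default_rng(42)

# Monte Carlo estimate of an expectation: E[x^2] under a standard
# normal distribution (the true value is 1.0). A thermodynamic chip
# would produce the samples physically rather than pseudo-randomly.
samples = rng.standard_normal(100_000)
estimate = np.mean(samples ** 2)
print(estimate)
```

The accuracy improves with the number of samples, which is why hardware that generates samples cheaply, directly from physical noise, is attractive for these workloads.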
Feeling the Heat
Another company racing into the thermodynamic computing space is Normal Computing. Crooks and his team are developing conventional silicon technology that detects subtle harmonic oscillations that result from natural vibrations in electrical circuits.
Normal Computing is developing a chip that incorporates tiny electrical components that wiggle and bounce like interconnected springs. As they move, they’re able to perform complex tasks, such as matrix inversion, by sampling from Gaussian (bell-shaped) distributions. Rather than relying on precise digital operations, the system harnesses the physics of motion and randomness to solve problems.
The approach hinges on a key tenet of physics: naturally occurring movements follow repeatable patterns. As a result, the computer can make smart guesses and solve complicated math problems—such as flipping a large table of numbers (inverting a matrix)—using much less energy. “The idea is to use physics to remove layers of abstraction that lead to inefficiencies and wasted energy,” Crooks said.
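The link between Gaussian sampling and matrix inversion rests on a standard identity: samples drawn from a zero-mean Gaussian whose precision (inverse-covariance) matrix is A have covariance equal to the inverse of A, so averaging over enough samples recovers that inverse. The Python simulation below illustrates the identity only; the hardware would produce the samples physically, whereas here a pseudo-random generator and a Cholesky factorization stand in.

```python
import numpy as np

rng = np.random.default_rng(7)

# A symmetric positive-definite matrix to invert.
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])

# Samples from a zero-mean Gaussian with precision matrix A have
# covariance inv(A), so their empirical covariance estimates the inverse.
L = np.linalg.cholesky(A)            # A = L @ L.T
z = rng.standard_normal((200_000, 2))
x = np.linalg.solve(L.T, z.T).T      # each row of x is distributed N(0, inv(A))
A_inv_estimate = x.T @ x / len(x)

print(A_inv_estimate)
print(np.linalg.inv(A))              # direct inverse, for comparison
```

The estimate converges toward the true inverse as the sample count grows, so the quality of the answer can be traded against sampling time and energy.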
Other research groups and startups are exploring additional ways to take advantage of thermodynamics, including incorporating memristors, magnetic tunnel junctions, and even superconducting circuits into systems.
Computing systems that use stochastic thermodynamics will likely revolve around three approaches, said David Wolpert, a physicist and professor at New Mexico’s Santa Fe Institute. At first, hybrid devices will add a chip that works alongside a CPU or GPU. Later, more advanced systems will integrate stochastic thermodynamics directly into foundational CMOS technology. “If you can embed the technology into CMOS, you’re able to get to a far more precise and accurate framework,” Wolpert said. Finally, the technology could leap beyond CMOS to another technology.
The New Dynamics
Researchers have only begun to understand how to apply thermodynamics to computing. This includes finding ways to manage computations in the most economical way, which may include running processors and processes slightly slower at times. “It’s in a very early stage, comparable to the invention of the transistor,” Wolpert said. “Achieving the full potential of the technology will likely take decades.”
An irony, Verdon noted, is that highly efficient thermodynamic systems are already within reach—but impractical for the real world. “A superconductor could be the most efficient thermodynamic system imaginable, but it has to be super-cooled—and you have to produce it at mass scale; you can’t put it in phones and other devices. So, for now, a separate CMOS chip is the only realistic option.”
Other obstacles remain. “There is a lot of work needed in heuristics, algorithms, and chip design,” Wolpert said. This includes optimizing designs for real-world efficiency and manufacturing thermodynamic chips and systems at scale. For now, transitioning from proof of concept to commercial viability remains difficult, expensive, and resource-intensive, Verdon added.
There also are challenges related to optimizing thermodynamic performance. While existing systems can reduce energy consumption in niche situations, “Scaling this technology to the level of LLMs and other large AI frameworks while maintaining computing performance remains a challenge,” Atienza cautioned.
Crooks said it’s necessary to rethink how algorithms work and how they execute actions on hardware. “The software we use today is a result of hardware that supports specific functions,” he noted. For example, “GPUs have allowed AI to flourish.” Getting hardware and thermodynamic algorithms to work together remains a work in progress. “It’s especially challenging because new hardware typically changes the noise profile,” he noted.
Physics Matters
The first commercial systems will likely combine conventional deterministic processing with thermodynamic computing modules. Over time, more advanced thermodynamic frameworks will likely “take over significant portions of computational workloads,” Verdon said.
This could unlock enormous gains for energy-intensive applications like AI, digital twins, and probabilistic problem-solving, Atienza explained. It could also benefit many other applications, such as cyber-physical systems, where peak performance is not a key constraint but low power is essential.
Ultimately, the question isn’t whether thermodynamic computing will make an impact, but when, Crooks said. “Our appetite for compute is virtually unlimited. Thermodynamic computing provides a toolset for transitioning to a more energy efficient and sustainable form of computing.”
Samuel Greengard is an author and journalist based in West Linn, OR, USA.