Extropic Claims 10,000x Energy Savings With New Probabilistic AI Chip
By Michelle Hawley October 30, 2025


A new probabilistic computing approach from Extropic promises radically lower energy use for generative AI, challenging GPU-based AI infrastructure.
Key Takeaways:
  1. Extropic claims its new “probabilistic” hardware could run GenAI using far less energy than GPUs.
  2. Simulations show ~10,000x energy savings with its new Denoising Thermodynamic Model (DTM).
  3. Extropic said it plans to remove power constraints that limit AI scaling today.

AI Maxes Out the Power Grid 
The AI boom has come with a physical constraint most consumers never see: electricity.

Data centers across the world struggle to secure enough power to support AI training and inference. Three years ago, tech startup Extropic bet that energy — not chips, not data — would become the primary limit to AI scaling. In their latest announcement, they say that bet has proven correct.

Rather than work on energy generation, which would require major infrastructure and government support, Extropic targeted another side of the problem: how to make AI itself more energy efficient.

Extropic Introduces First Scalable Probabilistic Computer
Modern AI is built on GPUs, a type of processor originally designed to render graphics. GPUs evolved into AI accelerators because they are good at matrix multiplication, the core mathematical operation behind neural networks. But GPUs are not energy efficient, and most of their power consumption goes into moving information around the chip, not the math itself.
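For context, the matrix multiplication at the heart of a neural network layer looks like the sketch below. This is a generic NumPy illustration, not anything Extropic-specific; the shapes and names are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal((32, 512))      # a batch of 32 input vectors
W = rng.standard_normal((512, 1024))    # layer weights
b = np.zeros(1024)                      # layer bias

# One dense layer = one big matrix multiplication plus a nonlinearity.
# On a GPU the arithmetic itself is fast; as the article notes, most of
# the energy goes into moving x and W through the memory hierarchy.
h = np.maximum(x @ W + b, 0.0)          # ReLU activation
print(h.shape)                          # (32, 1024)
```

Every layer of a large model repeats this pattern, which is why accelerators are judged on how cheaply they can feed data into these multiplications.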

Extropic claims to have designed an alternative: a new class of AI chip — a scalable probabilistic computer — built for sampling probability directly instead of performing GPU-style matrix math.

According to the company, their hardware:
  1. Uses “orders of magnitude” less energy than GPUs
  2. Performs AI tasks by sampling probability, not crunching large matrices
  3. Was fabricated and tested in silicon
  4. Runs a new kind of generative AI algorithm
This new device is called the Thermodynamic Sampling Unit (TSU).

How a TSU Works
TSUs function as probabilistic AI chips. Most AI chips today, including GPUs and TPUs, perform massive matrix multiplications to estimate probabilities and then sample from them. Extropic claims its hardware skips the matrix multiplication entirely and samples directly from complex distributions.

Key Claims About TSUs
Extropic states that TSUs:

  • Are built from large arrays of probabilistic cores
  • Sample from energy-based models (EBMs), a class of machine learning (ML) models
  • Use the Gibbs sampling algorithm to combine many simple probabilistic circuits into complex distributions
  • Minimize energy by keeping communication strictly local: circuits only interact with nearby neighbors
This last point is critical. Extropic argued that the biggest energy drain in GPUs is data movement. By designing hardware where communication is entirely local, the TSU architecture avoids expensive long-distance wiring and voltage changes within the chip.

In other words: TSUs are built to be physically, and therefore energetically, optimized for probability, not arithmetic.
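The combination of Gibbs sampling and strictly local communication can be sketched in software. Below is a minimal, hypothetical Ising-style energy-based model on a 2D grid, where each site is resampled using only its four nearest neighbors. This is a software analogue of the local-communication principle Extropic describes, not the company's actual algorithm or hardware behavior.

```python
import numpy as np

def gibbs_sweep(spins, coupling, rng):
    """One Gibbs sweep over a 2D Ising-style energy-based model.

    Each site is resampled from its conditional distribution, which
    depends only on its four nearest neighbors -- all communication
    is strictly local, mirroring the design principle in the text.
    """
    n = spins.shape[0]
    for i in range(n):
        for j in range(n):
            # Local field from the four nearest neighbors (wrap-around grid).
            field = (spins[(i - 1) % n, j] + spins[(i + 1) % n, j]
                     + spins[i, (j - 1) % n] + spins[i, (j + 1) % n])
            # Conditional probability that this site is +1, given its neighbors.
            p_up = 1.0 / (1.0 + np.exp(-2.0 * coupling * field))
            spins[i, j] = 1 if rng.random() < p_up else -1
    return spins

rng = np.random.default_rng(0)
spins = rng.choice([-1, 1], size=(16, 16))   # start from pure randomness
for _ in range(50):
    spins = gibbs_sweep(spins, coupling=0.6, rng=rng)
# Repeated local updates turn the random grid into correlated structure,
# even though no site ever reads state from more than one step away.
```

Note that the inner loop never touches a distant site; in hardware, that locality is what avoids the long wires and data movement that dominate GPU energy budgets.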

How TSUs Compare to AI Chips
  • GPUs/TPUs: Deterministic math engines optimized for matrix multiplication
  • TSUs: Probabilistic chips that generate samples directly
  • pbits: Transistor-based probabilistic bits that fluctuate between 0 and 1
  • Goal: Deliver generative AI using far less energy than GPU-based systems

The Smallest Building Block: The pbit
At the core of the TSU is what Extropic calls a pbit.

  • A traditional digital bit is always a 1 or a 0
  • A pbit fluctuates randomly between 1 and 0
  • The probability of being in either state is programmable
This makes a pbit essentially a hardware random number generator.

A single pbit is not very useful. But, as Extropic noted, neither is a single NAND gate. Combine enough of them, and you get a functioning computer.
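As an illustration only (not Extropic's circuit), a pbit's behavior can be modeled as a Bernoulli random variable with a programmable bias:

```python
import random

def pbit(p_one, rng=random):
    """Software model of a probabilistic bit: returns 1 with programmable
    probability p_one, else 0. A real pbit realizes this in analog
    transistor circuitry; this function is just a stand-in."""
    return 1 if rng.random() < p_one else 0

random.seed(42)
samples = [pbit(0.8) for _ in range(10_000)]
# The empirical mean of many samples approaches the programmed bias of 0.8.
print(sum(samples) / len(samples))
```

The interesting engineering is not the randomness itself but generating it with very little energy, which is the advantage Extropic claims for its transistor-based design.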

Extropic claims that:

  • Existing academic pbit designs were not commercially viable because they required exotic components
  • Extropic designed a pbit built entirely from transistors
  • Its pbits use orders of magnitude less energy to generate randomness
  • A hardware “proof of technology” has already validated the concept
Because pbits are small and energy-efficient, they can be packed tightly into a TSU. And because they are made from ordinary transistors, they can be integrated alongside standard computing circuitry.

A New Generative AI Model: The Denoising Thermodynamic Model
To show how their hardware can be used in real applications, Extropic also developed a new generative AI algorithm called the Denoising Thermodynamic Model (DTM).

DTMs are inspired by diffusion models, the same broad family used by image generators like Stable Diffusion. Like diffusion, a DTM starts with noise and iteratively transforms it into structured output.

However, Extropic states that DTMs are designed specifically for TSUs and are therefore far more energy-efficient.
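The diffusion-style pattern (start from noise, refine iteratively into structure) can be shown with a toy sampler. The annealing schedule, step size, and target below are invented for the illustration; this mimics the shape of the process, not Extropic's actual DTM.

```python
import numpy as np

def denoise_steps(x, target, num_steps, rng):
    """Toy diffusion-style sampler: starting from pure noise, take small
    steps toward a target pattern while the injected noise decays to zero.
    Illustrates iterative denoising only -- not the DTM algorithm."""
    for t in range(num_steps):
        noise_scale = 1.0 - (t + 1) / num_steps        # anneal noise to zero
        x = (x + 0.1 * (target - x)
             + noise_scale * 0.05 * rng.standard_normal(x.shape))
    return x

rng = np.random.default_rng(1)
target = np.sin(np.linspace(0, 2 * np.pi, 64))   # the "structured output"
x = rng.standard_normal(64)                      # start from pure noise
x = denoise_steps(x, target, num_steps=200, rng=rng)
print(float(np.abs(x - target).mean()))          # small residual error
```

In a real diffusion model the pull toward the target is learned rather than given; the claimed DTM advantage is that each noisy refinement step maps naturally onto hardware sampling instead of matrix math.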


According to Extropic:
  1. Simulations of DTMs running on TSUs could be 10,000x more energy-efficient than modern algorithms running on GPUs
  2. Results can be replicated using thrml, their open-source Python library

Why Extropic's Breakthrough Matters
Extropic framed the problem in simple terms: the world does not have enough power for unlimited AI.

The Bottleneck
  • Every major AI model increases compute requirements
  • Every increase in compute increases energy demand
  • Data centers are already struggling to secure power

If generative AI were served continuously to billions of users, at a scale similar to email or search, today's hardware could consume more energy than the world currently produces.

The Proposed Solution
Rather than expanding energy generation, reduce the energy that AI computing consumes, removing the energy ceiling that prevents widespread, always-on AI.

Extropic:
https://extropic.ai/

https://www.vktr.com/ai-news/extropic-cl...c-ai-chip/

