Extropic Claims 10,000x Energy Savings With New Probabilistic AI Chip

By Michelle Hawley
October 30, 2025

A new probabilistic computing approach from Extropic promises radically lower energy use for generative AI, challenging GPU-based AI infrastructure.

Key Takeaways:
- Extropic claims its new chip class can run generative AI using up to 10,000x less energy than GPU-based systems.
- The chip, a thermodynamic sampling unit (TSU), samples probability distributions directly instead of performing matrix math.
- TSUs are built from pbits, transistor-based probabilistic bits that act as tiny, energy-efficient random number generators.
- A companion algorithm, the Denoising Thermodynamic Model (DTM), adapts diffusion-style generation to the new hardware.
AI Maxes Out the Power Grid

The AI boom has come with a physical constraint most consumers never see: electricity. Data centers across the world struggle to secure enough power to support AI training and inference. Three years ago, tech startup Extropic bet that energy, not chips or data, would become the primary limit to AI scaling. In its latest announcement, the company says that bet has proven correct.

Rather than work on energy generation, which would require major infrastructure and government support, Extropic targeted the other side of the problem: making AI itself more energy efficient.

Extropic Introduces First Scalable Probabilistic Computer

Modern AI is built on GPUs, a type of processor originally designed to render graphics. GPUs evolved into AI accelerators because they excel at matrix multiplication, the core mathematical operation behind neural networks. But GPUs are not energy efficient, and most of their power consumption goes into moving information around the chip, not into the math itself.

Extropic claims to have designed an alternative: a new class of AI chip, a scalable probabilistic computer, built to sample probability distributions directly instead of performing GPU-style matrix math. According to the company, the hardware:
- Generates samples from probability distributions natively, with no matrix multiplication step
- Can run generative AI algorithms using up to 10,000x less energy than GPU-based systems, per the company's headline claim
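The contrast the article draws, estimating probabilities with matrix math and then sampling versus producing samples directly in hardware, can be made concrete in a few lines of NumPy. The weight matrix, input vector, and softmax step below are generic illustrations, not Extropic's software:

```python
import numpy as np

rng = np.random.default_rng(0)

# Conventional GPU-style pipeline: dense matrix math produces logits,
# softmax turns them into probabilities, and only then is a sample drawn.
# W, x, and the softmax step are hypothetical, for illustration only.
W = rng.normal(size=(8, 16))  # toy weight matrix
x = rng.normal(size=16)       # toy input vector

logits = W @ x                            # the expensive matmul step
probs = np.exp(logits - logits.max())
probs /= probs.sum()                      # softmax -> probability estimates
sample = rng.choice(len(probs), p=probs)  # sampling happens only at the end
print(sample)

# The TSU idea, as the article describes it: the chip itself is a physical
# sampler, so samples are produced directly and the matmul above never runs.
```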
How a TSU Works

TSUs (thermodynamic sampling units) are probabilistic AI chips. Most AI chips today, including GPUs and TPUs, perform massive matrix multiplications to estimate probabilities and then sample from them. Extropic claims its hardware skips the matrix multiplication entirely and samples directly from complex distributions.

Key Claims About TSUs

Extropic states that TSUs:
- Are built from large arrays of probabilistic cores
- Sample from energy-based models (EBMs), a class of machine learning (ML) models
- Use the Gibbs sampling algorithm to combine many simple probabilistic circuits into complex distributions (see the sampling sketch at the end of this section)
- Minimize energy by keeping communication strictly local: circuits interact only with nearby neighbors

This last point is critical. Extropic argues that the biggest energy drain in GPUs is data movement. By designing hardware where communication is entirely local, the TSU architecture avoids expensive long-distance wiring and voltage changes within the chip. In other words, TSUs are built to be physically, and therefore energetically, optimized for probability rather than arithmetic.

How TSUs Compare to AI Chips

- GPUs/TPUs: Deterministic math engines optimized for matrix multiplication
- TSUs: Probabilistic chips that generate samples directly
- pbits: Transistor-based probabilistic bits that fluctuate between 0 and 1
- Goal: Deliver generative AI using far less energy than GPU-based systems

The Smallest Building Block: The pbit

At the core of the TSU is what Extropic calls a pbit.
- A traditional digital bit is always a 1 or a 0
- A pbit fluctuates randomly between 1 and 0
- The probability of being in either state is programmable

This makes a pbit essentially a hardware random number generator (a toy software version appears after this section). A single pbit is not very useful, but, as Extropic notes, neither is a single NAND gate; combine enough of them and you get a functioning computer.

Extropic claims that:
- Existing academic pbit designs were not commercially viable because they required exotic components
- Its pbit is built entirely from standard transistors
- Its pbits use orders of magnitude less energy to generate randomness
- A hardware "proof of technology" has already validated the concept

Because pbits are small and energy-efficient, they can be packed tightly into a TSU. And because they are made from ordinary transistors, they can be integrated alongside standard computing circuitry.

A New Generative AI Model: The Denoising Thermodynamic Model

To show how the hardware can be used in real applications, Extropic also developed a new generative AI algorithm called the Denoising Thermodynamic Model (DTM). DTMs are inspired by diffusion models, the same broad family used by image generators like Stable Diffusion. Like diffusion, a DTM starts with noise and iteratively transforms it into structured output (sketched below). However, Extropic states that DTMs are designed specifically for TSUs and are therefore far more energy-efficient. According to Extropic, DTMs running on TSUs are the basis for the company's headline claim of roughly 10,000x energy savings over GPU-based systems.
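To ground the pbit idea from the building-block section above, here is a minimal software stand-in, assuming only what the article states: a pbit is a bit that comes up 1 with a programmable probability. In the real chip this randomness comes from transistor physics, not a pseudorandom generator.

```python
import numpy as np

rng = np.random.default_rng(1)

def pbit(p: float) -> int:
    """Toy software stand-in for a hardware pbit: returns 1 with
    programmable probability p, else 0. The hardware version draws its
    randomness from device physics rather than a software RNG."""
    return int(rng.random() < p)

# Sanity check: a pbit programmed to be 1 about 80% of the time.
samples = [pbit(0.8) for _ in range(10_000)]
print(sum(samples) / len(samples))  # prints roughly 0.8
```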
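The Gibbs-sampling and locality claims can be sketched the same way. The model below is a generic nearest-neighbor energy-based model (an Ising-style ring), chosen purely for illustration; the coupling and bias values are arbitrary and nothing here reflects Extropic's actual circuits. The point to notice is that each update reads only a unit's immediate neighbors, which is the locality property the article says keeps data movement, and therefore energy, low.

```python
import numpy as np

rng = np.random.default_rng(2)

# A toy energy-based model: a ring of N binary units, each coupled only
# to its two nearest neighbors. J and h are arbitrary illustration values.
N = 32
J = 0.8                          # neighbor coupling strength
h = 0.1                          # per-unit bias
s = rng.integers(0, 2, size=N)   # random initial state of 0/1 units

def gibbs_sweep(s):
    """One Gibbs sweep: resample each unit from its conditional
    distribution given only its immediate neighbors. Each single-unit
    update is exactly a pbit draw with a locally computed probability."""
    for i in range(N):
        left, right = s[(i - 1) % N], s[(i + 1) % N]
        # Local field felt by unit i, with neighbors in +/-1 spin form.
        field = J * ((2 * left - 1) + (2 * right - 1)) + h
        p_one = 1.0 / (1.0 + np.exp(-2.0 * field))  # sigmoid of local field
        s[i] = int(rng.random() < p_one)
    return s

for _ in range(100):
    s = gibbs_sweep(s)
print(s)  # one sample from the ring's equilibrium distribution
```

Many simple conditional distributions, composed by repeated local updates, yield samples from one complex joint distribution, which is the combining effect the article attributes to Gibbs sampling on a TSU.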
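Finally, the article describes the DTM only at the level of "start with noise, iteratively transform it into structured output." The sketch below mimics that shape with a made-up denoising schedule that pulls random bits toward a fixed target pattern; the schedule, the target, and the update rule are all assumptions for illustration, since the article does not publish DTM internals.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical "structured output" the denoising loop should recover.
target = (np.arange(32) % 2).astype(int)
x = rng.integers(0, 2, size=32)  # step 0: pure noise

steps = 20
for t in range(steps):
    strength = 4.0 * (t + 1) / steps  # assumed denoising schedule
    # Local field per bit: the current state contributes inertia, the
    # target contributes an increasingly strong pull. Each bit is then
    # resampled pbit-style, one Gibbs-like round per denoising step.
    field = strength * (2 * target - 1) + 0.5 * (2 * x - 1)
    p_one = 1.0 / (1.0 + np.exp(-field))
    x = (rng.random(x.size) < p_one).astype(int)

print(x)       # should closely match the target pattern by the last step
print(target)
```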
The Bottleneck

Energy. Data centers cannot secure enough electricity to keep scaling AI training and inference, making power, not chips or data, the limit on AI growth.
The Proposed Solution

Either expand energy generation or reduce the energy that computing consumes. Extropic targets the latter, aiming to remove the energy ceiling that prevents widespread, always-on AI.

Extropic: https://extropic.ai/

Source: https://www.vktr.com/ai-news/extropic-claims-10000x-energy-savings-with-new-probabilistic-ai-chip/