🤖 AI Summary
Extropic announced the TSU (thermodynamic sampling unit), a new class of probabilistic accelerator built on standard semiconductor processes that natively samples from probability distributions. Rather than performing the matrix-heavy computations a GPU does, a TSU is a network of programmable probabilistic circuits (pbits) that emit stochastic 0/1 samples according to a control voltage. Many pbits are wired together into probabilistic graphical models (PGMs), and Gibbs sampling runs directly in hardware: each node of the model maps to a sampling cell, each edge becomes a wire, and local resistor networks compute the conditional bias that drives each pbit and updates its state register.
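To make the mapping concrete, here is a minimal software sketch of pbits driving Gibbs sampling over a small Boltzmann-machine-style PGM. The energy parametrization, the `pbit_sample` helper, and all parameter values are illustrative assumptions, not Extropic's design; in hardware the bias would be an analog control voltage rather than a float.

```python
import numpy as np

rng = np.random.default_rng(0)

def pbit_sample(bias):
    """Hypothetical pbit: emit 1 with probability sigmoid(bias).

    In the TSU the bias is a control voltage; here it is just a float.
    """
    p = 1.0 / (1.0 + np.exp(-bias))
    return float(rng.random() < p)

def gibbs_sweep(state, weights, fields):
    """One sequential Gibbs sweep over a 0/1 Boltzmann-style PGM.

    Each node's conditional bias is its local field plus a weighted sum
    of neighbour states -- the quantity the summary attributes to the
    per-cell resistor networks.
    """
    for i in range(len(state)):
        bias = fields[i] + weights[i] @ state
        state[i] = pbit_sample(bias)
    return state

# Tiny 4-node model: symmetric couplings, zero diagonal, zero fields.
W = np.array([[0, 1, 0, -1],
              [1, 0, 1,  0],
              [0, 1, 0,  1],
              [-1, 0, 1, 0]], dtype=float)
h = np.zeros(4)
s = rng.integers(0, 2, size=4).astype(float)
for _ in range(1000):
    s = gibbs_sweep(s, W, h)
print(s)  # one sample from the model's stationary distribution
```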
The significance is twofold: generative AI fundamentally reduces to sampling from learned distributions, and TSUs execute that sampling directly, avoiding expensive normalization constants and potentially offering large energy-efficiency gains over GPU-optimized inference. Key technical details include the use of pbits as Bernoulli samplers, block Gibbs updates on bipartite graphs that enable parallelism without slowing iteration time (sketched below), and a straightforward path to supporting multi-category discrete variables. The announcement implies a shift toward algorithm-hardware co-design: existing models were shaped to fit GPUs, but TSUs could unlock new, more sampling-centric algorithms and much lower-power generative systems as both hardware and algorithms evolve.
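As a rough illustration of why bipartite structure enables parallelism, the sketch below runs block Gibbs on an RBM-style bipartite graph: nodes within a layer share no edges, so an entire layer is conditionally independent given the other and can be sampled in one parallel step. The RBM parametrization and every name here are assumptions for illustration, not details from the announcement.

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def bernoulli(p):
    """Sample a whole vector of independent Bernoulli variables at once."""
    return (rng.random(p.shape) < p).astype(float)

def block_gibbs_step(v, W, a, b):
    """One block Gibbs update on a bipartite graph (RBM-style).

    Each layer is updated as a single block: all of its units are
    sampled simultaneously given the opposite layer, so parallelism
    does not add extra iterations.
    """
    h = bernoulli(sigmoid(v @ W + b))    # sample hidden layer given visible
    v = bernoulli(sigmoid(h @ W.T + a))  # sample visible layer given hidden
    return v, h

n_visible, n_hidden = 6, 4
W = rng.normal(scale=0.5, size=(n_visible, n_hidden))
a = np.zeros(n_visible)  # visible biases
b = np.zeros(n_hidden)   # hidden biases
v = rng.integers(0, 2, size=n_visible).astype(float)
for _ in range(500):
    v, h = block_gibbs_step(v, W, a, b)
print(v, h)
```

The same blocking idea extends to multi-category variables by replacing the per-unit Bernoulli draw with a categorical (softmax) draw, which matches the summary's note that discrete multi-category support is a straightforward extension.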