Another Sampling Technique (buttondown.com)

🤖 AI Summary
The post introduces Metropolis-Hastings (MH) as a compact, practical alternative to rejection sampling for drawing samples from a distribution known only up to a constant factor, e.g., sampling points from a grayscale image so that darker pixels are sampled more often. The MH procedure is simple: from the current state, propose a new state (the proposal distribution can be chosen flexibly), then accept the proposal with probability min(1, p(proposed)/p(current)) for a symmetric proposal; the general MH ratio also includes the proposal densities. Repeating this builds a Markov chain whose stationary distribution is the target, so after burn-in the generated points look like faithful draws from the normalized target density. Why it matters: MH (and MCMC more broadly) lets practitioners sample from the complex, high-dimensional, or unnormalized distributions ubiquitous in Bayesian inference, generative modeling, and approximate inference, where direct sampling or computing the normalizing constant is infeasible. Key implications: the choice of proposal controls mixing and acceptance rates, there is a trade-off between efficiency and local exploration, and samples must be corrected for burn-in and autocorrelation. The image example illustrates how MH can reconstruct a distribution from sparse samples, demonstrating its power and generality in 1D, 2D, or higher-dimensional problems.
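A minimal sketch of the procedure the summary describes, not code from the original post: a random-walk Metropolis sampler with a symmetric Gaussian proposal, so the acceptance ratio reduces to p(proposed)/p(current) and the normalizing constant cancels. The function name, `step`, `burn_in`, and the toy mixture target are illustrative assumptions.

```python
import numpy as np

def metropolis_hastings(unnorm_density, x0, n_samples, step=0.5, burn_in=1000, rng=None):
    """Random-walk Metropolis sampler for a density known only up to a constant.

    unnorm_density: callable returning a value proportional to the target density.
    x0: starting state (scalar or 1-D array).
    step: standard deviation of the symmetric Gaussian proposal.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.atleast_1d(np.asarray(x0, dtype=float))
    p_x = unnorm_density(x)
    samples = []
    for i in range(n_samples + burn_in):
        # Symmetric proposal: Gaussian step around the current state.
        proposal = x + rng.normal(scale=step, size=x.shape)
        p_prop = unnorm_density(proposal)
        # Accept with probability min(1, p(x')/p(x)); unnormalized values suffice
        # because the unknown normalizing constant cancels in the ratio.
        if rng.random() < p_prop / p_x:
            x, p_x = proposal, p_prop
        if i >= burn_in:  # discard the burn-in portion of the chain
            samples.append(x.copy())
    return np.array(samples)

# Example target: an unnormalized 1-D mixture of two Gaussians (hypothetical).
target = lambda x: np.exp(-0.5 * (x[0] - 2) ** 2) + 0.5 * np.exp(-0.5 * (x[0] + 2) ** 2)
draws = metropolis_hastings(target, x0=0.0, n_samples=5000)
print(draws.mean(), draws.std())
```

The same loop extends to the post's 2-D image example by treating pixel darkness as the unnormalized density over pixel coordinates; successive samples remain autocorrelated, so thinning or longer chains may be needed.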