Bayesian Inference
Treats all parameters[^1] as random variables with a prior distribution $p(\theta)$.
Update via Bayes’ rule:

$$p(\theta \mid D) = \frac{p(D \mid \theta)\, p(\theta)}{p(D)}$$

The posterior $p(\theta \mid D)$ represents our updated beliefs about the parameters after seeing data.
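As a concrete illustration, here is a minimal sketch of an exact Bayesian update for a Beta-Bernoulli coin-flip model, where conjugacy gives the posterior in closed form. The prior parameters and data are illustrative assumptions, not taken from this note:

```python
import numpy as np
from scipy import stats

# Beta(a, b) prior over the unknown coin bias theta (illustrative choice).
a_prior, b_prior = 2.0, 2.0

# Observed coin flips (1 = heads); illustrative data.
data = np.array([1, 0, 1, 1, 0, 1, 1, 1])
heads = data.sum()
tails = len(data) - heads

# Conjugacy: Beta prior x Bernoulli likelihood -> Beta posterior,
# so Bayes' rule reduces to adding the observed counts to the prior.
a_post, b_post = a_prior + heads, b_prior + tails
posterior = stats.beta(a_post, b_post)

lo, hi = posterior.interval(0.95)
print(f"Posterior mean of theta: {posterior.mean():.3f}")  # 8/12 ~ 0.667
print(f"95% credible interval: ({lo:.3f}, {hi:.3f})")
```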
Computing the posterior often requires intractable integrals (for the marginal likelihood $p(D) = \int p(D \mid \theta)\, p(\theta)\, d\theta$), leading to approximation methods:
- Variational inference - approximate posterior with simpler distribution
- Markov chain Monte Carlo - sample from the posterior distribution (see the sketch after this list)
- Laplace approximation - Gaussian approximation around posterior mode
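When no closed form exists, sampling methods such as MCMC apply. Below is a minimal random-walk Metropolis-Hastings sketch targeting the same Beta-Bernoulli posterior as above; the `log_posterior` helper, step size, iteration count, and burn-in are illustrative assumptions, not a definitive implementation:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
data = np.array([1, 0, 1, 1, 0, 1, 1, 1])  # same illustrative coin flips

def log_posterior(theta):
    """Unnormalized log posterior: Beta(2, 2) prior + Bernoulli likelihood."""
    if not 0.0 < theta < 1.0:
        return -np.inf
    log_prior = stats.beta(2.0, 2.0).logpdf(theta)
    log_lik = stats.bernoulli(theta).logpmf(data).sum()
    return log_prior + log_lik

# Random-walk Metropolis: propose theta' ~ N(theta, step^2) and accept
# with probability min(1, p(theta' | D) / p(theta | D)).
step, n_iter = 0.1, 5000
theta = 0.5
samples = []
for _ in range(n_iter):
    proposal = theta + step * rng.normal()
    if np.log(rng.uniform()) < log_posterior(proposal) - log_posterior(theta):
        theta = proposal
    samples.append(theta)

kept = np.array(samples[1000:])  # discard burn-in
print(f"MCMC posterior mean: {kept.mean():.3f}")  # close to the exact 0.667
```

The sampler never needs the marginal likelihood $p(D)$: it cancels in the acceptance ratio, which is exactly why MCMC sidesteps the intractable integral.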
Related
- bayesianism
- bayesian neural network
Footnotes
[^1]: Here “parameters” includes all unobserved quantities: not just model parameters but also latent variables and hyperparameters. In practice, some may be kept fixed or point-estimated rather than given priors, depending on the model and computational considerations.