The essence of statistical inference:
Given observed data, we want to learn about the unknown parameters that generated it.

Two fundamental quantities:

  • Likelihood - p(data | parameters), the probability of observing the data given the parameters
  • Posterior - p(parameters | data), the probability of the parameters given the observed data

The posterior is what we’re after - it tells us which parameter values are plausible given our observations. We compute it using Bayes’ theorem: posterior ∝ likelihood × prior.
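A minimal sketch of this relationship, using a grid approximation for a coin-flip model (the specific setup - 7 heads in 10 flips, a uniform prior over the heads-probability theta - is an illustrative assumption, not from the note):

```python
import numpy as np

# Candidate parameter values and a uniform prior over the grid.
theta = np.linspace(0, 1, 101)
prior = np.ones_like(theta) / len(theta)

# Assumed observed data: 7 heads in 10 flips.
heads, flips = 7, 10

# Likelihood p(data | theta) for each candidate theta.
likelihood = theta**heads * (1 - theta)**(flips - heads)

# Bayes' theorem: posterior ∝ likelihood × prior,
# then normalize so the posterior sums to 1.
unnormalized = likelihood * prior
posterior = unnormalized / unnormalized.sum()

# Most plausible theta under a flat prior (here ~0.7, the sample frequency).
print(round(float(theta[np.argmax(posterior)]), 2))
```

With a flat prior the posterior peaks at the observed frequency of heads; a non-uniform prior would pull the peak toward the prior's mass.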

Statistics is all about building models and testing them, separating signal from noise.

mathematics