Bayesian Statistics
Bayes' Theorem
Prior & Posterior
| Quantity | Meaning |
|---|---|
| Prior P(θ) | Belief about θ before seeing data |
| Likelihood P(x|θ) | Probability of data given θ |
| Posterior P(θ|x) | Updated belief after seeing data |
Examples
Example 1. Disease prevalence 1%. Test sensitivity 99%, specificity 95%. Positive test — P(disease)?
Solution. By Bayes' theorem, P(D|+) = P(+|D)P(D) / [P(+|D)P(D) + P(+|¬D)P(¬D)] = (0.99·0.01)/(0.99·0.01 + 0.05·0.99) ≈ 16.7%. Note that P(+|¬D) = 1 − specificity = 0.05.
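The arithmetic in Example 1 can be checked directly; a minimal sketch using the values stated above:

```python
# Numeric check of Example 1 (values taken from the example above).
prevalence = 0.01        # P(D)
sensitivity = 0.99       # P(+|D)
specificity = 0.95       # P(-|no D), so the false-positive rate is 0.05

# Law of total probability for P(+), then Bayes' theorem.
p_pos = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)
p_disease_given_pos = sensitivity * prevalence / p_pos
print(round(p_disease_given_pos, 4))  # → 0.1667
```

Despite the accurate test, a positive result implies only about a one-in-six chance of disease, because the disease is rare.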
Background
Bayesian statistics treats probability as a degree of belief and updates beliefs as evidence accumulates. The prior distribution \(P(\theta)\) encodes knowledge about the parameter \(\theta\) before seeing data. The likelihood \(P(x|\theta)\) describes how probable the observed data are for each value of \(\theta\). Bayes' theorem combines them: \(P(\theta|x)\propto P(x|\theta)P(\theta)\).
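The proportionality \(P(\theta|x)\propto P(x|\theta)P(\theta)\) can be made concrete by grid approximation. A minimal sketch, assuming an illustrative coin-flip dataset (7 heads in 10 flips, uniform prior) not taken from the text:

```python
import numpy as np

# Grid approximation of P(theta|x) ∝ P(x|theta) P(theta) for a coin bias theta.
# Assumed data: 7 heads in 10 flips; uniform prior on [0, 1].
theta = np.linspace(0, 1, 1001)
prior = np.ones_like(theta)                 # uniform prior P(theta)
likelihood = theta**7 * (1 - theta)**3      # binomial likelihood kernel
posterior = likelihood * prior
posterior /= posterior.sum()                # normalize to a probability mass on the grid

print(theta[np.argmax(posterior)])          # MAP estimate: 0.7, i.e. k/n
```

The grid posterior peaks at the MAP estimate \(k/n = 0.7\), matching the analytic maximum of the likelihood under a flat prior.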
The posterior distribution \(P(\theta|x)\) is the complete Bayesian answer — it is a full probability distribution over \(\theta\), not just a point estimate. Point summaries include the posterior mean, median, and mode (MAP estimate). Credible intervals (e.g., the central 95% of the posterior) have a direct probability interpretation: there is a 95% probability that \(\theta\) lies in the interval, given the data.
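A central credible interval can be read off the posterior CDF numerically. A sketch, assuming an illustrative Beta(8, 4)-shaped posterior (e.g., a uniform prior plus 7 successes in 10 trials); the numbers are not from the text:

```python
import numpy as np

# Central 95% credible interval from a gridded posterior.
# Illustrative posterior: unnormalized Beta(8, 4) density on [0, 1].
theta = np.linspace(0, 1, 100001)
density = theta**7 * (1 - theta)**3   # Beta(8, 4) kernel
cdf = np.cumsum(density)
cdf /= cdf[-1]                        # normalized posterior CDF on the grid

lo = theta[np.searchsorted(cdf, 0.025)]
hi = theta[np.searchsorted(cdf, 0.975)]
print(lo, hi)                         # interval endpoints; mean is 8/12 ≈ 0.667
```

Given the data, \(\theta\) lies in \([lo, hi]\) with 95% posterior probability, which is the direct interpretation described above.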
Conjugate priors simplify computation: when the prior and posterior belong to the same family, the posterior has a closed form. Beta–Binomial, Normal–Normal, and Gamma–Poisson are classic conjugate pairs.
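The Beta–Binomial closed form amounts to simple parameter arithmetic. A minimal sketch (the helper name and example numbers are illustrative, not from the text):

```python
# Beta-Binomial conjugate update: a Beta(a, b) prior combined with
# k successes in n Bernoulli trials yields a Beta(a + k, b + n - k) posterior.
def beta_binomial_update(a, b, k, n):
    """Return the posterior Beta parameters after k successes in n trials."""
    return a + k, b + (n - k)

a_post, b_post = beta_binomial_update(1, 1, 7, 10)  # uniform Beta(1,1) prior
print(a_post, b_post)                               # → 8 4
print(a_post / (a_post + b_post))                   # posterior mean 8/12 ≈ 0.667
```

Because the posterior is again a Beta distribution, updates can be chained as data arrive, with no numerical integration required.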
Markov Chain Monte Carlo (MCMC) methods — Metropolis–Hastings, Gibbs sampling, Hamiltonian Monte Carlo — enable Bayesian inference for complex models where the posterior has no closed form. Modern probabilistic programming languages (Stan, PyMC) automate MCMC, making Bayesian methods accessible for practical data analysis.
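A random-walk Metropolis sampler fits in a few lines. This is an illustrative sketch of the general idea, not the algorithm Stan or PyMC uses internally; the target here is assumed to be a standard normal for easy checking:

```python
import math
import random

def log_target(x):
    """Unnormalized log-density of the target; here log N(0, 1) up to a constant."""
    return -0.5 * x * x

def metropolis(n_samples, step=1.0, seed=0):
    """Random-walk Metropolis-Hastings with a symmetric Gaussian proposal."""
    rng = random.Random(seed)
    x, samples = 0.0, []
    for _ in range(n_samples):
        proposal = x + rng.gauss(0, step)                 # symmetric proposal
        log_accept = log_target(proposal) - log_target(x)
        if log_accept >= 0 or rng.random() < math.exp(log_accept):
            x = proposal                                  # accept the move
        samples.append(x)                                 # else keep current x
    return samples

draws = metropolis(20000)
mean = sum(draws) / len(draws)
print(round(mean, 2))   # should be near 0 for the N(0, 1) target
```

In practice one discards burn-in draws and monitors convergence diagnostics; probabilistic programming languages automate those steps along with the sampling itself.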
Further Reading & Context
The study of Bayesian statistics connects to many areas of mathematics and its applications. Understanding the foundational definitions — prior, likelihood, posterior — and the theorem that links them provides the basis for advanced work in statistical modeling, decision theory, and applied mathematics.
Historical development: the theorem is named for Thomas Bayes, whose essay on the problem was published posthumously in 1763; Pierre-Simon Laplace independently developed and popularized the approach. The modern axiomatic treatment provides rigor, while computational tools enable practical application.
In modern practice, Bayesian methods appear in graduate courses and research across pure and applied mathematics. Connections to computer science, physics, and engineering make the subject a versatile and important area of study. Mastery of the core results and techniques opens doors to research in statistics, machine learning, and beyond.
Recommended next steps: work through the standard results with full derivations (Bayes' theorem, conjugate updates), explore the conjugate families and MCMC methods introduced above, and practice with problems ranging from computational exercises to theoretical proofs. The interplay between probability, computation, and modeling is one of the subject's greatest rewards.
Deep Dive: Bayesian Statistics
This lesson extends the core ideas of Bayesian statistics with rigorous reasoning, edge-case checks, and application framing in statistics.
Practice Set
Practice. Derive one main result on this page and validate with a numeric or geometric check.
Goal. Confirm assumptions, transformation steps, and final interpretation.
References & Editorial Notes
Last editorial review: 2026-04-14.