Bayesian statistics -- Normal prior and likelihood: another example of conjugate priors.
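For reference, a sketch of the conjugate update in standard notation (not the lecture's own notation): with a Normal prior on the mean, mu ~ N(mu_0, tau_0^2), and observations x_1, ..., x_n ~ N(mu, sigma^2) with sigma^2 known, the posterior is again Normal:

\[
\mu \mid x_{1:n} \sim \mathcal{N}(\mu_n, \tau_n^2),
\qquad
\frac{1}{\tau_n^2} = \frac{1}{\tau_0^2} + \frac{n}{\sigma^2},
\qquad
\mu_n = \tau_n^2 \left( \frac{\mu_0}{\tau_0^2} + \frac{n \bar{x}}{\sigma^2} \right).
\]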
Explaining the continuous version of Bayes' theorem.
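A sketch of the statement, for a parameter theta with prior density p(theta) and likelihood p(x | theta) (notation assumed, not taken from the lecture):

\[
p(\theta \mid x) = \frac{p(x \mid \theta)\, p(\theta)}{\int p(x \mid \theta')\, p(\theta')\, d\theta'}.
\]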
Comparison with numerical simulations using PyMC:
HTML and Jupyter notebook.
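A minimal PyMC sketch of this kind of simulation, to compare against the conjugate formula above; the prior, noise scale, and synthetic data below are made up for illustration and are not the linked notebook's actual values.

    import numpy as np
    import pymc as pm

    rng = np.random.default_rng(0)
    data = rng.normal(loc=1.0, scale=2.0, size=50)   # synthetic observations

    with pm.Model() as model:
        mu = pm.Normal("mu", mu=0.0, sigma=10.0)                   # Normal prior on the mean
        obs = pm.Normal("obs", mu=mu, sigma=2.0, observed=data)    # Normal likelihood, known sigma
        idata = pm.sample(2000, tune=1000, chains=2, random_seed=0)

    # Posterior mean of mu should be close to the closed-form conjugate answer.
    print(float(idata.posterior["mu"].mean()))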
Explaining how to sample from the posterior distribution: the Metropolis-Hastings algorithm
does not require us to compute the scaling factor (the normalizing constant in Bayes' theorem)!
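A minimal random-walk Metropolis sketch (a symmetric-proposal special case of Metropolis-Hastings); the target below is a hypothetical unnormalized Normal-Normal posterior, not the lecture's example. The point is that only ratios of the unnormalized posterior appear, so the scaling factor cancels.

    import numpy as np

    def metropolis_hastings(log_unnorm_post, x0, n_samples, step=0.5, seed=0):
        """Random-walk Metropolis: only the *unnormalized* log posterior is needed."""
        rng = np.random.default_rng(seed)
        samples = np.empty(n_samples)
        x, logp = x0, log_unnorm_post(x0)
        for i in range(n_samples):
            x_new = x + step * rng.standard_normal()       # symmetric proposal
            logp_new = log_unnorm_post(x_new)
            # Acceptance uses a ratio of densities, so the normalizing
            # constant (the evidence) cancels out.
            if np.log(rng.uniform()) < logp_new - logp:
                x, logp = x_new, logp_new
            samples[i] = x
        return samples

    # Hypothetical example: Normal prior N(0, 10^2), Normal likelihood with known sigma = 2.
    data = np.array([0.5, 1.2, 0.8, 1.9, 1.1])
    def log_unnorm_post(mu):
        return -mu**2 / (2 * 10.0**2) - np.sum((data - mu) ** 2) / (2 * 2.0**2)

    draws = metropolis_hastings(log_unnorm_post, x0=0.0, n_samples=5000)
    print(draws[1000:].mean())   # discard burn-in, estimate the posterior mean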
Proof that the LMS (least mean squares) estimate is given by the conditional expectation. Conditional independence.
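A sketch of the statement and the key step of the proof, in standard notation (assumed, not the lecture's): among all estimators g(Y) of X based on Y, the conditional expectation minimizes the mean squared error, since

\[
\mathbb{E}\big[(X - g(Y))^2\big]
= \mathbb{E}\big[(X - \mathbb{E}[X \mid Y])^2\big]
+ \mathbb{E}\big[(\mathbb{E}[X \mid Y] - g(Y))^2\big]
\;\ge\; \mathbb{E}\big[(X - \mathbb{E}[X \mid Y])^2\big],
\]

where the cross term vanishes by the tower property, so the minimum is attained at g(Y) = E[X | Y].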
Finally, a little haiku to summarize the Bayesian approach:
Priors meet data,
Evidence refines belief—
Knowledge unfolds.
video