Probability and Statistics 2 (NMAI073)

Instructor: Robert Šámal

Coordinates
Lecture: Mon 10:40 in S3.
Tutorials run by Josef Tkadlec: Wed 9:00 (S8) and 10:40 (S10).
Syllabus
See info in SIS.
Last year version
To get a rough idea about the class, you may look at last year's version with all materials. (There will be some changes, though.)
Exam
Will be organized similarly to Prob&Stat 1: a written test covering both computational problems and theoretical understanding. It will be graded 1-5. After that, you will have the option to take an oral exam to improve your grade by one.
If you solve more homework problems than the required 50 %, you will get extra points for the exam test (up to 10 extra points if you get a full score on the homework problems).
Literature
See info in SIS.
Lecture notes (version Jan 29)

Personal notes by Jonáš Havelka from a previous version of the class -- in Czech only, with no guarantee from me or from the author. They look helpful for a review, though.
Log of lectures
Lecture 1 -- Sep 30, 2024
Definition of a Markov chain, examples. Multistep transition probabilities (Kolmogorov-Chapman recursion formula).
video
Illustration of Kolmogorov-Chapman formula: Jupyter notebook shown in class and its html version.
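As a complement to the notebook, here is a minimal Python sketch of the Kolmogorov-Chapman formula (the two-state chain is made up for illustration): since $P^{m+n} = P^m P^n$, the $n$-step transition probabilities are just the entries of the $n$-th matrix power.

    import numpy as np

    # A made-up two-state chain: P[i, j] = probability of moving from state i to j.
    P = np.array([[0.9, 0.1],
                  [0.5, 0.5]])

    # Kolmogorov-Chapman: P^(m+n) = P^m P^n, so the n-step transition
    # probabilities are the entries of the n-th matrix power of P.
    n = 10
    Pn = np.linalg.matrix_power(P, n)
    print(Pn[0, 1])  # probability of being in state 1 after n steps, starting from 0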
Lecture 2 -- Oct 7, 2024
Communicating states, equivalence relation. Classification of states: recurrent, transient. Example: random walk on a line. State 0 (as well as all the others) is recurrent, with the computation done in class; the key estimate is sketched at the end of this entry. (To show this we use the following theorem, which we have not proved: a state $i$ is recurrent if $\sum_{n=1}^\infty r_{i,i}(n) = \infty$.) However, the expected return time is infinite! (Not proved.)
In a finite MC, a communicating class is recurrent iff it is closed.
Theorem about convergence to stationary distribution. (just stated and conditions explained)
video (the recording stopped for a few seconds due to a full hard disk, but it seems nothing important is missing)
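For reference, the key estimate behind the in-class computation (a standard application of Stirling's formula, not taken verbatim from the notes): for the simple random walk on a line, a return to 0 is possible only after an even number of steps, and $r_{0,0}(2n) = \binom{2n}{n}\, 2^{-2n} \sim \frac{1}{\sqrt{\pi n}}$, hence $\sum_{n=1}^\infty r_{0,0}(2n) = \infty$ and the theorem above gives recurrence of state 0.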
Lecture 3 -- Oct 14, 2024
Discussion of the theorem about convergence. Application to generating random objects (MCMC, Metropolis-Hastings algorithm). Probability of absorption, time to absorption. Example: time to absorption on a path with one end absorbing; a numerical sketch appears below.
video
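A minimal numerical sketch of the absorption-time example (the exact setup -- path length, reflecting far end -- is my assumption, not necessarily the one from class): for the transient states, the expected absorption times satisfy $t_i = 1 + \sum_j p_{i,j} t_j$, i.e. $(I-Q)\,t = \mathbf{1}$, where $Q$ is the transition matrix restricted to the transient states.

    import numpy as np

    # Path 0, 1, ..., N with state 0 absorbing; interior states step left/right
    # with probability 1/2 each, and state N reflects back to N-1.
    # (This particular setup is an assumption for illustration.)
    N = 10
    Q = np.zeros((N, N))           # transitions among the transient states 1..N
    for i in range(1, N + 1):
        if i > 1:
            Q[i - 1, i - 2] = 0.5  # step towards 0
        if i < N:
            Q[i - 1, i] = 0.5      # step away from 0
    Q[N - 1, N - 2] = 1.0          # reflection at state N

    # Expected times to absorption solve (I - Q) t = 1.
    t = np.linalg.solve(np.eye(N) - Q, np.ones(N))
    print(t)  # t[i-1] = expected time to hit 0 starting from state i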
Lecture 4 -- Oct 21, 2024
Randomized algorithm for 2-SAT (with hints about 3-SAT); a sketch appears at the end of this entry. Final comments about Markov chains: hidden Markov models (HMMs). What is probability: frequentist vs. Bayesian approach, subjective probability.
video
YouTube video on Bayesian principles
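A small Python sketch of the random-walk algorithm for 2-SAT (the function name and clause encoding are mine): repeatedly pick an unsatisfied clause and flip one of its two variables at random; on a satisfiable formula, $2n^2$ steps find a satisfying assignment with probability at least $1/2$.

    import random

    def two_sat_randomwalk(clauses, n, max_steps=None):
        """Random-walk 2-SAT sketch.

        clauses: list of pairs of nonzero ints; literal +i means x_i, -i means not x_i.
        n: number of variables. Returns a satisfying assignment (dict) or None.
        """
        if max_steps is None:
            max_steps = 2 * n * n  # success probability >= 1/2 on satisfiable inputs
        assign = {i: random.choice([False, True]) for i in range(1, n + 1)}
        for _ in range(max_steps):
            unsat = [c for c in clauses
                     if not any((lit > 0) == assign[abs(lit)] for lit in c)]
            if not unsat:
                return assign
            lit = random.choice(random.choice(unsat))  # random literal of a random unsatisfied clause
            assign[abs(lit)] = not assign[abs(lit)]    # flip that variable
        return None  # probably unsatisfiable

    # Tiny illustrative instance: (x1 or x2) and (not x1 or x2) and (x1 or not x2)
    print(two_sat_randomwalk([(1, 2), (-1, 2), (1, -2)], n=2))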
Lecture 5 -- Nov 4, 2024
Intro to Bayesian statistics: basic principles of Bayesian inference (prior, likelihood, posterior). MAP and LMS methods for obtaining point estimates. (Computations just with MAP so far.) Examples: naive Bayes spam classifier, coin tossing with a Beta prior (the conjugacy update is summarized below).
video
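For reference, the coin-tossing computation in a nutshell (standard conjugacy facts, not specific to the notes): with a $\mathrm{Beta}(a,b)$ prior and an observation of $h$ heads and $t$ tails, the posterior is $\mathrm{Beta}(a+h,\, b+t)$, and the MAP estimate is its mode, $\hat{p} = \frac{a+h-1}{a+b+h+t-2}$ (provided $a+h > 1$ and $b+t > 1$).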
Lecture 6 -- Nov 11, 2024
Bayesian statistics -- Normal prior and likelihood: another example of conjugate priors. Explaining the continuous version of Bayes' theorem. Comparison with numerical simulations using PyMC: html and jupyter notebook. Explaining how to sample from the posterior distribution: the Metropolis-Hastings algorithm does not require us to compute the scaling factor (a small sketch appears at the end of this entry)! Proof that the LMS estimate is given by the conditional expectation. Conditional independence.
Finally, a little haiku to summarize the Bayesian approach:

Priors meet data,
Evidence refines belief—
Knowledge unfolds.


video
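To illustrate why the scaling factor never matters, here is a minimal random-walk Metropolis-Hastings sketch in plain Python (the target density and parameters are made up; burn-in is ignored for brevity): the acceptance probability uses only the ratio $f(y)/f(x)$, in which any constant factor cancels.

    import math
    import random

    def metropolis_hastings(f_unnorm, x0, steps=10_000, step_size=1.0):
        """Samples approximately from the density proportional to f_unnorm;
        the normalizing constant never appears."""
        x, samples = x0, []
        for _ in range(steps):
            y = x + random.gauss(0.0, step_size)      # symmetric proposal
            if random.random() < min(1.0, f_unnorm(y) / f_unnorm(x)):
                x = y                                  # accept; otherwise keep x
            samples.append(x)
        return samples

    # Illustrative unnormalized posterior: exp(-(x - 2)^2 / 2) is proportional
    # to N(2, 1) without its 1/sqrt(2*pi) factor.
    draws = metropolis_hastings(lambda x: math.exp(-(x - 2.0) ** 2 / 2), x0=0.0)
    print(sum(draws) / len(draws))  # should be close to 2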
Lecture 7 -- Nov 18, 2024
Conditional expectation (as a random variable): law of iterated expectation, conditional expectation as an estimator, Eve's rule for variance.
Birthday paradox.
video
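For the birthday paradox, a short numerical check (standard computation, assuming 365 equally likely birthdays):

    import math

    # P(at least two of n people share a birthday) = 1 - P(all n birthdays distinct)
    def birthday_collision(n, days=365):
        p_distinct = math.prod((days - k) / days for k in range(n))
        return 1 - p_distinct

    print(birthday_collision(23))  # ~0.507: a shared birthday is already more likely than not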