Probability and Statistics 2 (NMAI073)
Instructor: Robert Šámal
- Coordinates
-
Lecture: Tue 15:40 in S4.
Tutorial: Thu 17:20 in S7.
Homework: in the Postal Owl
- Syllabus
-
See info in SIS.
- Last year version
-
To get a rough idea of the class, you may look at the
Moodle page of last year
with all the materials. (There will be some changes, though.)
- Literature
-
See info in SIS.
Lecture notes (rough first version) (last update: Nov 10)
Personal notes of Jonáš Havelka -- in Czech only, with no guarantee from me or from the author. They look helpful though, e.g. for review.
- Log of lectures
-
- Lecture 1 -- Oct 4, 2022
-
Definition of a Markov chain, examples. Multistep transition probabilities (Kolmogorov-Chapman recursion formula).
Accessible states, communicating states.
video (sorry for the low resolution)
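For illustration, a minimal sketch of multistep transition probabilities (the two-state chain below is made up for the example): by the recursion formula, the $k$-step transition matrix is simply the $k$-th matrix power of the one-step matrix $P$.
```python
import numpy as np

# Made-up two-state chain, only to illustrate multistep transition probabilities.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# By the recursion formula, the k-step transition matrix is P^k.
P5 = np.linalg.matrix_power(P, 5)
print(P5[0, 1])   # P(X_5 = 1 | X_0 = 0)
```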
- Lecture 2 -- Oct 11, 2022
-
Classification of states: recurrent, transient, periodic.
Theorem about convergence to the stationary distribution.
video
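A small illustration of the convergence theorem (the three-state chain below is made up): for an irreducible aperiodic chain, every row of $P^n$ converges to the stationary distribution $\pi$, which solves $\pi P = \pi$ with $\sum_i \pi_i = 1$.
```python
import numpy as np

# Made-up irreducible, aperiodic three-state chain.
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.1, 0.3, 0.6]])

# Solve pi P = pi together with sum(pi) = 1 (stacked linear system, least squares).
A = np.vstack([P.T - np.eye(3), np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi = np.linalg.lstsq(A, b, rcond=None)[0]

print(pi)                                 # stationary distribution
print(np.linalg.matrix_power(P, 50)[0])   # each row of P^n converges to pi
```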
- No Lecture on Oct 18, 2022
-
Instead, see the tutorial web page for some study materials.
- Lecture 3 -- Oct 25, 2022
-
Probability of absorption, time to absorption.
Application: probabilistic algorithm for 2-SAT
and 3-SAT (to be finished next time).
video
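A minimal sketch of the standard random-walk algorithm for 2-SAT (the usual one; the lecture's presentation may differ, and the toy formula and the $2n^2$ step bound are just for illustration): repeatedly pick an unsatisfied clause and flip one of its two variables uniformly at random; for a satisfiable formula on $n$ variables the expected number of flips is $O(n^2)$.
```python
import random

def random_walk_2sat(n, clauses, max_steps=None):
    """Random-walk 2-SAT: flip a random variable of some unsatisfied clause."""
    if max_steps is None:
        max_steps = 2 * n * n
    # Literal +i / -i means x_i / not x_i; index 0 of assign is unused.
    assign = [random.choice([False, True]) for _ in range(n + 1)]

    def satisfied(lit):
        return assign[abs(lit)] == (lit > 0)

    for _ in range(max_steps):
        unsat = [c for c in clauses if not any(satisfied(l) for l in c)]
        if not unsat:
            return assign[1:]          # satisfying assignment found
        lit = random.choice(random.choice(unsat))
        assign[abs(lit)] = not assign[abs(lit)]
    return None                        # no assignment found; likely unsatisfiable

# Made-up toy formula: (x1 or x2) and (not x1 or x3) and (not x2 or not x3).
print(random_walk_2sat(3, [(1, 2), (-1, 3), (-2, -3)]))
```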
- Lecture 4 -- Nov 1, 2022
-
Randomized algorithm for 3-SAT.
Intro to Bayesian statistics: basic principles of Bayesian inference,
MAP and LMS methods to obtain point estimates.
video
youtube video on Bayesian principles
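A standard worked example of the two point estimates (not taken from the lecture; the Beta-Bernoulli model is just one convenient choice): with prior $\theta \sim \mathrm{Beta}(a,b)$ and $k$ successes in $n$ Bernoulli trials, the posterior is $\mathrm{Beta}(a+k,\, b+n-k)$, and
$$\hat\theta_{\mathrm{MAP}} = \frac{a+k-1}{a+b+n-2} \quad (\mbox{for } a+k,\, b+n-k > 1), \qquad \hat\theta_{\mathrm{LMS}} = E[\theta \mid \mbox{data}] = \frac{a+k}{a+b+n}.$$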
- Lecture 5 -- Nov 8, 2022
-
Bayesian statistics -- example computations. (Conjugate priors.)
video
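One standard conjugate pair, stated here as an illustration (the lecture's examples may have been different): a Gamma prior (with $\beta$ the rate parameter) is conjugate to Poisson observations,
$$\lambda \sim \mathrm{Gamma}(\alpha,\beta), \quad X_1,\dots,X_n \mid \lambda \sim \mathrm{Pois}(\lambda) \quad \Longrightarrow \quad \lambda \mid X_1,\dots,X_n \sim \mathrm{Gamma}\Big(\alpha + \sum_{i=1}^n X_i,\; \beta + n\Big),$$
so the posterior mean is $(\alpha + \sum_i X_i)/(\beta + n)$.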
- Lecture 6 -- Nov 15, 2022
-
Bayesian statistics -- final comments (MCMC).
Conditional expectation (as a random variable) -- Law of iterated expectation, Eve's rule for variance.
video
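In symbols, the two identities are
$$E[X] = E\big[E[X \mid Y]\big], \qquad \mathrm{var}(X) = E\big[\mathrm{var}(X \mid Y)\big] + \mathrm{var}\big(E[X \mid Y]\big).$$
A quick made-up check: if $N \sim \mathrm{Pois}(\lambda)$ and $X \mid N \sim \mathrm{Bin}(N,p)$, then $E[X] = E[Np] = \lambda p$ and $\mathrm{var}(X) = E[Np(1-p)] + \mathrm{var}(Np) = \lambda p(1-p) + \lambda p^2 = \lambda p$, consistent with the thinned variable being $\mathrm{Pois}(\lambda p)$.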
- Lecture 7 -- Nov 22, 2022
-
Bernoulli and Poisson processes.
video
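One way the two processes are related (a standard fact, stated here for reference): if each of $n$ time slots independently contains an arrival with probability $\lambda/n$, the number of arrivals is $\mathrm{Bin}(n, \lambda/n)$, and
$$\binom{n}{k}\Big(\frac{\lambda}{n}\Big)^k\Big(1-\frac{\lambda}{n}\Big)^{n-k} \longrightarrow e^{-\lambda}\frac{\lambda^k}{k!} \quad \mbox{as } n \to \infty,$$
so a Bernoulli process on a fine time grid approximates the Poisson process of rate $\lambda$.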
- Lecture 8 -- Nov 29, 2022
-
More on Poisson processes -- splitting, merging.
video
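A small simulation sketch of both operations (the rates and the splitting probability below are made up; interarrival times are drawn as exponentials):
```python
import random

def poisson_arrivals(lam, t):
    """Arrival times of a rate-lam Poisson process on [0, t], via Exp(lam) gaps."""
    arrivals, s = [], 0.0
    while True:
        s += random.expovariate(lam)
        if s > t:
            return arrivals
        arrivals.append(s)

T = 1000.0
# Merging: superposition of independent rate-1 and rate-2 processes has rate 3.
merged = sorted(poisson_arrivals(1.0, T) + poisson_arrivals(2.0, T))
print(len(merged) / T)    # close to 3.0

# Splitting: keeping each arrival of a rate-3 process independently with
# probability 1/3 gives a rate-1 process.
kept = [s for s in poisson_arrivals(3.0, T) if random.random() < 1 / 3]
print(len(kept) / T)      # close to 1.0
```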
- Lecture 9 -- Dec 6, 2022
-
Balls & bins: birthday paradox, estimate of the maximum load, applications (bucket sort and hashing).
There is a small mistake in the proof of the max-load estimate.
I claimed that $P(\mbox{number of balls in the first bin} \ge M) \le P(Bin(n,1/n) = M)$.
This is false. However, the estimate
$$P(\mbox{number of balls in the first bin} \ge M) \le \binom{n}{M} \frac{1}{n^M}$$ is true for the following reason:
if there are $\ge M$ balls in the first bin, then some $M$-element set of balls went entirely into this bin
(and we keep silent about what happens to the rest). For each fixed $M$-element set this happens with probability
$\frac{1}{n^M}$, so it is enough to take the union bound over all $\binom{n}{M}$ sets.
Sorry about that; the other proof seemed easier to understand, except, well, it is wrong :(.
video
calculator for the birthday paradox
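For the birthday paradox, a quick sanity-check computation (the numbers $k = 23$, $n = 365$ are the classical ones):
```python
import math

def p_all_distinct(k, n):
    """Probability that k balls thrown u.a.r. into n bins land in distinct bins."""
    p = 1.0
    for i in range(k):
        p *= (n - i) / n
    return p

k, n = 23, 365
print(1 - p_all_distinct(k, n))              # ~0.507: some bin gets two balls
print(1 - math.exp(-k * (k - 1) / (2 * n)))  # ~0.500: the exp(-k(k-1)/(2n)) approximation
```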
- Lecture 10 -- Dec 13, 2022
-
Balls&bins: Poisson approximation.
Permutation test.
video
The animation I showed comes from
wiki page on Permutation tests.
A nice visual explanation
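A minimal sketch of a two-sample permutation test (the data below are made up; the difference of sample means serves as the test statistic): under the null hypothesis both samples come from the same distribution, so every relabelling of the pooled data is equally likely, and the p-value is the fraction of relabellings at least as extreme as the observed split.
```python
import random

def permutation_test(x, y, reps=10000):
    """Two-sided permutation test using the difference of sample means."""
    observed = abs(sum(x) / len(x) - sum(y) / len(y))
    pooled = x + y
    extreme = 0
    for _ in range(reps):
        random.shuffle(pooled)
        px, py = pooled[:len(x)], pooled[len(x):]
        if abs(sum(px) / len(px) - sum(py) / len(py)) >= observed:
            extreme += 1
    return extreme / reps

x = [5.1, 4.9, 6.2, 5.8, 5.5]   # made-up measurements, group A
y = [4.2, 4.8, 4.5, 5.0, 4.4]   # made-up measurements, group B
print(permutation_test(x, y))    # small p-value: the groups likely differ
```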
- Lecture 11 -- Dec 20, 2022
-
Nonparametric statistics: Permutation test, Sign test, Wilcoxon's signed rank test, Mann-Whitney U-test.
Simpson's paradox & consequences for the design of statistical experiments.
video
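A made-up numeric illustration of Simpson's paradox: treatment A cures 9/10 easy cases (90%) and 30/100 hard cases (30%), while treatment B cures 80/100 easy cases (80%) and 2/10 hard cases (20%). A is better on both kinds of cases, yet overall A cures 39/110 (about 35%) and B cures 82/110 (about 75%), simply because A was given mostly the hard cases. Hence the need to randomize (or stratify) when designing the experiment.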
- Lecture 12 -- Jan 3, 2023
-
Central limit theorem -- (sketch of) proof using moment generating functions.
Proof of the Chernoff bound.
Shannon's source coding theorem & entropy.
video
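The first step of the Chernoff bound, for reference: for any $t > 0$,
$$P(X \ge a) = P\big(e^{tX} \ge e^{ta}\big) \le e^{-ta}\, E\big[e^{tX}\big]$$
by Markov's inequality, and one then minimizes the right-hand side over $t$; for a sum of independent variables $E[e^{tX}]$ factors into a product, which is what makes the bound exponentially small.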