49th KAM Mathematical Colloquium

ENTROPY GROWTH FOR SUMS OF INDEPENDENT RANDOM VARIABLES

December 16, 2003 12:30 Lecture hall S5

Abstract

Entropy plays a central role in the analysis of many random and physical systems: the entropy measures the disorder of such a system. For ``natural'' processes we expect entropy to grow as the process evolves. The most natural random process is the sequence of normalised sums
$\frac{1}{\sqrt{n}} \sum_{i=1}^{n} X_i$

of independent identically distributed square-integrable random variables, which converges in distribution to a Gaussian (normal). It was proved in the late 1940s by Shannon that the entropy for $n=2$ is at least as large as that for $n=1$, but there seemed to be no way to determine whether the entropy continues to increase. In this lecture I will briefly introduce the concept of entropy and explain a new method for analysing entropy production which resolves the problem left by Shannon and has also been used to obtain the first quantitative results on entropy production for a wide class of random variables. No knowledge beyond elementary probability and vector calculus will be assumed.
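The growth of entropy along the normalised sums can be illustrated numerically. The sketch below is a Monte-Carlo experiment, not the analytic method of the lecture: it estimates the differential entropy of $\frac{1}{\sqrt{n}} \sum_{i=1}^{n} X_i$ by a histogram, for $X_i$ uniform with variance 1, and compares with the entropy $\frac{1}{2}\log(2\pi e)$ of the limiting standard Gaussian. The helper names (`entropy_estimate`, `normalised_sum_samples`) and all numerical parameters are illustrative choices, not from the lecture.

```python
import math
import random

def entropy_estimate(samples, bins=60, lo=-4.0, hi=4.0):
    """Histogram (plug-in) estimate of differential entropy, in nats."""
    width = (hi - lo) / bins
    counts = [0] * bins
    kept = 0
    for x in samples:
        if lo <= x < hi:
            counts[int((x - lo) / width)] += 1
            kept += 1
    h = 0.0
    for c in counts:
        if c:
            p = c / kept                   # probability mass of the bin
            h -= p * math.log(p / width)   # density estimate is p / width
    return h

def normalised_sum_samples(n, trials=100_000, seed=0):
    """Samples of (X_1 + ... + X_n) / sqrt(n), X_i i.i.d. uniform, variance 1."""
    rng = random.Random(seed)
    s3 = math.sqrt(3.0)   # uniform on [-sqrt(3), sqrt(3)]: mean 0, variance 1
    root_n = math.sqrt(n)
    return [sum(rng.uniform(-s3, s3) for _ in range(n)) / root_n
            for _ in range(trials)]

h1 = entropy_estimate(normalised_sum_samples(1))   # ~ log(2*sqrt(3)) = 1.2425
h2 = entropy_estimate(normalised_sum_samples(2))   # larger, per Shannon
h_gauss = 0.5 * math.log(2 * math.pi * math.e)     # N(0,1) entropy = 1.4189
```

One should observe $h_1 < h_2$, with both below the Gaussian maximum (up to small estimation error), consistent with Shannon's $n=2$ versus $n=1$ comparison.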

THERE ARE INFINITELY MANY IRRATIONAL VALUES OF ZETA AT THE ODD INTEGERS

December 16, 2003 12:30 Lecture hall S5

Abstract

The zeta function is defined, for $s>1$, by
$$\zeta(s) = \sum_{n=1}^\infty \frac{1}{n^s}.$$

It is universally believed that its values at the integers $s \ge 2$ are irrational (and even transcendental). This is known for the even integers, because those values are rational multiples of the appropriate powers of $\pi$. Among the odd integers, only $\zeta(3)$ is known to be irrational: this was shown by Ap\'ery in 1978. This talk will describe an elementary construction, found by the author and T. Rivoal, which suffices to show that infinitely many of the values at odd integers are irrational. Nothing will be assumed beyond a first course in complex analysis.
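The series definition can be checked numerically. The snippet below is a naive partial-sum computation (unrelated to Ap\'ery's proof or the construction of the talk; the name `zeta_partial` is illustrative): it approximates the known even value $\zeta(2) = \pi^2/6$ and Ap\'ery's constant $\zeta(3)$.

```python
import math

def zeta_partial(s, terms=1_000_000):
    """Naive partial sum of the Dirichlet series for zeta(s), valid for s > 1.

    The tail after N terms is of order N**(1 - s), so for s = 2 this is
    accurate to roughly 1/N; it is a crude numerical check only.
    """
    return sum(1.0 / n ** s for n in range(1, terms + 1))

z2 = zeta_partial(2)   # should be close to pi**2 / 6
z3 = zeta_partial(3)   # Apery's constant, approximately 1.2020569...
```

For $s=3$ the tail is of order $1/(2N^2)$, so the approximation to Ap\'ery's constant is far better than for $s=2$.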