Information Theory and Predictability
(MATH-GA 3011.001)



Instructor: Prof. Richard Kleeman (Office: 929 Warren Weaver)
Location: 1314 Warren Weaver
Time: Tuesday 1:25-3:15pm, Spring 2012.
Text: Cover and Thomas, Elements of Information Theory (Wiley), first or second edition (1991 or 2006). The following review paper is also used in the course.
Assessment: Attendance only. This is a seminar course.


Syllabus

There will be 11 lectures. The contents are described briefly below. Relatively complete lecture notes are linked below as PDF files.

Lecture 1
Introduction. Overview of Applications. Basic axiomatic derivation following Shannon. Introduction to the information content of codes. Lecture Notes.
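As a small taste of the material, Shannon's entropy of a discrete distribution can be computed in a few lines. This sketch is illustrative only and not part of the lecture notes:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(p) = -sum_i p_i log2(p_i), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly one bit; four equally likely outcomes carry two.
print(shannon_entropy([0.5, 0.5]))    # 1.0
print(shannon_entropy([0.25] * 4))    # 2.0
```

In coding terms, the entropy is the minimum average number of bits needed to identify an outcome, which is the link to codes developed in this lecture.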

Lecture 2
Entropic functionals and their properties. Lecture Notes.
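One functional central to this lecture is the relative entropy (Kullback-Leibler divergence). A minimal sketch of its computation and its key property, non-negativity (the example distributions are my own):

```python
from math import log2

def kl_divergence(p, q):
    """Relative entropy D(p||q) = sum_i p_i log2(p_i / q_i), in bits.
    Gibbs' inequality: D(p||q) >= 0, with equality iff p == q."""
    return sum(pi * log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.25, 0.25]
q = [1 / 3, 1 / 3, 1 / 3]
d_pq = kl_divergence(p, q)   # positive: the distributions differ
d_pp = kl_divergence(p, p)   # zero: no divergence from itself
```

Note that D(p||q) and D(q||p) generally differ, so relative entropy is not a metric.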

Lecture 3
Stochastic Processes. Lecture Notes.
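For a stationary Markov chain the entropy rate has the closed form H = -sum_i pi_i sum_j P_ij log2 P_ij, with pi the stationary distribution. A two-state sketch (the transition probabilities are illustrative values of my own):

```python
from math import log2

def entropy_rate(P):
    """Entropy rate of a stationary 2-state Markov chain with transition
    matrix P: H = -sum_i pi_i sum_j P_ij log2 P_ij (bits per step)."""
    a, b = P[0][1], P[1][0]               # switching probabilities
    pi = [b / (a + b), a / (a + b)]       # stationary distribution
    row_entropy = lambda row: -sum(p * log2(p) for p in row if p > 0)
    return sum(pi_i * row_entropy(row) for pi_i, row in zip(pi, P))

# A symmetric chain that switches state with probability 0.1: the rate is
# the binary entropy H(0.1), roughly 0.469 bits per step.
H = entropy_rate([[0.9, 0.1], [0.1, 0.9]])
```

An i.i.d. fair-coin process (all entries 0.5) recovers the expected rate of exactly one bit per step.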

Lecture 4
Data Compression. Lecture Notes.
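The central result here is the source coding theorem: an optimal prefix code has expected length within one bit of the entropy. A compact Huffman-coding sketch (the dyadic example distribution, chosen so the code meets the entropy bound exactly, is my own):

```python
import heapq

def huffman_code(freqs):
    """Build a Huffman prefix code for a {symbol: probability} map.
    Returns {symbol: bitstring}.  Repeatedly merges the two lightest trees."""
    heap = [(w, i, {s: ""}) for i, (s, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)                  # keeps tuple comparison well defined
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)
        w2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, (w1 + w2, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

# Dyadic probabilities: the source entropy is 1.75 bits, and the Huffman
# code achieves an expected length of exactly 1.75 bits/symbol.
freqs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
code = huffman_code(freqs)
avg_len = sum(freqs[s] * len(code[s]) for s in freqs)
```

For non-dyadic probabilities the expected length exceeds the entropy by strictly less than one bit.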

Lecture 5
Differential Entropy. The limiting process and coarse graining. Invariance properties. Lecture Notes.
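The coarse-graining limit states that if X is quantized into bins of width delta, then H(X^delta) + log2(delta) tends to the differential entropy h(X) as delta shrinks. A quick numerical check for the uniform density on [0, 1), where h(X) = 0 exactly:

```python
from math import log2

# Quantize X ~ Uniform[0, 1) into 256 bins of width delta = 1/256.
delta = 1 / 256
probs = [delta] * 256                            # each bin equally likely
H_quantized = -sum(p * log2(p) for p in probs)   # 8 bits to name a bin
h_estimate = H_quantized + log2(delta)           # recovers h(X) = 0
print(H_quantized, h_estimate)  # 8.0 0.0
```

The log2(delta) correction is why differential entropy, unlike discrete entropy, can be zero or negative and changes under rescaling of the variable.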

Lecture 6
Maximum entropy and statistical mechanics. Lecture Notes.
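Maximizing entropy subject to a mean constraint yields an exponential family p_k proportional to exp(-beta*k), the Boltzmann form familiar from statistical mechanics. A sketch that solves for the multiplier beta by bisection (the loaded-die example is illustrative, not from the notes):

```python
import math

def maxent_die(mean, faces=6, tol=1e-12):
    """Maximum-entropy distribution on {1, ..., faces} with a prescribed mean.
    The maximizer is p_k proportional to exp(-beta * k); beta is found by
    bisection on the mean constraint."""
    ks = range(1, faces + 1)
    def moment(beta):
        w = [math.exp(-beta * k) for k in ks]
        return sum(k * wk for k, wk in zip(ks, w)) / sum(w)
    lo, hi = -50.0, 50.0                 # moment(beta) is decreasing in beta
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if moment(mid) > mean:
            lo = mid
        else:
            hi = mid
    beta = 0.5 * (lo + hi)
    w = [math.exp(-beta * k) for k in ks]
    z = sum(w)
    return [wk / z for wk in w]

# A die constrained to average 4.5 tilts exponentially toward high faces;
# a mean of 3.5 makes the constraint inactive, recovering the uniform die.
p_loaded = maxent_die(4.5)
p_fair = maxent_die(3.5)
```

With no constraint beyond normalization, the uniform distribution (beta = 0) is the entropy maximizer, as the second call confirms.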

Lecture 7
Gaussian special case. Lecture Notes.
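For a Gaussian the differential entropy has the closed form h = (1/2) log(2 pi e sigma^2) nats. A sketch that checks this against a Monte Carlo estimate of -E[log p(X)] (the sample size and seed are arbitrary choices of mine):

```python
import math
import random

def gaussian_entropy(sigma):
    """Differential entropy of N(mu, sigma^2) in nats: 0.5*log(2*pi*e*sigma^2)."""
    return 0.5 * math.log(2 * math.pi * math.e * sigma ** 2)

# Monte Carlo check: h(X) = -E[log p(X)], averaged over N(0, sigma^2) samples.
random.seed(0)
sigma = 2.0
n = 100_000
log_norm = -0.5 * math.log(2 * math.pi * sigma ** 2)
samples = [random.gauss(0.0, sigma) for _ in range(n)]
estimate = -sum(log_norm - s ** 2 / (2 * sigma ** 2) for s in samples) / n
# estimate agrees with the closed form to a few decimal places
```

The Gaussian is also the maximum-entropy density for a fixed variance, which is what makes this special case so useful in the prediction lectures.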

Lecture 8
Statistical prediction of dynamical systems. Introduction and commonly used practical methodologies. Lecture Notes.

Lecture 9
Theoretical predictability concepts. Lyapunov exponents and their relation to information theory. An information-theoretic framework for studying predictability. Lecture Notes.
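For one-dimensional maps the largest Lyapunov exponent is the time average of log |f'(x)| along an orbit; measured in base 2 it is the rate, in bits per step, at which initial-condition information is lost. A sketch for the logistic map (initial condition and iteration counts are illustrative choices):

```python
import math

def lyapunov_logistic(r, x0=0.3, n_transient=1000, n=100_000):
    """Largest Lyapunov exponent of the logistic map x -> r*x*(1-x),
    estimated as the orbit average of log |f'(x)| = log |r*(1 - 2x)|."""
    x = x0
    for _ in range(n_transient):        # discard transient behaviour
        x = r * x * (1 - x)
    total = 0.0
    for _ in range(n):
        total += math.log(abs(r * (1 - 2 * x)))
        x = r * x * (1 - x)
    return total / n

# At r = 4 the exact exponent is ln 2: one bit of initial-condition
# precision is consumed per iteration, bounding the predictability horizon.
lam = lyapunov_logistic(4.0)
```

This per-step loss of bits is the bridge between Lyapunov exponents and the information-theoretic predictability framework of this lecture.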

Lecture 10
Application of information-theoretic techniques to a variety of simple but physically relevant dynamical systems. Lecture Notes.

Lecture 11
Information transfer. Empirical and formal approaches. Application to weather prediction. Lecture Notes.
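A common empirical measure of information transfer is the transfer entropy T_{X->Y}. Below is a plug-in estimator for binary series with history length 1, exercised on a toy coupled process of my own construction (not taken from the lectures), in which X drives Y but not vice versa:

```python
import random
from collections import Counter
from math import log2

def transfer_entropy(x, y):
    """Plug-in transfer entropy T_{X->Y} in bits, history length 1:
    T = sum over (y', y, x) of p(y', y, x) * log2[ p(y'|y, x) / p(y'|y) ]."""
    n = len(y) - 1
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))
    pairs_yx = Counter(zip(y[:-1], x[:-1]))
    pairs_yy = Counter(zip(y[1:], y[:-1]))
    singles_y = Counter(y[:-1])
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n
        p_full = c / pairs_yx[(y0, x0)]              # p(y'|y, x)
        p_self = pairs_yy[(y1, y0)] / singles_y[y0]  # p(y'|y)
        te += p_joint * log2(p_full / p_self)
    return te

# Toy coupled process: y copies the previous x with 10% error, while x is
# i.i.d., so information flows from x to y only.
random.seed(1)
x = [random.randint(0, 1) for _ in range(20000)]
y = [0] + [xi if random.random() < 0.9 else 1 - xi for xi in x[:-1]]
t_xy = transfer_entropy(x, y)   # roughly 1 - H(0.1), about 0.53 bits
t_yx = transfer_entropy(y, x)   # near zero (small positive plug-in bias)
```

The asymmetry t_xy >> t_yx is what distinguishes transfer entropy from symmetric measures such as mutual information, and is the property exploited in the weather-prediction application.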