Information Theory, Predictability and Disequilibrium
(MATH-GA 2830.002)
Instructor: Prof. Richard Kleeman (Office: 929 Warren Weaver)
Location: 905 Warren Weaver
Time: Thursday 11:00-12:50pm, Fall 2017.
Text: Cover and Thomas, Elements of Information Theory (Wiley), first or second edition (1991 or 2006).
Assessment: Attendance only. This is a seminar course.


Syllabus

There will be 11 lectures; their contents are described briefly below. Relatively complete lecture notes (PDF files) are linked below.

Lecture 1
Introduction. Overview of Applications. Basic axiomatic derivation following Shannon. Introduction to the information content of codes. Lecture Notes.
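The entropy that Shannon's axioms single out has a direct numerical form, H(p) = -Σ pᵢ log₂ pᵢ, measured in bits. A minimal sketch (illustrative only, not part of the course materials):

```python
import math

def shannon_entropy(p, base=2):
    """Shannon entropy H(p) = -sum p_i log p_i, with 0 log 0 := 0."""
    return -sum(pi * math.log(pi, base) for pi in p if pi > 0)

# A fair coin carries exactly one bit of information per toss.
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A biased coin is more predictable, hence less informative.
print(shannon_entropy([0.9, 0.1]))   # about 0.469
```

The base-2 logarithm gives bits, the connection to code length that the lecture's coding discussion makes precise; natural logarithms give nats.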

Lecture 2
Entropic functionals and their properties. Lecture Notes.
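Among the entropic functionals this lecture covers, relative entropy (Kullback-Leibler divergence) D(p‖q) = Σ pᵢ log(pᵢ/qᵢ) is central: it is non-negative and vanishes only when p = q (Gibbs' inequality). A quick numerical check of these standard properties, as a sketch:

```python
import math

def kl_divergence(p, q):
    """Relative entropy D(p||q) in bits; assumes q_i > 0 wherever p_i > 0."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.3, 0.2]
u = [1/3, 1/3, 1/3]

print(kl_divergence(p, u))  # positive: p is distinguishable from uniform
print(kl_divergence(p, p))  # 0.0: the divergence vanishes iff p = q
```

Note the asymmetry: D(p‖q) ≠ D(q‖p) in general, so relative entropy is not a metric.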

Lecture 3
Stochastic Processes. Lecture Notes.

Lecture 4
Data Compression. Lecture Notes.

Lecture 5
Differential Entropy. The limiting process and coarse graining. Invariance properties. Lecture Notes.

Lecture 6
Maximum entropy and statistical mechanics. Lecture Notes.

Lecture 7
Gaussian special case. Lecture Notes.
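For a Gaussian N(μ, σ²) the differential entropy has the well-known closed form h = ½ ln(2πeσ²) nats, which depends only on the variance. A small sketch of this formula (illustrative, not from the lecture notes):

```python
import math

def gaussian_entropy(sigma):
    """Differential entropy of N(mu, sigma^2) in nats: 0.5 * ln(2*pi*e*sigma^2)."""
    return 0.5 * math.log(2 * math.pi * math.e * sigma ** 2)

print(gaussian_entropy(1.0))   # about 1.419 nats
# Unlike discrete entropy, differential entropy can be negative:
print(gaussian_entropy(0.1))   # about -0.884 nats
```

The possibility of negative values reflects the coarse-graining and invariance subtleties raised in Lecture 5: differential entropy is not a limit of discrete entropies without an additive correction.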

Lecture 8
Statistical prediction of dynamical systems. Introduction and commonly used practical methodologies. Lecture Notes.

Lecture 9
Theoretical predictability concepts. Lyapunov exponents and their relation to information theory and predictability. An information theory framework for studying predictability. Lecture Notes.
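A Lyapunov exponent can be estimated numerically as the long-run orbit average of log|f′(x)|; for the fully chaotic logistic map x → 4x(1−x) the exact value is ln 2, which ties directly to an information-production rate of one bit per iteration. A sketch under these standard assumptions (the map, seed, and iteration counts below are illustrative choices, not from the course):

```python
import math

def logistic_lyapunov(r=4.0, x0=0.1, n_burn=1000, n_iter=100_000):
    """Estimate the Lyapunov exponent of x -> r*x*(1-x) as the
    long-run average of log|f'(x)| = log|r*(1 - 2x)| along one orbit."""
    x = x0
    for _ in range(n_burn):              # discard the transient
        x = r * x * (1 - x)
    total = 0.0
    for _ in range(n_iter):
        # floor the derivative away from zero to avoid log(0) if the
        # orbit ever lands numerically on the critical point x = 0.5
        total += math.log(max(abs(r * (1 - 2 * x)), 1e-300))
        x = r * x * (1 - x)
    return total / n_iter

print(logistic_lyapunov())  # close to ln 2, about 0.693, for r = 4
```

A positive exponent means nearby initial conditions separate exponentially, which is the dynamical-systems face of the predictability loss that the information-theoretic framework quantifies.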

Lecture 10
Application of information theoretical techniques to a variety of simple but physically relevant dynamical systems. Lecture Notes.

Lecture 11
A new information-theoretic approach to statistical systems out of equilibrium. This describes recent research by the instructor in statistical physics and predictability theory. Lecture Notes.