Information Theory, Predictability and Disequilibrium
Instructor: Prof. Richard Kleeman (Office: 929 Warren Weaver)
Location: 905 Warren Weaver
Time: Thursdays, 11:00am-12:50pm, Fall 2017.
Text: Cover and Thomas, Elements of Information Theory (Wiley), first or second edition (1990 or 2006). A supplementary review paper is also linked below.
Assessment: Attendance only. This is a seminar course.
There will be 11 lectures, whose contents are described briefly below. Relatively complete lecture notes are linked below as PDF files.
Introduction. Overview of Applications. Basic axiomatic derivation following Shannon. Introduction to the information content of codes. Lecture Notes.
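As a taste of what this first lecture covers, here is a short illustrative sketch (not part of the course materials) of the entropy functional that Shannon's axioms single out; the probability vectors are arbitrary examples.

```python
import math

def shannon_entropy(probs, base=2):
    """H(p) = -sum_i p_i log p_i, with the convention 0*log 0 = 0."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))       # fair coin: exactly 1 bit
print(shannon_entropy([0.125] * 8))      # uniform over 8 symbols: 3 bits
print(shannon_entropy([0.7, 0.2, 0.1]))  # skewed source: less than log2(3) bits
```

Entropy is maximized by the uniform distribution and vanishes for a certain outcome, which is the sense in which it measures the information content of a source.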
Entropic functionals and their properties. Lecture Notes.
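Two of the functionals treated in this lecture, relative entropy and mutual information, can be sketched as follows; the joint distributions used are illustrative examples only, and mutual information is computed via the identity I(X;Y) = D(p(x,y) || p(x)p(y)).

```python
import math

def kl(p, q):
    """Relative entropy D(p||q) in bits; requires q_i > 0 wherever p_i > 0."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def mutual_information(pxy):
    """I(X;Y) = D(p(x,y) || p(x)p(y)) for a joint probability table pxy[x][y]."""
    px = [sum(row) for row in pxy]
    py = [sum(col) for col in zip(*pxy)]
    joint = [p for row in pxy for p in row]
    prod = [a * b for a in px for b in py]   # product of the marginals
    return kl(joint, prod)

print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # independent: 0 bits
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))      # perfectly correlated: 1 bit
```

Nonnegativity of D(p||q), with equality iff p = q, underlies most of the inequalities in the course.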
Stochastic Processes. Lecture Notes.
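A central object in this lecture is the entropy rate of a stationary Markov chain, H = sum_i pi_i H(P_i.), with pi the stationary distribution. A minimal sketch (the transition matrix is an arbitrary two-state example, and power iteration stands in for an exact eigenvector solve):

```python
import math

def entropy_rate(P, iters=500):
    """Entropy rate (bits/step) of a stationary Markov chain with transition
    matrix P: H = sum_i pi_i * H(row_i), pi found by power iteration."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):                       # pi <- pi P until stationary
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    row_entropy = [-sum(p * math.log2(p) for p in row if p > 0) for row in P]
    return sum(pi_i * h for pi_i, h in zip(pi, row_entropy))

P = [[0.9, 0.1], [0.2, 0.8]]   # illustrative two-state chain
print(entropy_rate(P))
```

For this chain the stationary distribution is (2/3, 1/3), so the rate is (2/3)H(0.1) + (1/3)H(0.2), strictly less than one bit per step.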
Data Compression. Lecture Notes.
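The source-coding theme of this lecture can be illustrated with Huffman's algorithm, which achieves an expected code length within one bit of the source entropy; the sketch below (sample string and variable names are my own) builds the code with a min-heap.

```python
import heapq, math
from collections import Counter

def huffman_code(freqs):
    """Greedy bottom-up construction of an optimal binary prefix code."""
    heap = [[w, i, {sym: ""}] for i, (sym, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)   # two least probable subtrees
        w2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, [w1 + w2, count, merged])
        count += 1                        # unique tiebreaker: dicts never compared
    return heap[0][2]

text = "abracadabra"
freqs = Counter(text)
code = huffman_code(dict(freqs))
n = len(text)
avg_len = sum(freqs[s] * len(code[s]) for s in freqs) / n
entropy = -sum((w / n) * math.log2(w / n) for w in freqs.values())
# Shannon's source-coding bound: entropy <= avg_len < entropy + 1
print(code, avg_len, entropy)
```

The prefix-free property of the resulting code is what makes the compressed stream uniquely decodable without separators.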
Differential Entropy. The limiting process and coarse graining. Invariance properties. Lecture Notes.
Maximum entropy and statistical mechanics. Lecture Notes.
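Jaynes' "Brandeis dice" problem is a standard illustration of this lecture's theme: among all distributions on the faces of a die with a prescribed mean, the entropy maximizer is the exponential (Boltzmann) family p_i proportional to exp(-beta i). A sketch under that setup, solving for beta by bisection (the target mean 4.5 and the bracketing interval are illustrative choices):

```python
import math

FACES = range(1, 7)

def boltzmann(beta):
    """Max-entropy distribution on a die with a mean constraint: p_i ~ exp(-beta*i)."""
    w = [math.exp(-beta * i) for i in FACES]
    Z = sum(w)                      # partition function
    return [x / Z for x in w]

def mean(p):
    return sum(i * pi for i, pi in zip(FACES, p))

def solve_beta(target_mean, lo=-50.0, hi=50.0):
    # mean(boltzmann(beta)) is strictly decreasing in beta; bisect to hit target.
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if mean(boltzmann(mid)) > target_mean:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

beta = solve_beta(4.5)   # observed mean 4.5 > 3.5, so beta < 0
p = boltzmann(beta)
print(p)
```

A mean above the fair value 3.5 forces negative beta, so the probabilities increase monotonically with the face value, exactly as in statistical mechanics with a negative-temperature constraint.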
Gaussian special case. Lecture Notes.
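The key closed form in this lecture is the differential entropy of a Gaussian, h = (1/2) ln(2*pi*e*sigma^2). A sketch checking it against a Monte Carlo estimate of -E[log p(X)] (the variance 2.0, seed, and sample size are arbitrary choices of mine):

```python
import math, random

def gaussian_entropy(sigma2):
    """Differential entropy of N(mu, sigma2) in nats: 0.5*ln(2*pi*e*sigma2)."""
    return 0.5 * math.log(2 * math.pi * math.e * sigma2)

random.seed(0)
sigma2 = 2.0
xs = [random.gauss(0.0, math.sqrt(sigma2)) for _ in range(200_000)]

def logpdf(x):
    return -0.5 * math.log(2 * math.pi * sigma2) - x * x / (2 * sigma2)

mc = -sum(logpdf(x) for x in xs) / len(xs)   # Monte Carlo estimate of h
print(gaussian_entropy(sigma2), mc)
```

Note that h grows with log sigma, and can be negative for small variance, one of the ways differential entropy departs from its discrete counterpart.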
Dynamical system statistical prediction. Introduction and commonly used practical methodologies. Lecture Notes.
Theoretical predictability concepts. Lyapunov exponents and their relation to information theory and predictability. An information theory framework for studying predictability. Lecture Notes.
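A concrete handle on this lecture's link between chaos and information loss: the Lyapunov exponent of the logistic map x -> r x(1-x) at r = 4 is exactly ln 2 nats per step, so roughly one bit of initial-condition information is lost per iteration. A numerical sketch (orbit length, transient, and seed value are arbitrary choices):

```python
import math

def lyapunov_logistic(r, x0=0.2, n_transient=1000, n=200_000):
    """Estimate the Lyapunov exponent of x -> r*x*(1-x) as the long-run
    orbit average of log|f'(x)| = log|r*(1 - 2x)|."""
    x = x0
    for _ in range(n_transient):      # discard the transient
        x = r * x * (1 - x)
    total = 0.0
    for _ in range(n):
        total += math.log(abs(r * (1 - 2 * x)))
        x = r * x * (1 - x)
    return total / n

lam = lyapunov_logistic(4.0)
print(lam, math.log(2))
```

A positive exponent bounds the rate at which any forecast scheme loses skill, which is the bridge to the information-theoretic predictability framework.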
Application of information theoretical techniques to a variety of simple but physically relevant dynamical systems. Lecture Notes.
A new information theoretical approach to disequilibrated statistical systems. This describes recent research by the instructor in statistical physics and predictability theory.