Monte Carlo methods, Goodman, Fall 2007

Course Home Page

Register for G63.2044.001 (Mathematics) or G22.2945.001 (Computer Science)
Tuesdays, 5:10 pm to 7:00 pm
Room 1302, Warren Weaver Hall
Starting September 4, 2007

Instructor

Jonathan Goodman
goodman@cims.nyu.edu
(212)998-3326
Office hours: Wednesdays 9-11, after class, or by appointment (call, email, or see me in class to set one up).
Office: 617, Warren Weaver Hall, NYU

Description

This is an introduction to Monte Carlo methods and many of their applications. The class will discuss theory and practice and the interaction between the two.

The first half will be basics:

  • Simple Sampling
    • Random number generators
    • Simple sampling techniques: mapping, rejection
    • Sampling Gaussians: Box-Muller, Cholesky
    • Testing via histograms and other density estimators
    • Error bars
  • Dynamic sampling and Markov Chain Monte Carlo
    • Basic theory of discrete time Markov chains
    • Autocorrelation, the Kubo formula, and error bars
    • Detailed balance and the Metropolis algorithm (see the sketch after this list)
    • Langevin and Hybrid samplers
  • Variance reduction techniques
    • Control variates
    • Antithetic variates and stratified sampling
    • Sample reweighting and moment matching
    • Importance sampling
    • Sensitivity analysis via reweighting (score function) and coupled sampling
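
To give the flavor of the dynamic sampling material, here is a minimal sketch in Python of a random walk Metropolis sampler with a crude autocorrelation based error bar. The Gaussian target, step size, and run length are illustrative choices only, not part of the course materials.

    import numpy as np

    def metropolis(log_density, x0, n_steps, step_size, rng):
        # Random walk Metropolis: propose x' = x + step_size * Z with Z ~ N(0,1),
        # accept with probability min(1, pi(x')/pi(x)).  Detailed balance with
        # respect to pi = exp(log_density) holds by construction.
        x = x0
        logp = log_density(x)
        samples = np.empty(n_steps)
        n_accept = 0
        for t in range(n_steps):
            x_prop = x + step_size * rng.standard_normal()
            logp_prop = log_density(x_prop)
            if np.log(rng.random()) < logp_prop - logp:
                x, logp = x_prop, logp_prop
                n_accept += 1
            samples[t] = x
        return samples, n_accept / n_steps

    def error_bar(f_samples, max_lag=200):
        # Error bar from the integrated autocorrelation time tau:
        # var(sample mean) is approximately tau * var(f) / N.
        f = f_samples - f_samples.mean()
        n = len(f)
        var = np.mean(f * f)
        tau = 1.0
        for lag in range(1, max_lag):
            c = np.mean(f[:n - lag] * f[lag:]) / var
            if c < 0.0:    # crude cutoff; a self-consistent window is better
                break
            tau += 2.0 * c
        return np.sqrt(tau * var / n), tau

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        log_density = lambda x: -0.5 * x * x        # standard Gaussian target
        samples, acc = metropolis(log_density, 0.0, 100000, 2.0, rng)
        err, tau = error_bar(samples ** 2)          # estimate E[X^2] = 1
        print("acceptance", round(acc, 2),
              "E[X^2] =", round(np.mean(samples ** 2), 3),
              "+/-", round(err, 3), "tau =", round(tau, 1))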

The second half will cover advanced topics depending on the interests of the class. Some possibilities (we cannot cover all of these) are:

  • Stochastic differential equations
    • Path generation by Euler time stepping (a sketch appears after this list)
    • Weak, strong, and MTV error estimates
    • Improved time stepping methods, Milstein and Runge Kutta
    • Multilevel simulation strategies
    • Accuracy of steady states
  • Quasi Monte Carlo and low discrepancy sequences
    • Standard sequences of Sobol, Faure, etc.
    • Brownian bridge construction
    • Adaptive strategies, e.g., using principal component analysis
    • Reweighting and moment matching
    • Hybrid methods with high order quadrature and with true Monte Carlo
  • Rare event sampling
    • Large deviation theory -- the theory of rare events
    • Importance sampling strategies based on large deviation theory
    • Further variance reduction using detailed structure of rare event probability distributions
    • Methods that rely less on large deviation theory
  • Advanced topics in dynamic Monte Carlo
    • Analysis of convergence rates in model problems including Gaussians and one dimensional random walk
    • Collective modes and Multigrid Monte Carlo
    • Umbrella sampling and related ideas
    • Swendsen Wang algorithm
    • Cheeger's inequality and other estimates of the spectral gap
  • Monte Carlo optimization
    • Robbins Monro and descent algorithms
    • Methods for estimating sensitivities, particularly with Markov Chain Monte Carlo
    • Randomized search, including simulated annealing and genetic algorithms
    • Applications to machine learning
    • Applications to stochastic control and estimation
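
As a preview of the stochastic differential equations topic, here is a minimal sketch in Python of path generation by Euler time stepping for a scalar SDE dX = a(X) dt + b(X) dW. The geometric Brownian motion test case and its parameters are illustrative choices, not taken from the course notes.

    import numpy as np

    def euler_paths(a, b, x0, T, n_steps, n_paths, rng):
        # Euler time stepping for dX = a(X) dt + b(X) dW:
        # X_{k+1} = X_k + a(X_k) dt + b(X_k) sqrt(dt) Z_k, with independent Z_k ~ N(0,1).
        dt = T / n_steps
        X = np.full(n_paths, x0, dtype=float)
        paths = np.empty((n_steps + 1, n_paths))
        paths[0] = X
        for k in range(n_steps):
            Z = rng.standard_normal(n_paths)
            X = X + a(X) * dt + b(X) * np.sqrt(dt) * Z
            paths[k + 1] = X
        return paths

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        mu, sigma, x0, T = 0.05, 0.2, 1.0, 1.0
        # Geometric Brownian motion as a test case: E[X_T] = x0 * exp(mu * T) exactly,
        # so comparing the sample mean of X_T to this checks the weak error.
        paths = euler_paths(lambda x: mu * x, lambda x: sigma * x,
                            x0, T, n_steps=100, n_paths=100000, rng=rng)
        print("estimated E[X_T] =", paths[-1].mean(),
              "   exact =", x0 * np.exp(mu * T))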

Work load and evaluation

The grade for the class will be based on:
  • Approximately six assignments that involve paper and pencil work as well as programming/computing. These will focus more, though not exclusively, on the earlier parts of the class to allow time for:
  • A group (or individual, if desired) project exploring some aspect of the class or an application area in more depth. I will post suggested topics and help people find partners as necessary.

Communication

TBA, either a wiki on this site or a message board on the NYU Blackboard system.

Prerequisites

Students should be very familiar with basic probability at the level of Basic Probability and with linear algebra at the level of Linear Algebra. It would be helpful to have taken Scientific Computing, but students should at least be able to program for scientific computing applications.

Source material

I will mostly work from lecture notes that will be posted. I also recommend the books by Hammersley and Handscomb and by Kalos and Whitlock, which are on reserve in the Courant library.

The lecture notes will be posted here as they are revised from the last time the course was given.

Homework

Notes