Room 101 Warren Weaver Hall

Starting September 6, 2005

Office: 617, Warren Weaver Hall, NYU

Monte Carlo is a computational technique with a large and growing range of applications. What we now call Monte Carlo has two deep roots: computational physics and "simulation" in operations research. Today, Monte Carlo is having a big impact in biology and genomics, finance, statistics, atmospheric modeling, and many other areas. Surprising and effective new algorithms have appeared.

I'm planning a class for people with no previous experience with Monte Carlo who want to use it in some application. I hope it is possible to start from the beginning and still reach someplace interesting, at the edge of what people know how to do. The exact route we take may depend on the interests of people in the class.

The course will have two parts. The first is an introduction to core Monte Carlo techniques, including sampling, error bars, and basic variance reduction. Each of these will include classical material such as rejection sampling, central limit theorem error bars, and control variate variance reduction. I will inject a few modern elements:

- **Sampling**: Markov chain sampling such as Metropolis.
- **Error bars**: Autocorrelation time estimation and tricks for faster Markov chain convergence.
- **Variance reduction**: Using large deviation theory in rare event simulation, and low-discrepancy sequences such as Sobol as generalizations of stratified sampling.
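To give a feel for the Markov chain sampling mentioned above, here is a minimal random-walk Metropolis sketch in Python (not course code; the target density, step size, and chain length are illustrative choices) that samples the standard normal distribution:

```python
import math
import random

def metropolis_normal(n_steps, step=2.0, seed=0):
    """Random-walk Metropolis chain targeting the standard normal density.

    Proposals are uniform perturbations of the current point; a move is
    accepted with probability min(1, pi(proposal)/pi(x)) for pi = N(0,1).
    """
    rng = random.Random(seed)
    x = 0.0
    chain = []
    for _ in range(n_steps):
        proposal = x + rng.uniform(-step, step)
        # log of the density ratio pi(proposal)/pi(x) for the standard normal
        log_ratio = 0.5 * (x * x - proposal * proposal)
        if math.log(rng.random()) < log_ratio:
            x = proposal
        chain.append(x)
    return chain

chain = metropolis_normal(100_000)
mean = sum(chain) / len(chain)
var = sum((c - mean) ** 2 for c in chain) / len(chain)
```

Successive chain states are correlated, which is exactly why the naive central limit theorem error bar is wrong for Markov chain output and autocorrelation time estimation (covered in the error bars unit) is needed.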

I'm thinking that the second part of the course will cover Monte Carlo perturbation theory (sensitivity analysis) and stochastic optimization. This is important in finance and engineering applications where one seeks optimal policies or controls for stochastic systems. It is also important in biological and statistical applications where one seeks to maximize a likelihood function that can be evaluated only by Monte Carlo. Finally, many advanced Monte Carlo strategies have parameters that can be tuned to improve the performance of the Monte Carlo algorithm. The relation between these topics is that optimization depends on sensitivity analysis. Naive sensitivity analysis based on finite differences of Monte Carlo calculations is very noisy. Variance reduction ideas lead to sophisticated sensitivity estimates that are less noisy and more accurate, and these give better optimization methods. Dynamic Monte Carlo makes these methods possible.
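The noise problem with naive finite differences can be seen in a toy sketch. This is not course material; the payoff function and parameters are invented for illustration. It compares finite differencing two independent Monte Carlo averages against one variance reduction idea, common random numbers, where the same draws are reused at both parameter values:

```python
import random

def payoff(x, theta):
    # Toy model: for X ~ N(0,1), E[(X + theta)^2] = theta^2 + 1,
    # so the true sensitivity d/dtheta is 2*theta.
    return (x + theta) ** 2

def fd_sensitivity(theta, h, n, rng, common):
    """Finite-difference estimate of d/dtheta E[payoff(X, theta)].

    With common=True the same draws are reused at theta and theta + h
    (common random numbers); with common=False the two Monte Carlo
    averages use independent draws, which is far noisier.
    """
    if common:
        xs = [rng.gauss(0.0, 1.0) for _ in range(n)]
        up = sum(payoff(x, theta + h) for x in xs) / n
        down = sum(payoff(x, theta) for x in xs) / n
    else:
        up = sum(payoff(rng.gauss(0.0, 1.0), theta + h) for _ in range(n)) / n
        down = sum(payoff(rng.gauss(0.0, 1.0), theta) for _ in range(n)) / n
    return (up - down) / h

rng = random.Random(1)
theta, h, n, reps = 1.0, 0.01, 200, 100
indep = [fd_sensitivity(theta, h, n, rng, common=False) for _ in range(reps)]
crn = [fd_sensitivity(theta, h, n, rng, common=True) for _ in range(reps)]

def variance(vals):
    m = sum(vals) / len(vals)
    return sum((v - m) ** 2 for v in vals) / len(vals)
```

With independent draws the O(1/sqrt(n)) statistical noise is divided by the small step h, so the estimator variance blows up; with common random numbers the noise largely cancels in the difference.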

- Six assignments that involve paper and pencil work as well as programming/computing. These will focus more, though not exclusively, on the earlier parts of the class to allow time for:
- A group (or individual if desired) project exploring some aspect of the class or application area in more depth. I will post suggested topics and help people find partners as necessary.
- A final exam during finals week.

- Introduction
- Simple sampling
- Sampling Gaussians
- Variance reduction, with Matlab files CV1.m and CV3.m.
- Dynamic sampling
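The Matlab files CV1.m and CV3.m accompanying the variance reduction lecture are not reproduced here, but the control variate idea they relate to can be sketched in Python on a made-up example (estimating E[exp(U)] for U uniform on [0,1], using U itself, whose mean 1/2 is known exactly, as the control):

```python
import math
import random

def control_variate_demo(n=10_000, seed=0):
    """Estimate E[exp(U)], U ~ Uniform(0,1), with U as a control variate.

    Since E[U] = 1/2 is known exactly, the correlated fluctuation of the
    sample mean of U can be subtracted from the plain estimate.
    """
    rng = random.Random(seed)
    us = [rng.random() for _ in range(n)]
    fs = [math.exp(u) for u in us]
    f_bar = sum(fs) / n
    u_bar = sum(us) / n
    # Near-optimal coefficient beta = Cov(f, U) / Var(U), from the sample
    cov = sum((f - f_bar) * (u - u_bar) for f, u in zip(fs, us)) / n
    var_u = sum((u - u_bar) ** 2 for u in us) / n
    beta = cov / var_u
    plain = f_bar
    cv = f_bar - beta * (u_bar - 0.5)
    return plain, cv

plain, cv = control_variate_demo()
truth = math.e - 1.0  # exact value of E[exp(U)]
```

Because exp(U) and U are highly correlated, the corrected estimate has much smaller variance than the plain sample mean for the same number of draws.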