Applied Math Seminar (AMS)

Random walks and PDEs in graph-based learning

Speaker: Jeff Calder, U. Minnesota (host: Kohn)

Location: TBA

Date: Friday, April 9, 2021, 2:30 p.m.

Synopsis:

I will discuss some applications of random walks and PDEs in graph-based learning, both for theoretical analysis and for algorithm development. Graph-based learning is a field within machine learning that uses similarities between data points to create efficient representations of high-dimensional data for tasks like semi-supervised classification, clustering, and dimension reduction. There has been considerable interest recently in semi-supervised learning problems with very few labeled examples (e.g., one label per class). The widely used Laplacian regularization is ill-posed at low label rates and gives very poor classification results.

In the first part of the talk, we will use the random walk interpretation of the graph Laplacian to precisely characterize the lowest label rate at which Laplacian-regularized semi-supervised learning is well-posed. At lower label rates, we will show how our random walk analysis leads to a new algorithm, called Poisson learning, that is provably more stable and informative than Laplace learning. We will also briefly discuss some recent Lipschitz regularity results for graph Laplacians that have applications to improving spectral convergence rates.
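As a rough illustration of the contrast described above (a toy sketch, not code from the talk): Laplace learning extends the labels harmonically, solving Lu = 0 away from the labeled points, while Poisson learning places mean-zero point sources at the labeled points and solves Lu = f. On a tiny path graph with one label per class, the difference can be seen directly. The graph, weights, and label placement here are all hypothetical choices for the example.

```python
import numpy as np

# Toy example: a path graph on 5 nodes with unit edge weights,
# one labeled point per class at the two endpoints.
n = 5
W = np.zeros((n, n))
for i in range(n - 1):
    W[i, i + 1] = W[i + 1, i] = 1.0

D = np.diag(W.sum(axis=1))
L = D - W                          # unnormalized graph Laplacian

labeled = [0, 4]
y = {0: +1.0, 4: -1.0}             # binary labels, one per class

# --- Laplace learning: harmonic extension, L u = 0 off the labels ---
u_lap = np.zeros(n)
interior = [i for i in range(n) if i not in labeled]
for i in labeled:
    u_lap[i] = y[i]
A = L[np.ix_(interior, interior)]
b = -L[np.ix_(interior, labeled)] @ np.array([y[i] for i in labeled])
u_lap[interior] = np.linalg.solve(A, b)

# --- Poisson learning: L u = point sources at the labels ---
# The source term must have mean zero (here the labels already sum to 0).
f = np.zeros(n)
for i in labeled:
    f[i] = y[i]
# L is singular (constants are in its nullspace); lstsq returns the
# minimum-norm solution, which is orthogonal to constants, i.e. mean-zero.
u_poi = np.linalg.lstsq(L, f, rcond=None)[0]

print(u_lap)   # harmonic interpolation between the two labels
print(u_poi)   # Poisson solution with sources at the labels
```

Classification in both cases takes the sign (or argmax over classes) of the solution; the point of the talk's analysis is that at very low label rates the harmonic solution degenerates toward a constant, while the Poisson solution remains informative.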