CSE 515T: Bayesian Methods in Machine Learning – Spring 2017

Instructor: Professor Roman Garnett
TA: Gustavo Malkomes
Time/Location: Tuesday/Thursday 2:30–4pm, Whitaker 218
Office hours (Garnett): Tuesday 4:15–5pm, Jolley Hall 504
Office hours (Malkomes): Friday 3–4pm, Lopata Hall 202
syllabus
Piazza message board


Description

This course will cover modern machine learning techniques from a Bayesian probabilistic perspective. Bayesian probability allows us to model and reason about all types of uncertainty. The result is a powerful, consistent framework for approaching many problems that arise in machine learning, including parameter estimation, model comparison, and decision making. We will begin with a high-level introduction to Bayesian inference, then proceed to cover more-advanced topics.
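
As a small taste of what this looks like in practice, here is a minimal sketch (illustrative only, not taken from the lecture notes) of Bayes' theorem applied to a classic diagnostic-testing problem; all numbers are made up:

    # Bayes' theorem on a toy diagnostic test (all numbers hypothetical)
    prior = 0.01                    # P(disease)
    sens, spec = 0.95, 0.90         # P(+ | disease), P(- | healthy)

    evidence = sens * prior + (1 - spec) * (1 - prior)   # P(+)
    posterior = sens * prior / evidence                  # P(disease | +)
    print(posterior)                # ~0.088: the prior matters a great deal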

Midterm

Please post questions (as a private message!) to Piazza!

Assignments

Please post questions to Piazza!

Assignment 1, due 7 February 2017.
Assignment 2, due 21 March 2017.

Project

You can find more information on the project here, including some project ideas, datasets, and more.

Lectures

Lecture 1: Introduction to the Bayesian Method

Tuesday, 17 January 2017
lecture notes

Additional Resources:

Lecture 2: Bayesian Inference I (coin flipping)

Thursday, 19 January 2017
lecture notes

Additional Resources:
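
A minimal sketch of the conjugate coin-flipping analysis this lecture covers (illustrative only; the prior and data are made up):

    from scipy import stats

    a, b = 1, 1                      # Beta(1, 1) prior: uniform on [0, 1]
    heads, tails = 7, 3              # hypothetical observed flips

    # Beta prior + Bernoulli likelihood => Beta posterior (conjugacy)
    posterior = stats.beta(a + heads, b + tails)
    print(posterior.mean())          # posterior mean = 8/12 ~ 0.667
    print(posterior.interval(0.95))  # central 95% credible interval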

Lecture 3: Bayesian Inference II (decision theory)

Tuesday, 24 January 2017
lecture notes

Additional Resources:
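
A minimal sketch of the core idea (illustrative only): the Bayes-optimal action minimizes posterior expected loss, so different losses lead to different point estimates. Continuing the coin example above:

    from scipy import stats

    posterior = stats.beta(8, 4)   # coin-flip posterior from the last sketch

    # Bayes estimators under two common losses:
    print(posterior.mean())        # optimal under squared loss
    print(posterior.median())      # optimal under absolute loss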

Lecture 4: The Gaussian Distribution

Tuesday, 31 January 2017
lecture notes

Additional Resources:
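
A minimal sketch (illustrative only, toy numbers) of the key property this lecture builds on: conditioning a joint Gaussian yields another Gaussian, in closed form:

    import numpy as np

    mu = np.array([0.0, 1.0])             # joint Gaussian over (x1, x2)
    Sigma = np.array([[2.0, 0.8],
                      [0.8, 1.0]])

    # p(x1 | x2) = N(mu1 + S12 S22^-1 (x2 - mu2), S11 - S12 S22^-1 S21)
    x2 = 2.0
    cond_mean = mu[0] + Sigma[0, 1] / Sigma[1, 1] * (x2 - mu[1])
    cond_var = Sigma[0, 0] - Sigma[0, 1]**2 / Sigma[1, 1]
    print(cond_mean, cond_var)            # 0.8, 1.36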

Lecture 5: Bayesian Linear Regression

Thursday, 2 February 2017
lecture notes

Additional Resources:
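
A minimal sketch of the closed-form posterior over the weights (illustrative only; the data, noise level, and prior scale are made up):

    import numpy as np

    rng = np.random.default_rng(0)
    n, d = 50, 2
    X = np.column_stack([np.ones(n), rng.uniform(-1, 1, n)])  # bias + slope
    w_true = np.array([0.5, -1.0])
    sigma = 0.3                                  # observation noise std
    y = X @ w_true + sigma * rng.normal(size=n)

    tau = 1.0                                    # prior: w ~ N(0, tau^2 I)
    A = X.T @ X / sigma**2 + np.eye(d) / tau**2  # posterior precision
    S = np.linalg.inv(A)                         # posterior covariance
    m = S @ (X.T @ y) / sigma**2                 # posterior mean
    print(m)                                     # close to w_true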

Lecture 6: Bayesian Model Selection

Thursday, 9 February 2017
lecture notes

Additional Resources:
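
A minimal sketch of model comparison via marginal likelihood (illustrative only): does a fair coin or a coin with unknown bias better explain a particular sequence of flips?

    from scipy.special import beta as B

    heads, tails = 7, 3   # a particular hypothetical sequence of 10 flips

    # model 1 (fair): every sequence has probability (1/2)^n
    evidence_fair = 0.5 ** (heads + tails)

    # model 2 (unknown bias p, uniform Beta(1, 1) prior): integrating
    # p out gives the marginal likelihood B(1 + heads, 1 + tails) / B(1, 1)
    evidence_unknown = B(1 + heads, 1 + tails) / B(1, 1)

    # Bayes factor ~0.78: the simpler fair-coin model is mildly preferred
    print(evidence_unknown / evidence_fair)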

Lecture 7: Bayesian Logistic Regression / The Laplace Approximation

Tuesday, 14 February 2017
lecture notes

Additional Resources:
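
A minimal sketch of the Laplace approximation in one dimension (illustrative only): fit a Gaussian at the posterior mode, with variance set by the local curvature. The target below is the coin-flip posterior from earlier:

    import numpy as np
    from scipy.optimize import minimize_scalar

    def log_post(p):                 # unnormalized log Beta(8, 4) posterior
        return 7 * np.log(p) + 3 * np.log(1 - p)

    # 1. locate the posterior mode
    res = minimize_scalar(lambda p: -log_post(p),
                          bounds=(1e-6, 1 - 1e-6), method='bounded')
    mode = res.x                     # = 0.7

    # 2. variance from curvature: N(mode, -1 / log_post''(mode))
    h = 1e-5
    d2 = (log_post(mode + h) - 2 * log_post(mode) + log_post(mode - h)) / h**2
    print(mode, -1 / d2)             # ~N(0.7, 0.021)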

Lecture 8: The Kernel Trick

Thursday, 16 February 2017
lecture notes

Additional Resources:
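
A minimal sketch of the trick (illustrative only, toy data): kernel ridge regression touches the inputs only through pairwise kernel evaluations, so we can work in an implicit feature space:

    import numpy as np

    rng = np.random.default_rng(1)
    X = rng.uniform(-3, 3, (30, 1))
    y = np.sin(X).ravel() + 0.1 * rng.normal(size=30)

    def k(A, B, ell=1.0):            # squared-exponential kernel
        d2 = ((A[:, None, :] - B[None, :, :])**2).sum(-1)
        return np.exp(-0.5 * d2 / ell**2)

    lam = 0.1                        # ridge regularizer
    alpha = np.linalg.solve(k(X, X) + lam * np.eye(len(X)), y)

    X_test = np.array([[0.5]])
    print(k(X_test, X) @ alpha)      # prediction near sin(0.5) ~ 0.48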

Lecture 9: Gaussian Process Regression

Tuesday, 21 February 2017
lecture slides

Additional Resources/Notes:
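
A minimal sketch of exact GP regression (illustrative only; the kernel, noise level, and data are made up):

    import numpy as np

    rng = np.random.default_rng(2)
    X = rng.uniform(-3, 3, (20, 1))
    y = np.sin(X).ravel() + 0.1 * rng.normal(size=20)
    sn2 = 0.01                             # observation noise variance

    def k(A, B, ell=1.0):                  # squared-exponential kernel
        d2 = ((A[:, None, :] - B[None, :, :])**2).sum(-1)
        return np.exp(-0.5 * d2 / ell**2)

    Xs = np.array([[0.0]])                 # test input
    K = k(X, X) + sn2 * np.eye(len(X))
    Ks = k(Xs, X)

    mean = Ks @ np.linalg.solve(K, y)                # posterior mean at Xs
    var = k(Xs, Xs) - Ks @ np.linalg.solve(K, Ks.T)  # posterior variance
    print(mean, var)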

Lecture 10: Kernels

Thursday, 23 February 2017

Resources/Notes:

Lecture 11: Bayesian Quadrature

Tuesday, 28 February 2017
lecture notes

Additional Resources/Notes:
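
A minimal sketch of the idea (illustrative only): place a GP prior on the integrand, then integrate the posterior mean in closed form. With a squared-exponential kernel and a uniform measure on [0, 1], the kernel integrals reduce to error functions:

    import numpy as np
    from scipy.special import erf

    ell = 0.3
    X = np.linspace(0, 1, 8)               # quadrature nodes
    f = np.sin(3 * X)                      # integrand evaluations

    def k(a, b):
        return np.exp(-0.5 * (a[:, None] - b[None, :])**2 / ell**2)

    # z_i = int_0^1 k(x, x_i) dx, in closed form via the error function
    z = ell * np.sqrt(np.pi / 2) * (erf((1 - X) / (np.sqrt(2) * ell))
                                    - erf((0 - X) / (np.sqrt(2) * ell)))

    K = k(X, X) + 1e-10 * np.eye(len(X))   # jitter for numerical stability
    print(z @ np.linalg.solve(K, f))       # ~ (1 - cos 3) / 3 ~ 0.663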

Lecture 12: GP Classification / Assumed Density Filtering / Expectation Propagation

Thursday, 2 March 2017
lecture notes

Additional Resources/Notes:

Lecture 13: Bayesian Optimization

Tuesday, 7 March 2017
lecture notes

Additional Resources/Notes:
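
A minimal sketch of one ingredient (illustrative only, made-up posterior values): the expected improvement acquisition function, which is available in closed form under a Gaussian posterior:

    import numpy as np
    from scipy.stats import norm

    mu = np.array([0.30, 0.50, 0.45])   # GP posterior means at candidates
    sd = np.array([0.20, 0.05, 0.30])   # posterior standard deviations
    best = 0.4                          # best value observed so far

    # expected improvement (maximization): (mu - best) Phi(z) + sd phi(z)
    z = (mu - best) / sd
    ei = (mu - best) * norm.cdf(z) + sd * norm.pdf(z)
    print(ei.argmax(), ei)              # evaluate the maximizer next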

Lecture 14: Monte Carlo, Sampling, Rejection Sampling

Thursday, 23 March 2017
lecture slides (from Iain Murray's introduction at the 2009 Machine Learning Summer School)

Additional Resources/Notes:
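
A minimal sketch of rejection sampling (illustrative only), targeting the unnormalized coin-flip posterior from earlier with a uniform proposal:

    import numpy as np

    rng = np.random.default_rng(3)

    def p_tilde(x):                  # unnormalized Beta(8, 4) density
        return x**7 * (1 - x)**3

    M = p_tilde(0.7) * 1.01          # bound on p_tilde / q; mode is at 0.7

    x = rng.uniform(0, 1, 100000)    # proposals from q = Uniform(0, 1)
    u = rng.uniform(0, 1, 100000)
    samples = x[u * M < p_tilde(x)]  # accept w.p. p_tilde(x) / M
    print(samples.mean())            # ~8/12 ~ 0.667, the true posterior mean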

Lecture 15: Importance Sampling, MCMC

Tuesday, 28 March 2017
lecture slides (from Iain Murray's introduction at the 2009 Machine Learning Summer School)

Additional Resources/Notes:
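
A minimal sketch of random-walk Metropolis (illustrative only; the proposal width, chain length, and burn-in are arbitrary choices), again targeting the unnormalized coin-flip posterior:

    import numpy as np

    rng = np.random.default_rng(4)

    def log_p(x):                    # unnormalized log Beta(8, 4) density
        return 7 * np.log(x) + 3 * np.log(1 - x) if 0 < x < 1 else -np.inf

    x, chain = 0.5, []
    for _ in range(50000):
        prop = x + 0.1 * rng.normal()            # symmetric proposal
        if np.log(rng.uniform()) < log_p(prop) - log_p(x):
            x = prop                             # accept
        chain.append(x)

    print(np.mean(chain[1000:]))     # ~0.667 after discarding burn-in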

Lecture 16: The Kalman Filter

Thursday, 30 March 2017
lecture notes

Additional Resources/Notes:
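
A minimal sketch of a one-dimensional Kalman filter (illustrative only; the random-walk model and noise variances are made up):

    import numpy as np

    rng = np.random.default_rng(5)

    Q, R = 0.1, 1.0                  # process / observation noise variances
    x, ys = 0.0, []                  # latent random walk, noisy observations
    for _ in range(100):
        x += np.sqrt(Q) * rng.normal()
        ys.append(x + np.sqrt(R) * rng.normal())

    m, v = 0.0, 1.0                  # Gaussian belief over the latent state
    for y in ys:
        v += Q                       # predict: variance grows
        K = v / (v + R)              # Kalman gain
        m += K * (y - m)             # update toward the observation
        v *= 1 - K

    print(m, x)                      # posterior mean vs. true final state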

Lecture 17: Bayesian Deep Learning?

Tuesday, 11 April 2017

Additional Resources/Notes:

Resources

Books

There is no required book for this course. That said, a wide variety of machine-learning books are available, some of them freely available online. The following books all have a Bayesian slant to them:

For a more-frequentist perspective, check out the excellent The Elements of Statistical Learning by Trevor Hastie, Robert Tibshirani, and Jerome Friedman, which is freely available online.

Websites

Other

The Matrix Cookbook by Kaare B. Petersen and Michael S. Pedersen can be incredibly useful for tricky linear algebra problems!