CSE 515T: Bayesian Methods in Machine Learning – Fall 2019

Instructor: Professor Roman Garnett
TA: Matt Gleeson (glessonm), Adam Kern (adam.kern)
Time/Location: Monday/Wednesday 4–5:20pm, Busch 100
Office hours (Garnett): Wednesday 5:20–6:30pm, Busch 100
syllabus
Piazza message board


Description

This course will cover modern machine learning techniques from a Bayesian probabilistic perspective. Bayesian probability allows us to model and reason about all types of uncertainty. The result is a powerful, consistent framework for approaching many problems that arise in machine learning, including parameter estimation, model comparison, and decision making. We will begin with a high-level introduction to Bayesian inference, then proceed to cover more-advanced topics.
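To give a flavor of the parameter-estimation problems the course covers, here is a minimal sketch (not from the course materials; the counts are made up) of Bayesian inference for a coin's bias: with a Beta prior and binomial observations, conjugacy gives the posterior in closed form.

```python
# Hypothetical illustration: Bayesian estimation of a coin's bias theta.
# With a Beta(alpha, beta) prior and binomial data, the posterior is
# Beta(alpha + heads, beta + tails) by conjugacy.
alpha, beta = 1.0, 1.0        # uniform prior over theta
heads, tails = 7, 3           # observed flips (made up for illustration)

post_alpha = alpha + heads
post_beta = beta + tails

# posterior mean as a simple point summary
post_mean = post_alpha / (post_alpha + post_beta)
print(post_mean)  # 8/12 = 2/3
```

The same update applies sequentially: each new flip simply increments one of the two posterior parameters.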

Midterm

Please post questions (as a private message!) to Piazza!

Project

You can find detailed information about the project here.

Assignments

Please post questions to Piazza!

Assignment 1, due 23 September 2019.

Assignment 2, due 16 October 2019.

Lectures

Lecture 1: Introduction to the Bayesian Method

Monday, 26 August 2019
lecture notes

Additional Resources:

Lecture 2: Bayesian Inference I (coin flipping)

Wednesday, 28 August 2019
lecture notes

Additional Resources:

Lecture 3: Bayesian Inference II (hypothesis testing and summarizing distributions)

Wednesday, 4 September 2019
lecture notes

Additional Resources:

Lecture 4: Bayesian Inference III (decision theory)

Monday, 9 September 2019
lecture notes

Additional Resources:

Lecture 5: The Gaussian Distribution

Wednesday, 11 September 2019
lecture notes

Additional Resources:

Lecture 6: Bayesian Linear Regression

Monday, 16 September 2019
lecture notes

Additional Resources:
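As a companion to this lecture's topic, a minimal sketch (assuming NumPy; synthetic data, not the course's code) of the closed-form Gaussian posterior over the weights in Bayesian linear regression:

```python
import numpy as np

# Hypothetical sketch of Bayesian linear regression: prior w ~ N(0, s0^2 I),
# likelihood y = X w + noise with noise ~ N(0, sn^2 I). The posterior over
# w is Gaussian with closed-form precision, covariance, and mean.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 2))              # design matrix (synthetic)
w_true = np.array([1.0, -2.0])            # ground-truth weights (synthetic)
sn, s0 = 0.1, 10.0                        # noise std, prior std
y = X @ w_true + sn * rng.normal(size=20)

A = X.T @ X / sn**2 + np.eye(2) / s0**2   # posterior precision
Sigma = np.linalg.inv(A)                  # posterior covariance
mu = Sigma @ X.T @ y / sn**2              # posterior mean
print(mu)  # close to w_true
```

Note that unlike maximum likelihood, the result is a full distribution over the weights, so predictive uncertainty at a new input follows directly from Sigma.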

Lecture 7: Bayesian Model Selection

Monday, 23 September 2019
lecture notes

Additional Resources:

Lecture 8: Bayesian Logistic Regression / The Laplace Approximation

Wednesday, 25 September 2019
lecture notes

Additional Resources:

Lecture 9: The Kernel Trick

Monday, 30 September 2019
lecture notes

Additional Resources:

Lecture 10: Gaussian Process Regression

Wednesday, 2 October 2019
lecture slides

Additional Resources/Notes:
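The posterior-prediction equations from this lecture can be sketched in a few lines (assuming NumPy; the data and squared-exponential kernel here are illustrative choices, not the course's code):

```python
import numpy as np

# Hypothetical sketch of GP regression with a squared-exponential kernel.
def k(a, b, ell=1.0, sf=1.0):
    # covariance matrix between point sets a and b
    d = a[:, None] - b[None, :]
    return sf**2 * np.exp(-0.5 * (d / ell) ** 2)

# training data (made up)
X = np.array([-1.5, 0.0, 1.0])
y = np.sin(X)
sn = 0.1                                  # observation noise std

Xs = np.array([0.5])                      # test input
K = k(X, X) + sn**2 * np.eye(len(X))      # noisy training covariance
Ks = k(Xs, X)                             # test/train cross-covariance

# posterior mean and variance at Xs
mu = Ks @ np.linalg.solve(K, y)
var = k(Xs, Xs) - Ks @ np.linalg.solve(K, Ks.T)
print(mu, var)
```

Even with three observations, the posterior mean at the test point tracks the underlying sine function, and the posterior variance quantifies how much the data constrain the prediction there.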

Lecture 11: Kernels

Monday, 6 October 2019
Note: with the Monday/Wednesday schedule, this date corresponds to Monday, 7 October 2019 (6 October was a Sunday).

Resources/Notes:

Lecture 12: A Quick Interlude

Wednesday, 8 October 2019
Note: with the Monday/Wednesday schedule, this date corresponds to Wednesday, 9 October 2019 (8 October was a Tuesday).

Additional Resources/Notes:

Lecture 13: Bayesian Optimization

Wednesday, 16 October 2019
lecture notes

Additional Resources/Notes:

Lecture 14: Bayesian Quadrature

Monday, 21 October 2019
lecture notes

Additional Resources/Notes:

Lecture 15: Sequential Decision Theory / Active Search

Wednesday, 23 October 2019

Additional Resources/Notes:

Lecture 16: GP Classification / Assumed Density Filtering / Expectation Propagation

Wednesday, 30 October 2019
lecture notes

Additional Resources/Notes:

Lecture 17: Practical Issues with GPs

Monday, 4 November 2019

Lecture 18: Monte Carlo, Sampling, Rejection Sampling

Wednesday, 6 November 2019
lecture slides (from Iain Murray's introduction at the 2009 Machine Learning Summer School)
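The rejection-sampling idea from this lecture can be sketched in pure Python (an illustrative example, not the lecture's code; the target density here is an arbitrary choice):

```python
import random
import math

# Hypothetical sketch of rejection sampling: draw from an unnormalized
# target density p(x) = exp(-x^4) on [-2, 2] by sampling points uniformly
# under an envelope of height M >= max p(x) and keeping those under p.
def target(x):
    return math.exp(-x**4)   # unnormalized target density

def rejection_sample(n, lo=-2.0, hi=2.0, M=1.0):
    # max of exp(-x^4) is 1 at x = 0, so M = 1 bounds p on [lo, hi]
    samples = []
    while len(samples) < n:
        x = random.uniform(lo, hi)      # propose from the uniform q
        u = random.uniform(0.0, M)      # uniform height under the envelope
        if u <= target(x):              # accept if the point lies under p
            samples.append(x)
    return samples

random.seed(0)
xs = rejection_sample(1000)
print(sum(xs) / len(xs))  # sample mean near 0, since p is symmetric
```

The acceptance rate falls as the envelope becomes loose relative to the target, which is why rejection sampling degrades badly in high dimensions.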

Lecture 19: Bayesian additive regression trees (BART)

Wednesday, 13 November 2019

Additional Resources/Notes:

Resources

Books

There is no required book for this course. That said, there is a wide variety of machine-learning books available, some of them free online. The following books all have a Bayesian slant to them:

For a more-frequentist perspective, check out the excellent The Elements of Statistical Learning by Trevor Hastie, Robert Tibshirani, and Jerome Friedman, which is freely available online.

Websites

Other

The Matrix Cookbook by Kaare B. Petersen and Michael S. Pedersen can be incredibly useful for tricky linear algebra problems!