CSE 515T: Bayesian Methods in Machine Learning – Spring 2018

Instructor: Professor Roman Garnett
TA: Shali Jiang
Time/Location: Monday/Wednesday 4–5:30pm, Busch 100
Office hours (Garnett): Wednesday 5:30–6:30pm, Jolley Hall 504
Office hours (Jiang): TBA
syllabus
Piazza message board


Description

This course will cover modern machine learning techniques from a Bayesian probabilistic perspective. Bayesian probability allows us to model and reason about all types of uncertainty. The result is a powerful, consistent framework for approaching many problems that arise in machine learning, including parameter estimation, model comparison, and decision making. We will begin with a high-level introduction to Bayesian inference, then proceed to cover more-advanced topics.

Assignments

Please post questions to Piazza!

Assignment 1, due 7 February, 2018.
Assignment 2, due 26 February, 2018.

Project

You can find more information on the project here, including some ideas and suggested datasets.

Lectures

Lecture 1: Introduction to the Bayesian Method

Wednesday, 17 January 2018
lecture notes

Additional Resources:

Lecture 2: Bayesian Inference I (coin flipping)

Monday, 22 January 2018
lecture notes

Additional Resources:
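As a quick illustration of the coin-flipping example from this lecture: with a Beta prior on the coin's bias and Bernoulli observations, the posterior is available in closed form by conjugacy. The data below (7 heads in 10 flips) and the uniform prior are hypothetical choices for illustration.

```python
from scipy import stats

# Hypothetical data: 7 heads in 10 flips of a coin with unknown bias theta.
heads, flips = 7, 10

# Beta(a, b) prior on theta; a = b = 1 is the uniform prior.
a, b = 1, 1

# Conjugacy: the posterior is Beta(a + heads, b + tails).
posterior = stats.beta(a + heads, b + (flips - heads))

print(posterior.mean())          # posterior mean: (1 + 7) / (2 + 10) = 2/3
print(posterior.interval(0.95))  # central 95% credible interval for theta
```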

Lecture 3: Bayesian Inference II (hypothesis testing and summarizing distributions)

Wednesday, 24 January 2018
(lecture notes coming soon)

Additional Resources:

Lecture 4: Bayesian Inference III (decision theory)

Monday, 29 January 2018
lecture notes

Additional Resources:

Lecture 5: The Gaussian Distribution

Wednesday, 31 January 2018
lecture notes

Additional Resources:

Lecture 6: Bayesian Linear Regression

Monday, 5 February 2018
lecture notes

Additional Resources:

Lecture 7: Bayesian Model Selection

Wednesday, 7 February 2018
lecture notes

Additional Resources:

Lecture 8: Bayesian Logistic Regression / The Laplace Approximation

Monday, 12 February 2018
lecture notes

Additional Resources:

Lecture 9: The Kernel Trick

Wednesday, 14 February 2018
lecture notes

Additional Resources:

Lecture 10: Gaussian Process Regression

Monday, 19 February 2018
lecture slides

Additional Resources/Notes:
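A minimal sketch of the exact GP posterior covered in this lecture, using the standard Cholesky-based computation (as in GPML, chapter 2). The kernel hyperparameters, training data, and test locations below are arbitrary choices for illustration.

```python
import numpy as np

def k(A, B, ell=1.0, sf=1.0):
    # Squared-exponential covariance with length scale ell and output scale sf.
    d = A[:, None] - B[None, :]
    return sf**2 * np.exp(-0.5 * (d / ell)**2)

# Hypothetical 1d training data and test locations.
x = np.array([-2.0, 0.0, 1.0]); y = np.sin(x)
xs = np.linspace(-3, 3, 5)
sn = 0.1  # observation noise standard deviation

# Posterior mean/variance via Cholesky factorization of K + sn^2 I.
K = k(x, x) + sn**2 * np.eye(len(x))
Ks = k(x, xs)
L = np.linalg.cholesky(K)
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
mu = Ks.T @ alpha                                 # posterior mean at xs
v = np.linalg.solve(L, Ks)
var = np.diag(k(xs, xs)) - np.sum(v**2, axis=0)   # posterior variance at xs
```

Note that the predictive variance shrinks near the training inputs and reverts to the prior variance far from them.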

Lecture 11: Kernels

Wednesday, 21 February 2018

Additional Resources/Notes:

Resources

Books

There is no required book for this course. That said, there are a wide variety of machine-learning books available, some of which are available for free online. The following books all have a Bayesian slant to them:

  • Pattern Recognition and Machine Learning (PRML) by Christopher M. Bishop. Covers many machine-learning topics thoroughly. Definite Bayesian focus. Can also be very mathematical and take some effort to read.
  • Bayesian Reasoning and Machine Learning (BRML) by David Barber. Geared (as much as a machine-learning book can be!) towards computer scientists. Lots of material on graphical models. Freely available online.
  • Gaussian Processes for Machine Learning (GPML) by Carl Rasmussen and Christopher Williams. Excellent reference for Gaussian processes. Freely available online.
  • Information Theory, Inference, and Learning Algorithms by David J. C. Mackay. Very strong focus on information theory. If you have a background in physics or are interested in information theory, this is the book for you. Freely available online.
For a more-frequentist perspective, check out the excellent The Elements of Statistical Learning by Trevor Hastie, Robert Tibshirani, and Jerome Friedman. Freely available online.

Websites

Other

The Matrix Cookbook by Kaare B. Petersen and Michael S. Pedersen can be incredibly useful for helping with tricky linear algebra problems!