CSE 515T: Bayesian Methods in Machine Learning – Spring 2019

Instructor: Professor Roman Garnett
TA: TBD
Time/Location: Monday/Wednesday 4–5:30pm, Duncker 101
Office hours (Garnett): Wednesday 5:30–6:30pm, Duncker 101
syllabus
Piazza message board


Description

This course will cover modern machine learning techniques from a Bayesian probabilistic perspective. Bayesian probability allows us to model and reason about all types of uncertainty. The result is a powerful, consistent framework for approaching many problems that arise in machine learning, including parameter estimation, model comparison, and decision making. We will begin with a high-level introduction to Bayesian inference, then proceed to cover more-advanced topics.
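As a taste of the parameter-estimation problems mentioned above (and of the coin-flipping example covered in Lecture 2), here is a minimal sketch of conjugate Bayesian inference for a coin's heads probability. The choice of a Beta prior and the specific hyperparameters and counts below are illustrative, not taken from the lecture notes:

```python
# Sketch of Bayesian parameter estimation for a coin flip.
# With a Beta(alpha, beta) prior on the heads probability theta and
# i.i.d. coin-flip observations, conjugacy gives the posterior
# Beta(alpha + heads, beta + tails) in closed form.

def posterior(alpha, beta, heads, tails):
    """Posterior Beta parameters after observing the given flip counts."""
    return alpha + heads, beta + tails

def beta_mean(a, b):
    """Mean of a Beta(a, b) distribution: a / (a + b)."""
    return a / (a + b)

# Uniform prior Beta(1, 1); suppose we observe 7 heads and 3 tails.
a, b = posterior(1, 1, heads=7, tails=3)
print(a, b)             # 8 4
print(beta_mean(a, b))  # posterior mean of theta = 8/12
```

Note how the prior hyperparameters act as "pseudo-counts": the posterior mean 8/12 sits between the prior mean 1/2 and the maximum-likelihood estimate 7/10, shrinking toward the prior when data are scarce.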

Lectures

Lecture 1: Introduction to the Bayesian Method

Monday, 14 January 2019
lecture notes

Additional Resources:

Lecture 2: Bayesian Inference I (coin flipping)

Wednesday, 16 January 2019
lecture notes

Additional Resources:

Lecture 3: Bayesian Inference II (hypothesis testing and summarizing distributions)

Tuesday, 22 January 2019
lecture notes

Additional Resources:

Resources

Books

There is no required book for this course. That said, a wide variety of machine-learning books is available, many of them free online. The following books all have a Bayesian slant to them:

  • Pattern Recognition and Machine Learning (PRML) by Christopher M. Bishop. Covers many machine-learning topics thoroughly. Definite Bayesian focus. Can also be very mathematical and take some effort to read.
  • Bayesian Reasoning and Machine Learning (BRML) by David Barber. Geared (as much as a machine-learning book can be!) towards computer scientists. Lots of material on graphical models. Freely available online.
  • Gaussian Processes for Machine Learning (GPML) by Carl Rasmussen and Christopher Williams. Excellent reference for Gaussian processes. Freely available online.
  • Information Theory, Inference, and Learning Algorithms by David J. C. MacKay. Very strong focus on information theory. If you have a background in physics or are interested in information theory, this is the book for you. Freely available online.
For a more-frequentist perspective, check out the excellent Elements of Statistical Learning by Trevor Hastie, Robert Tibshirani, and Jerome Friedman. Freely available online.

Websites

Other

The Matrix Cookbook by Kaare B. Petersen and Michael S. Pedersen can be incredibly useful for tricky linear algebra problems!