CSE 515T: Bayesian Methods in Machine Learning – Spring 2018
Instructor: Professor Roman Garnett
TA: Shali Jiang
Time/Location: Monday/Wednesday 4–5:30pm, Busch 100
Office hours (Garnett): Wednesday 5:30–6:30pm, Jolley Hall 504
Office hours (Jiang): TBA
Piazza message board
This course will cover modern machine learning techniques from a Bayesian probabilistic perspective. Bayesian probability allows us to model and reason about all types of uncertainty. The result is a powerful, consistent framework for approaching many problems that arise in machine learning, including parameter estimation, model comparison, and decision making. We will begin with a high-level introduction to Bayesian inference, then proceed to cover more-advanced topics.
Lecture 1: Introduction to the Bayesian Method
Wednesday, 17 January 2018
Lecture 2: Bayesian Inference I (coin flipping)
Monday, 22 January 2018
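As a quick taste of the conjugate coin-flipping analysis this lecture covers, here is a minimal beta-binomial sketch (the prior parameters and data below are made up for illustration, not taken from the course notes):

```python
# Beta-binomial conjugate update for coin flipping.
from scipy import stats

a0, b0 = 1.0, 1.0            # Beta(1, 1), i.e., uniform prior on theta = Pr(heads)
heads, tails = 7, 3          # hypothetical observed flips

# Conjugacy: posterior is Beta(a0 + heads, b0 + tails); no integration needed.
posterior = stats.beta(a0 + heads, b0 + tails)

post_mean = posterior.mean()            # (a0 + heads) / (a0 + b0 + n) = 8/12
lo, hi = posterior.interval(0.95)       # central 95% credible interval
```

The whole "computation" is just adding the counts to the prior parameters, which is why the beta prior is such a convenient starting example.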
Lecture 3: Bayesian Inference II (hypothesis testing and summarizing distributions)
Wednesday, 24 January 2018
(lecture notes coming soon)
- Article: "The Fallacy of Placing Confidence in Confidence Intervals" available here or here
Lecture 4: Bayesian Inference III (decision theory)
Monday, 29 January 2018
- Book: Bishop PRML: Section 1.5 (Decision theory)
- Book: Berger Chapter 1 (Basic concepts), Section 4.4 (Bayesian decision theory)
- Book: Robert Section 4.2 (Bayesian decision theory)
- Videos: YouTube user mathematicalmonk has a great series of machine-learning lectures available. Chapter 11 concerns decision theory.
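The core idea of Bayesian decision theory, choosing the action that minimizes posterior expected loss, fits in a few lines. A toy sketch (the spam example, posterior values, and loss matrix are all invented for illustration):

```python
# Bayes-optimal action: minimize expected loss under the posterior.
posterior = {"spam": 0.8, "ham": 0.2}      # hypothetical posterior over states

# loss[action][state]: deleting a legitimate message is very costly.
loss = {"delete": {"spam": 0.0, "ham": 10.0},
        "keep":   {"spam": 1.0, "ham": 0.0}}

def expected_loss(action):
    return sum(posterior[s] * loss[action][s] for s in posterior)

best = min(loss, key=expected_loss)
# Even though "spam" is more probable, the asymmetric loss makes "keep" optimal.
```

Note that the optimal action depends on the loss function as well as the posterior; changing either can flip the decision.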
Lecture 5: The Gaussian Distribution
Wednesday, 31 January 2018
- Book: Bishop PRML: Section 2.3 (The Gaussian Distribution). This is a truly excellent and in-depth discussion!
- Book: Barber BRML: Section 8.4 (Multivariate Gaussian).
- Book/reference: Rasmussen and Williams GPML: Section A.2 (Gaussian Identities), available here. This is a good cheat sheet!
- Notes: Chuong B. Do put together some notes on the multivariate Gaussian for the Stanford machine learning class here. These go a bit more in depth than my notes, if you want to see more details.
- Website: The Wikipedia articles on the normal distribution and the multivariate normal distribution are quite complete.
- Video: YouTube user mathematicalmonk has a lecture on the multivariate normal available as well.
- Video: Alexander Ihler also has a lecture on the multivariate normal, including information on how to sample from the distribution.
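Since sampling from the multivariate Gaussian comes up in the references above, here is the standard Cholesky-based recipe as a short sketch (the mean and covariance are arbitrary example values):

```python
import numpy as np

rng = np.random.default_rng(0)
mu = np.array([1.0, -1.0])
Sigma = np.array([[2.0, 0.6],
                  [0.6, 1.0]])

# If L L^T = Sigma and z ~ N(0, I), then x = mu + L z ~ N(mu, Sigma).
L = np.linalg.cholesky(Sigma)
z = rng.standard_normal((2, 100_000))
samples = mu[:, None] + L @ z

sample_mean = samples.mean(axis=1)   # should be close to mu
sample_cov = np.cov(samples)         # should be close to Sigma
```

The same affine-transformation property that justifies this sampler also underlies many of the Gaussian identities in the GPML cheat sheet.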
Lecture 6: Bayesian Linear Regression
Monday, 5 February 2018
- Book: Bishop PRML: Section 3.3 (Bayesian Linear Regression).
- Book: Barber BRML: Section 18.1 (Regression with Additive Gaussian Noise).
- Book: Rasmussen and Williams GPML: Section 2.1 (Weight-space View), available here.
- Video: YouTube user mathematicalmonk has an entire section devoted to Bayesian linear regression. See ML 10.1–7 here.
- Videos: Nando de Freitas has a series of lectures on Bayesian linear regression. Part one is here, and part two is here.
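The weight posterior in Bayesian linear regression is available in closed form (Bishop PRML, Section 3.3). A small sketch on synthetic data (the precisions alpha and beta, the basis, and the true weights below are assumed for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
alpha, beta = 2.0, 25.0              # prior precision and noise precision (assumed)
w_true = np.array([0.5, -0.3])       # hypothetical true weights

x = rng.uniform(-1, 1, size=50)
Phi = np.stack([np.ones_like(x), x], axis=1)   # design matrix for basis [1, x]
y = Phi @ w_true + rng.normal(scale=beta ** -0.5, size=50)

# Posterior N(m_N, S_N) with S_N^{-1} = alpha I + beta Phi^T Phi
# and m_N = beta S_N Phi^T y.
S_N_inv = alpha * np.eye(2) + beta * Phi.T @ Phi
m_N = beta * np.linalg.solve(S_N_inv, Phi.T @ y)   # posterior mean, near w_true
```

With enough data the posterior mean approaches the least-squares solution; the prior precision alpha acts as a regularizer.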
Lecture 7: Bayesian Model Selection
Wednesday, 7 February 2018
- Book: Bishop PRML: Section 3.4 (Bayesian Model Comparison).
- Book: Barber BRML: Chapter 12 (Bayesian Model Selection).
- Book: MacKay ITILA: Chapter 28 (Occam's Razor and Model Comparison).
- Video: YouTube user mathematicalmonk has a lecture about Bayesian model selection (some nearby videos are related as well).
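The marginal likelihood comparisons discussed in these readings can be computed exactly in the coin-flipping setting. A sketch comparing a fair coin against a coin with unknown bias (the data are invented for illustration):

```python
from math import comb

heads, tails = 9, 1
n = heads + tails

# Model 0: fair coin, theta fixed at 1/2.
evidence_m0 = 0.5 ** n

# Model 1: theta ~ Uniform(0, 1). Integrating theta out of the likelihood of
# this sequence gives the Beta function:
#   ∫ theta^h (1 - theta)^t dtheta = B(h+1, t+1) = 1 / ((n+1) * C(n, h))
evidence_m1 = 1.0 / ((n + 1) * comb(n, heads))

bayes_factor = evidence_m1 / evidence_m0   # > 1 favors the unknown-bias model
```

Note that model 1 spreads its probability mass over all biases, so it is automatically penalized when the data look fair; this is the Occam's razor effect MacKay discusses in Chapter 28.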
Lecture 8: Bayesian Logistic Regression / The Laplace Approximation
Monday, 12 February 2018
- Book: Bishop PRML: Chapter 4 (Linear Models for Classification).
- Book: Barber BRML: Section 18.2 (Classification).
- Book: Rasmussen and Williams GPML: Sections 3.1 and 3.2 (Classification Problems and Linear Models for Classification), available here.
- Video: YouTube user mathematicalmonk has a lecture about Bayesian logistic regression.
Lecture 9: The Kernel Trick
Wednesday, 14 February 2018
- Book: Rasmussen and Williams GPML: Chapter 2 through 2.1 (Weight-space View), available here.
Lecture 10: Gaussian Process Regression
Monday, 19 February 2018
- Book: Rasmussen and Williams GPML: Sections 2.2 – 2.5, available here.
- Book: Barber BRML: Chapter 19 (Gaussian processes).
- Video: Nando de Freitas has a lecture here.
- Video: Philipp Hennig has a series of lectures from the 2013 Machine Learning Summer School; part one is here. The slides, which have some cool animations, are available here.
- Video: Carl Rasmussen has a two-part introduction to Gaussian processes here.
- Video: David MacKay gave an introduction to Gaussian processes here.
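The Cholesky-based GP regression predictions from GPML (Algorithm 2.1) can be sketched in a few lines. The kernel hyperparameters, training data, and noise level below are assumed values for illustration only:

```python
import numpy as np

def k(a, b, ell=0.5, sf=1.0):
    """Squared-exponential covariance between input vectors a and b."""
    return sf**2 * np.exp(-0.5 * (a[:, None] - b[None, :])**2 / ell**2)

X = np.array([-1.0, 0.0, 1.0])   # training inputs
y = np.sin(3 * X)                # training targets (modeled as noisy)
Xs = np.linspace(-2, 2, 5)       # test inputs
sn = 0.1                         # assumed noise standard deviation

# Posterior predictive mean and covariance via the Cholesky factorization,
# following GPML Algorithm 2.1.
K = k(X, X) + sn**2 * np.eye(len(X))
L = np.linalg.cholesky(K)
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
mu_s = k(Xs, X) @ alpha                      # predictive mean
v = np.linalg.solve(L, k(X, Xs))
cov_s = k(Xs, Xs) - v.T @ v                  # predictive covariance
```

Using the Cholesky factor rather than inverting K directly is both cheaper and numerically more stable, which is why GPML recommends it.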
Lecture 11: Kernels
Wednesday, 21 February 2018
There is no required book for this course. That said, a wide variety of machine-learning books are available, some of them free online. The following books all have a Bayesian slant to them:
- Pattern Recognition and Machine Learning (PRML) by Christopher M. Bishop. Covers many machine-learning topics thoroughly. Definite Bayesian focus. Can also be very mathematical and take some effort to read.
- Bayesian Reasoning and Machine Learning (BRML) by David Barber. Geared (as much as a machine-learning book can be!) towards computer scientists. Lots of material on graphical models. Freely available online.
- Gaussian Processes for Machine Learning (GPML) by Carl Edward Rasmussen and Christopher K. I. Williams. Excellent reference for Gaussian processes. Freely available online.
- Information Theory, Inference, and Learning Algorithms (ITILA) by David J. C. MacKay. Very strong focus on information theory. If you have a background in physics or are interested in information theory, this is the book for you. Freely available online.
For a more-frequentist perspective, check out the excellent The Elements of Statistical Learning by Trevor Hastie, Robert Tibshirani, and Jerome Friedman. Freely available online.
- I will post the source for lecture notes, demo code, etc. on this GitHub page. Even the source for the syllabus and this website are there.
- I have created a Piazza message board for this class. Please post any questions about the homework, etc. to the message board! Chances are that someone else has the same question and we can all benefit from a public discussion. If you have a question just for me and/or me and the TA, please also post this to Piazza rather than emailing us directly; you should be able to mark your message appropriately to keep it private.
- Metacademy's roadmap to Bayesian machine learning. This is a great resource for finding additional materials related to essentially every subject we will cover in this course.
- There are several relevant courses available on Coursera. Coursera gives you access to video lecture series, often from world experts, all available for free! In particular, the following three courses are all presented by leaders in the field:
The Matrix Cookbook by Kaare B. Petersen and Michael S. Pedersen can be incredibly useful for helping with tricky linear algebra problems!