CSE 515T: Bayesian Methods in Machine Learning – Spring 2019
Instructor: Professor Roman Garnett
Time/Location: Monday/Wednesday 4–5:30pm, Duncker 101
Office hours (Garnett): Wednesday 5:30–6:30pm, Duncker 101
Piazza message board
This course will cover modern machine learning techniques from a Bayesian probabilistic perspective. Bayesian probability allows us to model and reason about all types of uncertainty. The result is a powerful, consistent framework for approaching many problems that arise in machine learning, including parameter estimation, model comparison, and decision making. We will begin with a high-level introduction to Bayesian inference, then proceed to cover more-advanced topics.
Lecture 1: Introduction to the Bayesian Method
Monday, 14 January 2019
Lecture 2: Bayesian Inference I (coin flipping)
Wednesday, 16 January 2019
Lecture 3: Bayesian Inference II (hypothesis testing and summarizing distributions)
Tuesday, 22 January 2019
- Article: "The Fallacy of Placing Confidence in Confidence Intervals"
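As a preview of the coin-flipping example from Lecture 2, here is a minimal sketch (not course material) of the classic beta–binomial conjugate update, assuming a Beta prior on the heads probability and made-up example counts:

```python
# Hypothetical coin-flipping example: place a Beta(alpha, beta) prior on
# the probability of heads, theta, then observe some flips.
alpha, beta = 1.0, 1.0      # Beta(1, 1), i.e., a uniform prior on theta
heads, tails = 7, 3         # example data: 7 heads, 3 tails (assumed counts)

# Conjugacy: with binomial data, the posterior is again a Beta distribution,
# Beta(alpha + heads, beta + tails) -- we simply add the observed counts.
post_alpha = alpha + heads
post_beta = beta + tails

# The posterior mean of theta is post_alpha / (post_alpha + post_beta).
posterior_mean = post_alpha / (post_alpha + post_beta)
print(post_alpha, post_beta, posterior_mean)
```

With these numbers the posterior is Beta(8, 4), giving a posterior mean of 8/12 ≈ 0.667, which is pulled slightly from the empirical frequency 0.7 toward the prior mean 0.5.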
There is no required book for this course. That said, there are a wide variety of machine-learning books available, some of which are available for free online. The following books all have a Bayesian slant to them:
For a more-frequentist perspective, check out the excellent The Elements of Statistical Learning by Hastie, Tibshirani, and Friedman. Freely available online.
- Pattern Recognition and Machine Learning (PRML) by Christopher Bishop. Covers many machine-learning topics thoroughly. Definite Bayesian focus. Can also be very mathematical and take some effort to read.
- Bayesian Reasoning and Machine Learning (BRML) by David Barber. Geared (as much as a machine-learning book can be!) towards computer scientists. Lots of material on graphical models. Freely available online.
- Gaussian Processes for Machine Learning (GPML) by Carl Edward Rasmussen and Christopher K. I. Williams. An excellent reference for Gaussian processes. Freely available online.
- Information Theory, Inference, and Learning Algorithms by David MacKay. Very strong focus on information theory. If you have a background in physics or are interested in information theory, this is the book for you. Freely available online.
- I will post the source for lecture notes, demo code, etc. on this GitHub page. Even the source for the syllabus and this website is there.
- I have created a Piazza message board for this class. Please post any questions about the homework, etc. to the message board! Chances are that someone else has the same question, and we can all benefit from a public discussion. If you have a question intended only for me and/or the TA, please also post it to Piazza rather than emailing us directly; you can mark your message as private.
- Metacademy's roadmap to Bayesian machine learning. This is a great resource for finding additional materials related to essentially every subject we will cover in this course.
- There are several relevant courses available on Coursera. Coursera gives you access to video lecture series, often from world experts, all available for free! In particular, the following three courses are all presented by leaders in the field:
The Matrix Cookbook by Kaare B. Petersen and Michael S. Pedersen can be incredibly useful for tricky linear algebra problems!