CSE 544T Special Topics in Computer Science Theory
Fall 2019, Section 1: Convex optimization hierarchies for probabilistic inference and machine learning
Instructor: Brendan Juba
Tuesday/Thursday 11:30am-12:50pm, Cupples I 115
Course Piazza board: please post all questions that are not of a personal nature publicly to the Piazza board.
An introduction to the "sum-of-squares" hierarchy of semidefinite program relaxations for probabilistic inference problems. We will see how these methods can be used to obtain state-of-the-art guarantees for some fundamental machine learning tasks.
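To give a flavor of the semidefinite-programming view underlying the sum-of-squares method: a polynomial is a sum of squares iff it can be written as v(x)ᵀQv(x) for a positive semidefinite "Gram" matrix Q over a monomial basis v(x), a condition checkable by an SDP. The sketch below is only an illustrative numerical check (the polynomial p(x) = x⁴ + 2x² + 1 and the basis are made up for this example, not taken from the course materials), not a solver for the hierarchy itself.

```python
import numpy as np

# Monomial basis v(x) = [1, x, x^2] and the example polynomial
# p(x) = x^4 + 2x^2 + 1. p is a sum of squares iff p(x) = v(x)^T Q v(x)
# for some positive semidefinite Gram matrix Q.
Q = np.array([[1.0, 0.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 1.0]])

# PSD check: all eigenvalues nonnegative, so Q certifies p is SOS.
eigvals = np.linalg.eigvalsh(Q)
assert np.all(eigvals >= -1e-9)

# A Cholesky factorization Q = L L^T yields an explicit decomposition:
# p(x) = sum_i (column_i(L) . v(x))^2.
L = np.linalg.cholesky(Q)

# Sanity check at sample points: v(x)^T Q v(x) agrees with p(x).
for x in [-2.0, 0.5, 3.0]:
    v = np.array([1.0, x, x ** 2])
    assert abs(v @ Q @ v - (x ** 4 + 2 * x ** 2 + 1)) < 1e-9
```

In the actual relaxations studied in the course, Q is not given in advance; the SDP searches over all PSD matrices consistent with the polynomial's coefficients, and higher levels of the hierarchy enlarge the monomial basis.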
Prerequisites: CSE 347 or CSE 541T; Math 309 or equivalent. CSE 417T and/or ESE 326/Math 3200 recommended.
Grades and assignments:
Each week, a student will select one of the papers from the list below and present it over the course of two meetings. Grades are based on these presentations; each student is expected to present one paper. Students are also expected to attend one another's presentations and participate in the discussion and Q&A.
Additional background reading:
The first two lectures will give an overview of the sum-of-squares framework for algorithms. Additional details can be found in the following books and survey articles.
To be filled in as the semester progresses:
List of Papers
dictionary learning & tensor decomposition
weaknesses via pseudocalibration, conversion to spectral algorithms