EE512A - Advanced Inference in Graphical Models, Fall Quarter, 2014

Last updated: February 10, 2015

This page is located at http://j.ee.washington.edu/~bilmes/classes/ee512a_fall_2014/.

A version of this class was taught previously, and its contents and slides can be found at http://j.ee.washington.edu/~bilmes/classes/ee512a_fall_2011/

Instructor:

Prof. Jeff A. Bilmes --- Email me
Office: 418 EE/CS Bldg., +1 206 221 5236
Office hours: TBD, EEB-418 (+ online)

TA

Jounsup Park (jsup517@uw.edu)
Office: EEB-333
Office hours: Tuesdays, 12:00-2:00pm

Time/Location

Class is held: Mon/Wed 11:30am-1:30pm, Paccar Hall, Room 297.

Announcements


Information

Description: This course will cover certain aspects of advanced inference techniques in graphical models.

We will briefly review exact probabilistic inference (which essentially boils down to inference on trees, and includes methods such as junction trees, exact belief propagation and message passing in its various forms, optimal graph triangulations, the elimination family of algorithms, the generalized distributive law (GDL), and its relation to dynamic programming). We'll also cover NP-hardness results for inference, NP-hardness results for finding an optimal triangulation, and the inapproximability of inference in graphical models (which seems bad).
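To give a concrete flavor of the elimination/message-passing family above, here is a minimal sketch (my own illustration, not course material) of sum-product belief propagation on a tiny three-node chain MRF; the binary variables and random potentials are made up, and NumPy is assumed to be available.

    import numpy as np

    # Minimal sketch: sum-product (belief propagation) on a 3-node chain MRF
    # x1 -- x2 -- x3, each variable binary. Potentials are arbitrary positive tables.
    rng = np.random.default_rng(0)
    psi12 = rng.random((2, 2)) + 0.1   # pairwise potential on (x1, x2)
    psi23 = rng.random((2, 2)) + 0.1   # pairwise potential on (x2, x3)

    # Leaf-to-root messages (root = x2):
    m1_to_2 = psi12.sum(axis=0)        # m_{1->2}(x2) = sum_{x1} psi12(x1, x2)
    m3_to_2 = psi23.sum(axis=1)        # m_{3->2}(x2) = sum_{x3} psi23(x2, x3)

    # Belief at the root is the product of incoming messages.
    b2 = m1_to_2 * m3_to_2
    p2 = b2 / b2.sum()                 # exact marginal p(x2)

    # Sanity check against brute-force enumeration of all 2^3 configurations.
    joint = np.einsum('ab,bc->abc', psi12, psi23)
    p2_brute = joint.sum(axis=(0, 2)) / joint.sum()
    assert np.allclose(p2, p2_brute)
    print("p(x2) =", p2)

On a tree, two such passes of messages (leaves to root, then root to leaves) yield every marginal exactly; this is the special case that junction trees and the GDL generalize.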

Next we'll move to the optimistic aspect of the course, where we will cover two general classes of approximate inference methods.

The first general class of techniques is that of exponential models and variational methods. We will follow the 2008 book by Wainwright and Jordan fairly closely; you can get it from the link above if you are at an educational institution (although I urge you to purchase a copy, as it is not too expensive). Of particular interest will be the polyhedral approaches (e.g., the marginal polytope) and the linear programming relaxation methods discussed in that book.
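As a small, self-contained illustration of that exponential-family view (the two-node Ising model and parameter values below are made up for this sketch, and NumPy is assumed), the following computes the log-partition function A(theta) by brute force and checks the standard identity that its gradient equals the mean parameter vector mu, which is a point in the marginal polytope:

    import numpy as np

    # Illustrative sketch: a 2-node Ising model in exponential-family form
    # p(x) ∝ exp(theta1*x1 + theta2*x2 + theta12*x1*x2), with x_i in {0, 1}.
    theta = np.array([0.5, -0.3, 1.2])   # (theta1, theta2, theta12), made-up values

    def suff_stats(x1, x2):
        return np.array([x1, x2, x1 * x2])

    configs = [(x1, x2) for x1 in (0, 1) for x2 in (0, 1)]
    weights = np.array([np.exp(theta @ suff_stats(*x)) for x in configs])
    A = np.log(weights.sum())            # log-partition function A(theta)
    probs = weights / weights.sum()

    # Mean parameters: mu = sum_x p(x) phi(x) = grad A(theta).
    mu = sum(p * suff_stats(*x) for p, x in zip(probs, configs))
    print("A(theta) =", A, " mu =", mu)

    # Finite-difference check that mu is indeed the gradient of A.
    eps = 1e-6
    for i in range(3):
        t = theta.copy(); t[i] += eps
        w = np.array([np.exp(t @ suff_stats(*x)) for x in configs])
        assert abs((np.log(w.sum()) - A) / eps - mu[i]) < 1e-4

The variational methods in the book replace this brute-force enumeration (which is exponential in the number of variables) with optimizations over the marginal polytope or tractable approximations to it.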

The second is a class of inference algorithms (more precisely, algorithms for finding the most probable configuration of a set of random variables, also called MPE (most probable explanation) or Viterbi inference) that have become popular in the computer vision community. While we will not dwell on the computer vision applications, we will mostly abstract the algorithms for use in general graphical models, though we might draw on vision examples to show results. These include algorithms that are used when the tree-width of the model is very high and that, even in the high tree-width case, can sometimes provide exact MPE solutions in low-order polynomial time. This includes many of the graph cut methods, including higher-order approximations, and methods for when the variables are not binary (including alpha-beta swaps, alpha expansions, fusion moves, and other recent, more sophisticated and energy-aware "move making" algorithms that can improve efficiency, such as various forms of "expansion" moves). We will also discuss what to do with global potential functions (i.e., ones where a factor might involve many variables).
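As a taste of the graph cut approach, here is a minimal sketch (not taken from any particular paper; it assumes the networkx package is installed) of the classic s-t min-cut construction that recovers an exact MPE labeling for a binary pairwise MRF with nonnegative, attractive (submodular) interactions, verified against brute force on a tiny instance:

    import itertools
    import numpy as np
    import networkx as nx   # assumed to be available

    # Energy to minimize: E(x) = sum_i u[i][x_i] + sum_{(i,j)} w[i,j] * [x_i != x_j],
    # with all w >= 0 (submodular), so the min cut gives an exact MPE solution.
    rng = np.random.default_rng(1)
    n = 5
    u = rng.random((n, 2))                          # unary costs u[i][label]
    edges = [(0, 1), (1, 2), (2, 3), (3, 4), (0, 4)]
    w = {e: rng.random() for e in edges}            # nonnegative pairwise weights

    G = nx.DiGraph()
    for i in range(n):
        G.add_edge('s', i, capacity=u[i][1])        # this edge is cut iff x_i = 1
        G.add_edge(i, 't', capacity=u[i][0])        # this edge is cut iff x_i = 0
    for (i, j), wij in w.items():
        G.add_edge(i, j, capacity=wij)              # cut iff x_i, x_j disagree
        G.add_edge(j, i, capacity=wij)

    cut_value, (source_side, sink_side) = nx.minimum_cut(G, 's', 't')
    x_cut = [0 if i in source_side else 1 for i in range(n)]

    # Brute-force check on this tiny instance.
    def energy(x):
        return sum(u[i][x[i]] for i in range(n)) + \
               sum(wij for (i, j), wij in w.items() if x[i] != x[j])

    x_brute = min(itertools.product((0, 1), repeat=n), key=energy)
    assert np.isclose(energy(x_cut), energy(x_brute))
    print("MPE labeling:", x_cut, " energy:", energy(x_cut))

The key point is that the capacity of every s-t cut equals the energy of the corresponding labeling, so a polynomial-time min-cut computation yields an exact MPE solution even when the underlying graph has high tree-width.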

We will be covering papers of the form that you can find in a new text on inference in Markov random fields for computer vision, although, again, we will talk about the methods more generically (the methods are indeed much more widely applicable than just to problems in computer vision, yet there has been much recent innovation in graphical model inference in the computer vision community). We will also draw from papers that have recently appeared (links will be posted here on this web page as the class proceeds).

Course Format: Two two-hour lectures per week (Mon/Wed 11:30am-1:30pm, Paccar Hall (PCAR), Room 297).

Prerequisites: ideally, knowledge of probability, statistics, convex optimization, and some combinatorial optimization, although these will be reviewed as necessary. It would be useful to have some basic knowledge of graphical models, as we'll review the basics fairly rapidly. The course is open to students in all UW departments. If you are in doubt about taking this course, please talk to me in class or during office hours.

Texts: see above.

Grades and Assignments: Grades will be based on a combination of a final project (35%), homeworks (35%), and the take home midterm exam (30%).

There will be between 3 and 5 homeworks during the quarter. Some of the homeworks will include project proposals, project proposal revisions, and project progress reports.

Final project: The final project will consist of a 4-page paper (conference style) and a final project presentation. The project must involve using or dealing mathematically with graphical models in some way or another. Please contact me and/or stop by office hours early in the quarter to discuss project ideas. Note that there is a chance that, for the final project, we will instead have a programming contest for probabilistic inference; more on this soon.


Homework

Homework must be completed and submitted electronically as a PDF file via the following link: https://canvas.uw.edu/courses/914697/assignments. The PDF file, however, may be a scanned copy of hand-written solutions.

Lecture Slides

Lecture slides will be made available as they are prepared --- they will probably appear shortly before a given lecture, and they will be in PDF format (original source is LaTeX). Note that these slides are corrected after the lecture (and might also include some additional discussion we had during lecture). If you find bugs/typos in these slides, please email me. The slides are available as "presentation slides" and also (with mostly the same content) in 2-up form for printing. After lecture, the marked-up slides will appear under the "Presented Slides" column (and might include typo fixes, ink corrections, and other little notes/discussions/drawings that occurred during class). Also, the lectures will be posted to YouTube (see below, or see my UW YouTube channel).
Lec. # | Slides | 2-Up Slides | Lecture Date | Last Updated | Contents | Presented Slides | Video
1  | pdf | pdf | 9/29/2014  | 9/29/2014  | Introduction, Families, Semantics | pdf |
2  | pdf | pdf | 10/1/2014  | 10/1/2014  | MRFs, Inference on Trees | pdf |
3  | pdf | pdf | 10/6/2014  | 10/6/2014  | Tree inference, more general queries, non-trees | pdf |
4  | pdf | pdf | 10/8/2014  | 10/8/2014  | Non-trees, perfect elimination, triangulated graphs | pdf |
5  | pdf | pdf | 10/13/2014 | 10/13/2014 | Triangulated graphs, k-trees, the triangulation process/heuristics | pdf |
6  | pdf | pdf | 10/15/2014 | 10/15/2014 | Multiple queries, decomposable models, junction trees | pdf |
7  | pdf | pdf | 10/20/2014 | 10/22/2014 | Junction trees, begin intersection graphs | pdf |
8  | pdf | pdf | 10/22/2014 | 10/22/2014 | Intersection graphs, inference on junction trees | pdf |
9  | pdf | pdf | 10/27/2014 | 11/2/2014  | Inference on junction trees, semirings, conditioning | pdf |
10 | pdf | pdf | 11/3/2014  | 11/4/2014  | Conditioning, hardness, LBP | pdf |
11 | pdf | pdf | 11/5/2014  | 11/9/2014  | LBP, exponential models | pdf |
12 | pdf | pdf | 11/10/2014 | 11/10/2014 | Exponential models, mean params and polytopes, tree outer bound | pdf |
13 | pdf | pdf | 11/12/2014 | 11/19/2014 | Polytopes, tree outer bound, Bethe entropy approx. | pdf |
14 | pdf | pdf | 11/17/2014 | 11/17/2014 | Bethe entropy approx., loop series | pdf |
15 | pdf | pdf | 11/19/2014 | 11/23/2014 | Hypergraphs, posets, Mobius, Kikuchi | pdf |
16 | pdf | pdf | 11/24/2014 | 11/26/2014 | Kikuchi, Expectation Propagation | pdf |
17 | pdf | pdf | 11/26/2014 | 12/1/2014  | Expectation Propagation, Mean Field | pdf |
18 | pdf | pdf | 12/1/2014  | 12/2/2014  | Structured mean field, convex relaxations and upper bounds, tree-reweighted case | pdf |
19 | pdf | pdf | 12/3/2014  | 12/3/2014  | Variational MPE, Graph Cut MPE, LP Relaxations | pdf |

Discussion Board

You can post questions, discussion topics, or general information at this link.


Relevant Books

There are many books available that discuss some of the material we are covering in this course. Some good books are listed below, but see the end of the lecture slides for books/papers relevant to each specific lecture. We will be following the first two books closely this term. Other very good books that might be useful to you are also included below.