
DD2420 Probabilistic Graphical Models 7.5 credits

Course memo Spring 2023-60023

Version 1 – 08/17/2022, 11:00:49 AM

Course offering

PGM23 (Start date 17/01/2023, English)

Language Of Instruction

English

Offered By

EECS/Intelligent Systems

Course memo Spring 2023

Course presentation

This is an advanced course with a flexible set of activities that allows students to choose between current research applications and theoretical topics to explore. Probabilistic graphical models are a foundation for understanding many methods in artificial intelligence, machine learning and estimation.

In AI, decisions are made by programs that improve with learning. To do that, a program must infer the probabilities of unknown quantities given some evidence, and that inference can be intractable. The methods taught in this course allow the student to formulate such problems and perform both exact and approximate inference.
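
As a minimal sketch of what such inference looks like, consider a toy two-variable Bayesian network in which rain causes wet grass (the model and all numbers are made up for illustration and are not part of the course material):

    # Toy Bayesian network Rain -> WetGrass (illustrative numbers only).
    P_rain = {True: 0.2, False: 0.8}
    P_wet_given_rain = {True: 0.9, False: 0.2}  # P(WetGrass=true | Rain)

    # Evidence: the grass is wet. Query: P(Rain=true | WetGrass=true).
    # Enumerate the joint over Rain and normalize over the evidence.
    unnorm = {r: P_rain[r] * P_wet_given_rain[r] for r in (True, False)}
    posterior = unnorm[True] / sum(unnorm.values())
    print(posterior)  # 0.18 / (0.18 + 0.16) = 0.529...

Enumeration of this kind grows exponentially with the number of variables, which is why both cleverer exact methods and approximate methods are needed.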

Machine learning provides algorithms for solving problems using training data. This course gives insight into how to formulate problems so that machine learning can be used effectively. While end-to-end learning with generic machine learning methods is a successful approach, data is often limited. Building good models can reduce the amount of data needed by constraining the learning space.

Bayesian models are at the heart of most estimation methods. Formulation of these models is the first step in developing an estimation algorithm. The estimation itself is in many cases just inference on the model given some evidence. Approximate inference techniques such as those covered in this course are important in solving many very hard estimation problems in science and engineering.
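
As a minimal sketch of estimation-as-inference (toy numbers, not course material), consider estimating the bias p of a coin with a conjugate Beta prior; the estimate is obtained simply by conditioning the model on the observed evidence:

    # Bayesian estimation of a coin's bias p with a conjugate Beta(a, b) prior.
    # Observing h heads and t tails yields the posterior Beta(a + h, b + t).
    a, b = 1.0, 1.0  # Beta(1, 1), i.e. a uniform prior over p
    h, t = 7, 3      # evidence: 7 heads and 3 tails (made-up data)
    a_post, b_post = a + h, b + t
    print(a_post / (a_post + b_post))  # posterior mean 8/12 = 0.666...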

Headings denoted with an asterisk ( * ) are retrieved from the course syllabus version Spring 2021

Content and learning outcomes

Course contents

The main contents of the course are:

Graph representations: discriminative and generative models, Bayesian nets (DAG), undirected graphical models (MRF/factor graphs), exponential distributions, D-separation, Markov blanket.

Exact inference: message passing, variable elimination (a small sketch follows this list), factor graphs from DAGs, clique graphs/trees, inference with evidence, the junction tree algorithm, etc.

Approximate inference: loopy belief propagation, the Monte Carlo principle, Markov Chain Monte Carlo (MCMC), variational methods, MAP inference, etc.

Learning: parameter estimation, the maximum likelihood method, conjugate priors, Gaussian, Beta and Dirichlet distributions, partially observed data, gradient ascent, Expectation Maximization (EM), etc.
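
As an informal preview of the exact-inference material (a toy model with made-up numbers, not an example from the course book), variable elimination computes a marginal by summing out one variable at a time instead of enumerating the full joint:

    # Variable elimination on a binary chain A -> B -> C (toy numbers).
    P_A = {0: 0.6, 1: 0.4}
    P_B_given_A = {(0, 0): 0.7, (0, 1): 0.3, (1, 0): 0.2, (1, 1): 0.8}  # key (a, b)
    P_C_given_B = {(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.5, (1, 1): 0.5}  # key (b, c)

    # Eliminate A: tau(b) = sum_a P(a) P(b|a), the "message" passed towards B.
    tau = {b: sum(P_A[a] * P_B_given_A[a, b] for a in (0, 1)) for b in (0, 1)}

    # Eliminate B: P(c) = sum_b tau(b) P(c|b).
    P_C = {c: sum(tau[b] * P_C_given_B[b, c] for b in (0, 1)) for c in (0, 1)}
    print(P_C)  # approximately {0: 0.7, 1: 0.3}

On a chain the cost grows linearly with the number of variables, whereas naive enumeration grows exponentially.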

Intended learning outcomes

After passing the course, the student should be able to

  • explain and discuss how different graphs represent both factorization and independence relations
  • explain and discuss exact inference in graphical models
  • use message passing algorithms for inference
  • explain and discuss methods for learning uncertainties in a model's parameters
  • explain and discuss approximate inference methods such as sampling, loopy belief propagation and variational methods (a small sampling sketch follows below).
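
As a small sampling sketch (reusing the toy rain model from the course presentation above; illustrative only, not course material), rejection sampling approximates a posterior by drawing joint samples and keeping only those consistent with the evidence:

    import random

    # Rejection sampling for P(Rain=true | WetGrass=true) in the toy model above.
    random.seed(0)
    kept = rainy = 0
    for _ in range(100_000):
        rain = random.random() < 0.2                    # sample Rain
        wet = random.random() < (0.9 if rain else 0.2)  # sample WetGrass given Rain
        if wet:                     # keep only samples that match the evidence
            kept += 1
            rainy += rain
    print(rainy / kept)  # close to the exact answer 0.529...

More sample-efficient schemes such as MCMC are among the approximate inference methods covered in the course.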

Students can obtain higher grades by explaining how the methods above can be used to solve specific problems. The highest grade can be obtained by explaining complex, real research that uses these methods.

Learning activities

The course consists of a series of 8 lectures in which the core concepts are covered. Each lecture has required reading, to be completed before the lecture, from the

Course book:

Probabilistic Graphical Models: Principles and Techniques, by Daphne Koller and Nir Friedman.

That basic knowledge is assessed through four homework assignments (HW A-D), each examined orally.

In addition, there are three required tutorials that serve both as teaching and examination covering the first three learning goals.

There is no final exam.

To obtain a grade above E, one can choose to do up to 4 additional tutorials from a selection of 7. These vary in form and level of difficulty. All tutorials require a written report, uploaded online, and an individual oral examination. Some have separate theory and practical parts, with the theory covered in a separate seminar in which the assignment is discussed in groups.

The tutorials give deeper knowledge and practical examples. Topics include: Factor Graphs (SLAM); Imitation Learning; Partially Observed Data; Markov Chain Monte Carlo; Variational Inference on Gaussian Mixture Models; Latent Dirichlet Allocation; and Variational Inference with Sequential Monte Carlo.

Doing these tutorials will require that the student actively seeks to fill any gaps in knowledge.  Some of the topics are quite advanced and one must read the referenced material in order to understand the assignment fully.  Passive students will struggle to do the advanced tutorials.

Detailed plan


The lectures are coupled to the HWs and tutorials according to this table:

 

Lectures | HW (tutorial prerequisite) | Tutorials | Learning goals | Required for passing
1-3 | - | 1 Message Passing | Representations; Exact Inference | Tutorial 1
4 | - | 2 Bayes Nets; 3 Conditional Random Fields | Representations; Exact Inference; Learning | Tutorials 2 and 3
5 | HW A - Learning | 4 GraphSLAM; 6 Imitation Learning | Learning | HW A
5 | HW B - Partially Observed Data | 5 Partially Observed Data | Learning | HW B
6 | HW C - Monte Carlo Methods | 7 Markov Chain Monte Carlo | Approximate Inference | HW C
7 | HW D - Variational Inference | 8 Variational Inference on Gaussian Mixture Models; 9 Latent Dirichlet Allocation; 10 Variational Inference with Sequential Monte Carlo | Approximate Inference; Learning | HW D

Each table row corresponds to a week in which the lectures are given. Typically the HWs can be done the same week and the tutorials the following week.

Lectures:

The recommended reading should be done in preparation for each lecture. 

 

Preparations before course start

Recommended prerequisites

SF1625 Calculus in One Variable;

SF1901 Probability Theory and Statistics;

either DD2421 Machine Learning or DD2434 Machine Learning, Advanced Course;

Programming in MATLAB and Python.

Literature

Probabilistic Graphical Models: Principles and Techniques, by Daphne Koller and Nir Friedman.

Examination and completion

Grading scale

A, B, C, D, E, FX, F

Examination

  • OVN1 - Exercises, 2.5 credits, Grading scale: P, F
  • OVN2 - Exercises, 2.5 credits, Grading scale: P, F
  • TENT - Written exam, 2.5 credits, Grading scale: P, F

Based on recommendation from KTH’s coordinator for disabilities, the examiner will decide how to adapt an examination for students with documented disability.

The examiner may apply another examination format when re-examining individual students.

The section below is not retrieved from the course syllabus:


Ethical approach

  • All members of a group are responsible for the group's work.
  • In any assessment, every student shall honestly disclose any help received and sources used.
  • In an oral assessment, every student shall be able to present and answer questions about the entire assignment and solution.

