# DD2420 Probabilistic Graphical Models 7.5 credits

This is an advanced course with a flexible set of activities that allows students to choose between current research applications and theoretical topics to explore. Probabilistic graphical models are a foundation for understanding many methods in artificial intelligence, machine learning, and estimation.

In AI, decisions are made by computers that improve with learning. To do that, programs must perform inference of estimated probabilities given some evidence. That inference can be intractable. The methods taught in this course allow the student to formulate the AI problem and perform both exact and approximate inference.

Machine learning provides algorithms for solving problems by using training data. This course will give insight into how to formulate problems so that machine learning can be used effectively. While end-to-end learning using generic machine learning methods is a successful approach, data is often limited. Building good models can help learn with less data by constraining the learning space.

Bayesian models are at the heart of most estimation methods. Formulating these models is the first step in developing an estimation algorithm. The estimation itself is in many cases just inference on the model given some evidence. Approximate inference techniques such as those covered in this course are important for solving many very hard estimation problems in science and engineering.

## Application

### For course offering

Spring 2025 PGM25 programme students

### Application code

60212

Headings with content from the Course syllabus DD2420 (Autumn 2024–) are denoted with an asterisk (*)

## Content and learning outcomes

### Course contents

Graph representations: discriminative and generative models, Bayesian nets (DAG), undirected graphical models (MRF/factor graphs), exponential distributions, D-separation, Markov blanket.
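
To make the first topic concrete, here is a minimal sketch (illustrative, not course material): a two-node Bayesian net A → B, showing how the DAG's factorization P(A, B) = P(A)·P(B | A) assembles the full joint from local conditional tables. All numbers are made up for the example.

```python
# Two-node Bayesian net A -> B: the joint factorizes as P(A, B) = P(A) * P(B | A).
p_a = {0: 0.7, 1: 0.3}                       # P(A)
p_b_given_a = {0: {0: 0.9, 1: 0.1},          # P(B | A=0)
               1: {0: 0.2, 1: 0.8}}          # P(B | A=1)

# Build the joint distribution from the factorization.
joint = {(a, b): p_a[a] * p_b_given_a[a][b]
         for a in (0, 1) for b in (0, 1)}

# Marginalizing B back out recovers P(A), as the factorization guarantees.
p_a_marginal = {a: sum(joint[(a, b)] for b in (0, 1)) for a in (0, 1)}
print(joint)
print(p_a_marginal)
```

The same idea scales to larger DAGs: the joint is always the product of one conditional table per node given its parents.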

Exact inference: message passing, variable elimination, factor graphs from DAGs, clique graphs/trees, inference with evidence, the junction tree algorithm, etc.
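
As a hedged sketch of variable elimination (names and numbers are illustrative): on a chain A → B → C, eliminating A produces an intermediate factor over B, and eliminating B then yields the marginal P(C).

```python
# Chain A -> B -> C with binary variables; rows of each table are indexed
# by the conditioning variable.
p_a = [0.6, 0.4]                              # P(A)
p_b_given_a = [[0.7, 0.3], [0.1, 0.9]]        # P(B | A)
p_c_given_b = [[0.8, 0.2], [0.5, 0.5]]        # P(C | B)

# Eliminate A: tau(b) = sum_a P(a) * P(b | a)
tau_b = [sum(p_a[a] * p_b_given_a[a][b] for a in range(2)) for b in range(2)]

# Eliminate B: P(c) = sum_b tau(b) * P(c | b)
p_c = [sum(tau_b[b] * p_c_given_b[b][c] for b in range(2)) for c in range(2)]
print(p_c)  # a valid distribution over C
```

The elimination order matters only for efficiency, not correctness; on trees this is exactly the computation that message passing organizes.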

Approximate inference: loopy belief propagation, the Monte Carlo principle, Markov chain Monte Carlo (MCMC), variational methods, MAP inference, etc.
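
The Monte Carlo principle can be illustrated with a tiny Metropolis sampler (a simple MCMC method) targeting a standard normal; this is a toy sketch, not an implementation from the course, and note that the chain only needs the target density up to a normalizing constant.

```python
import math
import random

random.seed(0)

def log_target(x):
    # log of N(0, 1) up to an additive constant
    return -0.5 * x * x

samples, x = [], 0.0
for _ in range(20000):
    x_prop = x + random.gauss(0.0, 1.0)       # symmetric random-walk proposal
    # Accept with probability min(1, pi(x') / pi(x)), done in log space.
    if math.log(random.random()) < log_target(x_prop) - log_target(x):
        x = x_prop
    samples.append(x)

burned = samples[2000:]                        # discard burn-in
mean = sum(burned) / len(burned)
print(mean)  # approaches the true mean 0 as the chain grows
```

The empirical mean of the chain approximates the expectation under the target, which is the Monte Carlo principle in one line: replace an integral with an average over samples.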

Learning: parameter estimation, the maximum likelihood method, conjugate priors, Gaussian, Beta, and Dirichlet distributions, partially observed data, the gradient ascent method, Expectation Maximization (EM), etc.
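
For the learning topics, a minimal sketch of maximum likelihood versus a conjugate Beta prior for a Bernoulli parameter θ (the data and the Beta(2, 2) prior are made-up assumptions): with h heads in n flips, the posterior is Beta(α + h, β + n − h).

```python
data = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]      # toy coin flips (illustrative)
n, h = len(data), sum(data)

theta_mle = h / n                           # maximum likelihood estimate

alpha, beta = 2.0, 2.0                      # Beta(2, 2) prior (assumption)
# Conjugacy: posterior is Beta(alpha + h, beta + n - h).
theta_map = (alpha + h - 1) / (alpha + beta + n - 2)    # posterior mode
theta_mean = (alpha + h) / (alpha + beta + n)           # posterior mean

print(theta_mle, theta_map, theta_mean)
```

The prior pulls the estimate toward 1/2; as n grows, all three estimates converge, which is why conjugate priors are a cheap way to regularize learning from little data.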

### Intended learning outcomes

After passing the course, the student should be able to

• explain and discuss how different graphs represent both factorization and independence relations
• explain and discuss exact inference in graphical models
• use message passing algorithms for inference
• explain and discuss methods for learning uncertainties in a model's parameters
• explain and discuss approximate inference methods such as sampling, loopy belief propagation and variational methods.

Students can obtain higher grades by explaining how the methods above can be used to solve specific problems. The highest grade can be obtained by explaining complex real research that uses these methods.

## Literature and preparations

### Specific prerequisites

Completed courses in all of the following fields:

• Knowledge and skills in programming equivalent to completed course DD1310-DD1319/DD1331/DD1337/DD100N/ID1018.
• Knowledge in linear algebra equivalent to completed course SF1624/SF1672/SF1684.
• Knowledge in multivariable calculus equivalent to completed course SF1626/SF1674.
• Knowledge in probability and statistics equivalent to completed course SF1910-SF1924/SF1935.
• Knowledge in basic machine learning equivalent to completed course DD1420/DD2421.

Active participation in a course offering of DD1420/DD2421 where the final examination is not yet reported in LADOK is considered equivalent to completion of the course.
Registering for a course is counted as active participation.
The term 'final examination' encompasses both the regular examination and the first re-examination.

### Recommended prerequisites

SF1625 One Variable Calculus;

SF1901 Probability and Statistics;

either DD2421 Machine Learning or DD2434 Machine Learning, Advanced Course;

Programming in MATLAB and Python.

### Equipment

No information inserted

### Literature

No information inserted

## Examination and completion

If the course is discontinued, students may request to be examined during the following two academic years.

### Grading scale

A, B, C, D, E, FX, F

### Examination

• KON1 - Written partial exams, 2.5 credits, grading scale: P, F
• OVN1 - Exercises, 2.5 credits, grading scale: P, F
• OVN2 - Exercises, 2.5 credits, grading scale: P, F

Based on recommendation from KTH’s coordinator for disabilities, the examiner will decide how to adapt an examination for students with documented disability.

The examiner may apply another examination format when re-examining individual students.

Re-examination of all written partial exams (KON1) is offered during the examination period at the end of the course.

The final grade is based on the student's combined performance on OVN1, OVN2 and KON1.

### Opportunity to complete the requirements via supplementary examination

No information inserted

### Opportunity to raise an approved grade via renewed examination

No information inserted

### Ethical approach

• All members of a group are responsible for the group's work.
• In any assessment, every student shall honestly disclose any help received and sources used.
• In an oral assessment, every student shall be able to present and answer questions about the entire assignment and solution.

## Further information

### Course room in Canvas

Registered students find further information about the implementation of the course in the course room in Canvas. A link to the course room can be found under the tab Studies in the Personal menu at the start of the course.

### Main field of study

Computer Science and Engineering

### Education cycle

Second cycle

### Add-on studies

DD2424 Deep learning in data science

DD2432 Artificial Neural Networks and Other Learning Systems

DD2423 Image Analysis and Computer Vision

DD2425 Robotics and Autonomous Systems

DD2429 Computational Photography

EL2320 Applied Estimation

### Contact

John Folkesson johnf@kth.se

### Transitional regulations

The previous modules PRO1, PRO2 are replaced by OVN1, OVN2.

The previous modules TEN1 and TENT are replaced by KON1.

### Supplementary information

This course cannot be counted in the degree if the student has taken DD2447.

In this course, the EECS code of honor applies, see:
http://www.kth.se/en/eecs/utbildning/hederskodex