DD2420 Probabilistic Graphical Models 7.5 credits

This is a relatively advanced course with a flexible set of activities that allow students to choose between current research applications and theoretical topics to explore. Probabilistic graphical models are a foundation for understanding many methods in artificial intelligence, machine learning and estimation.
In AI, decisions are made by programs that improve through learning. To do so, they must infer probabilities given some evidence, and that inference can be intractable. The methods taught in this course allow the student to formulate such problems and perform both exact and approximate inference.
Machine learning provides algorithms for solving problems using training data. This course will give insight into how to formulate problems so that machine learning can be applied effectively. While end-to-end learning with generic machine learning methods is a successful approach, data is often limited; building good models can help learn from less data by constraining the learning space.
Bayesian models are at the heart of most estimation methods. Formulation of these models is the first step in developing an estimation algorithm. The estimation itself is in many cases just inference on the model given some evidence. Approximate inference techniques, such as those covered in this course, are important in solving many very hard estimation problems in science and engineering.
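As a concrete illustration of inference on a model given evidence, here is a minimal Python sketch: a toy two-variable model (a hidden condition and a noisy test) whose posterior is computed with Bayes' rule. The numbers are illustrative assumptions, not course material.

    # Assumed toy numbers for a hidden condition and a noisy test.
    prior = 0.01           # P(condition)
    p_pos_given_c = 0.95   # P(positive | condition)
    p_pos_given_nc = 0.05  # P(positive | no condition)

    # Marginal likelihood of the evidence, summing out the hidden variable.
    p_pos = p_pos_given_c * prior + p_pos_given_nc * (1 - prior)

    # Bayes' rule: posterior over the hidden variable given the evidence.
    posterior = p_pos_given_c * prior / p_pos
    print(f"P(condition | positive) = {posterior:.3f}")  # 0.161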
Content and learning outcomes
Course contents
The main contents of the course are:
- Graph representations: discriminative and generative models, Bayesian networks (DAGs), undirected graphical models (MRFs/factor graphs), exponential distributions, D-separation, the Markov blanket.
- Exact inference: message passing, variable elimination, factor graphs from DAGs, clique graphs/trees, inference with evidence, the junction tree algorithm, etc. (a message-passing sketch follows this list).
- Approximate inference: loopy belief propagation, the Monte Carlo principle, Markov chain Monte Carlo (MCMC), variational methods, MAP inference, etc.
- Learning: parameter estimation, the maximum likelihood method, conjugate priors, the Gaussian, Beta and Dirichlet distributions, partially observed data, gradient ascent, Expectation Maximization (EM), etc. (a conjugate-prior sketch follows this list).
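To make the exact-inference item concrete, below is a minimal sketch (an assumed example, not course code) of sum-product message passing on a chain of three binary variables X1 - X2 - X3. The potentials are made up; on a chain, the forward messages coincide with variable elimination.

    import numpy as np

    # Assumed toy potentials on a chain X1 - X2 - X3 of binary variables.
    phi1 = np.array([0.6, 0.4])          # unary potential phi(X1)
    psi12 = np.array([[0.9, 0.1],
                      [0.2, 0.8]])       # pairwise potential psi(X1, X2)
    psi23 = np.array([[0.7, 0.3],
                      [0.4, 0.6]])       # pairwise potential psi(X2, X3)

    # Sum-product messages passed forward along the chain:
    m12 = phi1 @ psi12      # sum over x1 of phi(x1) * psi(x1, x2)
    m23 = m12 @ psi23       # sum over x2 of m12(x2) * psi(x2, x3)

    print(m23 / m23.sum())  # marginal P(X3), here [0.586, 0.414]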
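Similarly, the conjugate-prior idea from the learning item fits in a few lines: with a Beta prior on a coin's bias and Bernoulli observations, the posterior is again a Beta distribution, so learning reduces to a closed-form count update. The prior and data below are illustrative assumptions.

    # Assumed toy data: 7 heads and 3 tails from a coin with unknown bias.
    heads, tails = 7, 3
    a, b = 2.0, 2.0                    # Beta(a, b) prior pseudo-counts

    # Beta is conjugate to the Bernoulli likelihood, so the posterior is
    # Beta(a + heads, b + tails): learning is just adding counts.
    a_post, b_post = a + heads, b + tails
    print(f"{a_post / (a_post + b_post):.3f}")  # posterior mean, 0.643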
Intended learning outcomes
After passing the course, the student should be able to
- explain and discuss how different graphs represent both factorization and independence relations
- explain and discuss exact inference in graphical models
- use message passing algorithms for inference
- explain and discuss methods for learning uncertainties in a model's parameters
- explain and discuss approximate inference methods such as sampling, loopy belief propagation and variational methods.
Students can obtain higher grades by explaining how the methods above can be used to solve specific problems. The highest grade can be obtained by explaining complex, real research in terms of these methods.
Course disposition
Literature and preparations
Specific prerequisites
Completed courses in all of the following fields:
- Programming, equivalent to DD1310/DD1311/DD1312/DD1314/DD1315/DD1316/DD1318/DD1331/DD1337/DD100N/ID1018.
- Algebra and Geometry, equivalent to SF1624.
- Multivariable calculus, equivalent to SF1626.
- Probability and Statistics, equivalent to SF1901.
- Basic machine learning, equivalent to DD2421.
Active participation in a course offering where the final examination is not yet reported in LADOK is considered equivalent to completion of the course.
Registering for a course is counted as active participation.
The term 'final examination' encompasses both the regular examination and the first re-examination.
Recommended prerequisites
SF1625 Calculus in One Variable;
SF1901 Probability and Statistics;
either DD2421 Machine Learning or DD2434 Machine Learning, Advanced Course;
programming in MATLAB and Python.
Equipment
Literature
Examination and completion
If the course is discontinued, students may request to be examined during the following two academic years.
Grading scale
Examination
- OVN1 - Exercises, 2.5 credits, grading scale: P, F
- OVN2 - Exercises, 2.5 credits, grading scale: P, F
- TENT - Written exam, 2.5 credits, grading scale: P, F
Based on recommendation from KTH’s coordinator for disabilities, the examiner will decide how to adapt an examination for students with documented disability.
The examiner may apply another examination format when re-examining individual students.
Opportunity to complete the requirements via supplementary examination
Opportunity to raise an approved grade via renewed examination
Examiner
Ethical approach
- All members of a group are responsible for the group's work.
- In any assessment, every student shall honestly disclose any help received and sources used.
- In an oral assessment, every student shall be able to present and answer questions about the entire assignment and solution.
Further information
Course web
Further information about the course can be found on the Course web at the link below. Information on the Course web will later be moved to this site.
Course web DD2420
Offered by
Main field of study
Education cycle
Add-on studies
DD2434 Machine Learning, advanced course
DD2424 Deep learning in data science
DD2432 Artificial Neural Networks and Other Learning Systems
DD2423 Image Analysis and Computer Vision
DD2425 Robotics and Autonomous Systems
DD2429 Computational Photography
EL2320 Applied Estimation
Contact
Transitional regulations
The earlier test parts PRO1, PRO2 and TEN1 have been replaced by OVN1, OVN2 and TENT respectively.
Supplementary information
This course cannot be counted in the degree if the student has taken DD2447.
In this course, the EECS code of honor applies, see:
http://www.kth.se/en/eecs/utbildning/hederskodex