Markov chains, conditional independence, Bayesian inference, the forward-backward algorithm, the Baum-Welch algorithm, the Viterbi algorithm; extensions: factorial hidden Markov models, hidden semi-Markov models, dynamic Bayesian networks.
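As an illustration of the forward-backward machinery listed above, the following is a minimal sketch of the (scaled) forward pass in Python; it assumes NumPy, and the two-state parameters A, B and pi are hypothetical examples, not course material.

```python
# A minimal sketch of the forward pass for an HMM; the two-state
# parameters below (A, B, pi) are hypothetical illustrations.
import numpy as np

def forward_loglik(obs, A, B, pi):
    """Log-likelihood of an observation sequence under an HMM,
    computed with per-step scaling to avoid numerical underflow."""
    alpha = pi * B[:, obs[0]]            # alpha_1(i) = pi_i * b_i(o_1)
    c = alpha.sum()                      # scaling factor
    alpha /= c
    log_lik = np.log(c)
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]    # sum_i alpha_{t-1}(i) a_ij, times b_j(o_t)
        c = alpha.sum()
        alpha /= c
        log_lik += np.log(c)
    return log_lik

# Hypothetical two-state example.
A  = np.array([[0.7, 0.3], [0.4, 0.6]])   # transition probabilities
B  = np.array([[0.9, 0.1], [0.2, 0.8]])   # emission probabilities
pi = np.array([0.6, 0.4])                 # initial distribution
print(forward_loglik([0, 1, 1, 0], A, B, pi))
```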
Project work (modelling, analysis) on an application of interest to the student.
This course presents an overview of the most important computational and modelling methods for HMMs and their extensions.
This course focuses primarily on the computational and modelling aspects and will not cover the asymptotic theory (ergodicity, etc.) of HMMs. Computer-aided project work with datasets forms the essential learning activity.
To pass the course, the student should be able to do the following:
to recognize situations where basic HMMs can be regarded as promising model candidates;
to recognize situations where the extensions of HMMs can be regarded as promising model candidates;
to implement the basic algorithms, with modifications suitable for the data at hand (see the Viterbi sketch after this list);
to implement algorithms for choosing the model family (state-space topology) of an HMM;
to know the main papers on HMMs;
to place HMMs in the general picture of statistical learning theory;
to write a technical report that describes, in concise technical prose, the work done in analysing, validating and testing an HMM for a given problem.
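One possible shape of such an implementation is the following sketch of the Viterbi algorithm in log space; the parameter conventions mirror the forward-pass sketch above, and the example values remain hypothetical.

```python
# A minimal log-space Viterbi decoder sketch; parameter conventions
# (A, B, pi) match the forward-pass sketch above and are assumptions,
# not course-provided code.
import numpy as np

def viterbi(obs, A, B, pi):
    """Most probable hidden-state path for an observation sequence."""
    T, n = len(obs), A.shape[0]
    logA, logB, logpi = np.log(A), np.log(B), np.log(pi)
    delta = logpi + logB[:, obs[0]]        # best log-prob of paths ending in each state
    psi = np.zeros((T, n), dtype=int)      # back-pointers
    for t in range(1, T):
        scores = delta[:, None] + logA     # scores[i, j]: best path into i, then i -> j
        psi[t] = scores.argmax(axis=0)
        delta = scores.max(axis=0) + logB[:, obs[t]]
    path = [int(delta.argmax())]           # backtrack from the best final state
    for t in range(T - 1, 0, -1):
        path.append(int(psi[t, path[-1]]))
    return path[::-1]

A  = np.array([[0.7, 0.3], [0.4, 0.6]])   # hypothetical parameters, as above
B  = np.array([[0.9, 0.1], [0.2, 0.8]])
pi = np.array([0.6, 0.4])
print(viterbi([0, 1, 1, 0], A, B, pi))
```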