Intended learning outcomes
After passing this course, participants should be able to
- describe and use the principles of information theory, such as entropy, mutual information, the asymptotic equipartition property, the data processing inequality, prefix codes, the Kraft inequality, noiseless source coding, maximum entropy, rate distortion, noisy source coding, the Shannon lower bound, the backward channel, reverse waterfilling, and energy concentration, to develop source coding algorithms,
- develop source coding schemes for lossless coding, such as Huffman coding, arithmetic coding, Lempel-Ziv coding, and universal source coding,
- develop source coding schemes for lossy coding, such as scalar and vector quantization, Lloyd-Max quantization, entropy-constrained quantization, high-rate quantization, transform coding, and predictive coding,
- implement (for example in MATLAB) and assess the developed source coding schemes and algorithms,
- explain coding design choices using the principles of information theory,
- develop source coding schemes for a given source coding problem,
- model and assess source coding schemes using the principles of information theory,
- analyze given source coding problems, identify and explain the challenges, propose possible solutions, and explain the chosen design.
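As a taste of the lossless-coding outcomes above, the following is a minimal sketch (not course material; the example string and symbol names are arbitrary) of a Huffman code whose average codeword length is checked against the source entropy, as the noiseless source coding theorem predicts:

```python
import heapq
from collections import Counter
from math import log2

def entropy(probs):
    """Shannon entropy H(X) = -sum p*log2(p), in bits per symbol."""
    return -sum(p * log2(p) for p in probs if p > 0)

def huffman_lengths(freqs):
    """Return a {symbol: codeword length} map built by Huffman's algorithm."""
    # Each heap entry: (weight, tiebreaker, {symbol: current depth in tree}).
    heap = [(w, i, {s: 0}) for i, (s, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        # Merge the two least probable subtrees; all their symbols go one level deeper.
        w1, _, d1 = heapq.heappop(heap)
        w2, _, d2 = heapq.heappop(heap)
        merged = {s: d + 1 for s, d in {**d1, **d2}.items()}
        heapq.heappush(heap, (w1 + w2, counter, merged))
        counter += 1
    return heap[0][2]

text = "abracadabra"          # toy source sequence, chosen arbitrarily
freqs = Counter(text)
n = len(text)
lengths = huffman_lengths(freqs)
avg_len = sum(freqs[s] * lengths[s] for s in freqs) / n
H = entropy([c / n for c in freqs.values()])
# Noiseless source coding theorem for prefix codes: H <= avg_len < H + 1
print(f"H = {H:.3f} bits/symbol, average Huffman length = {avg_len:.3f} bits/symbol")
```

The printed values illustrate the bound from the source coding theorem: the Huffman code's expected length lies within one bit of the entropy.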
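Similarly, the lossy-coding outcomes can be illustrated by a short sketch of one-dimensional Lloyd-Max quantizer design via Lloyd's algorithm (the sample distribution, number of levels, and iteration count below are arbitrary illustrations, not course material):

```python
import random

def lloyd_max(samples, levels, iters=50):
    """One-dimensional Lloyd-Max quantizer design (Lloyd's algorithm):
    alternate nearest-level partitioning with centroid (conditional-mean) updates."""
    # Initialize reproduction levels at evenly spaced sample quantiles.
    pts = sorted(samples)
    step = len(pts) // levels
    reps = [pts[i * step + step // 2] for i in range(levels)]
    for _ in range(iters):
        # Partition step: assign each sample to its nearest reproduction level.
        cells = [[] for _ in range(levels)]
        for x in samples:
            j = min(range(levels), key=lambda k: (x - reps[k]) ** 2)
            cells[j].append(x)
        # Centroid step: move each level to the mean of its cell (keep it if empty).
        reps = [sum(c) / len(c) if c else r for c, r in zip(cells, reps)]
    return reps

random.seed(0)
data = [random.gauss(0, 1) for _ in range(2000)]  # unit-variance Gaussian source
reps = lloyd_max(data, 4)
mse = sum(min((x - r) ** 2 for r in reps) for x in data) / len(data)
print("reproduction levels:", [round(r, 3) for r in reps], " MSE:", round(mse, 4))
```

For a unit-variance Gaussian source, the resulting 4-level quantizer's mean squared error should land near the known Lloyd-Max value of about 0.12, well below the 0.25 achieved by naive 2-bit uniform strategies.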
To achieve higher grades, participants should also be able to
- solve more advanced problems in all areas mentioned above.