Course contents
Information theory of discrete and continuous variables: entropy, Kraft inequality, relative entropy, entropy rate, redundancy rate, mutual information, the asymptotic equipartition property. Estimation of probability density and probability mass functions. Expectation-Maximization algorithm. Maximum entropy principle.
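As a small illustration of the entropy and mutual-information definitions covered here, a minimal Python sketch (function names are my own, not part of the course material):

```python
import math

def entropy(pmf):
    # Shannon entropy H(X) = -sum_x p(x) log2 p(x), in bits.
    # Zero-probability outcomes contribute nothing (0 log 0 := 0).
    return -sum(p * math.log2(p) for p in pmf if p > 0)

def mutual_information(joint):
    # I(X;Y) = H(X) + H(Y) - H(X,Y), from a joint pmf given as a 2-D list.
    px = [sum(row) for row in joint]          # marginal of X
    py = [sum(col) for col in zip(*joint)]    # marginal of Y
    pxy = [p for row in joint for p in row]   # flattened joint
    return entropy(px) + entropy(py) - entropy(pxy)
```

For example, a uniform pmf over four symbols has entropy 2 bits, and a joint pmf in which Y is a deterministic copy of a fair bit X gives I(X;Y) = 1 bit.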
Lossless coding: nonadaptive codes: Shannon, Huffman, and arithmetic codes. Universal and adaptive codes. Ziv-Lempel codes.
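A minimal sketch of Huffman's construction (one of the nonadaptive codes above), computing only the code lengths via the usual min-heap merge; the resulting lengths satisfy the Kraft inequality with equality:

```python
import heapq

def huffman_code_lengths(probs):
    # probs: dict symbol -> probability. Repeatedly merge the two least
    # probable subtrees; each merge adds one bit to every symbol inside.
    heap = [(p, i, (sym,)) for i, (sym, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    lengths = {sym: 0 for sym in probs}
    tiebreak = len(heap)  # unique counter so tuples never compare symbols
    while len(heap) > 1:
        p1, _, syms1 = heapq.heappop(heap)
        p2, _, syms2 = heapq.heappop(heap)
        for s in syms1 + syms2:
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, tiebreak, syms1 + syms2))
        tiebreak += 1
    return lengths
```

For a dyadic source such as {0.5, 0.25, 0.125, 0.125} the lengths equal the self-informations, so the average code length meets the entropy exactly.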
Rate-distortion theory: the rate-distortion function, Shannon lower bound, rate allocation across independent variables, reverse waterfilling, Blahut algorithm.
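Reverse waterfilling for independent Gaussian sources can be sketched in a few lines: bisect the water level &theta; so the per-component distortions D_i = min(&theta;, &sigma;_i&sup2;) meet the distortion budget, then spend rate R_i = &frac12; log2(&sigma;_i&sup2;/D_i) on each component (a numerical sketch, not the course's exact formulation):

```python
import math

def reverse_waterfill(variances, total_D, iters=60):
    # Bisect the water level theta so that sum_i min(theta, sigma_i^2) = total_D.
    lo, hi = 0.0, max(variances)
    for _ in range(iters):
        theta = 0.5 * (lo + hi)
        if sum(min(theta, v) for v in variances) < total_D:
            lo = theta  # too little distortion spent: raise the water level
        else:
            hi = theta
    D = [min(theta, v) for v in variances]
    # Components whose variance lies below the water level get zero rate.
    R = [0.5 * math.log2(v / d) if d < v else 0.0 for v, d in zip(variances, D)]
    return R, D
```

With variances (4, 1) and total distortion 2, the water level settles at 1: the weak component is not coded at all, and the strong one gets 1 bit.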
High-rate quantization: constrained-resolution and constrained-entropy quantization. Vector versus scalar quantization. Practical high-rate-theory-based quantizers: mixture and lattice quantizers, companding.
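Companding, one of the practical high-rate techniques listed above, can be illustrated with the standard &mu;-law characteristic (as in ITU-T G.711): compress the amplitude, quantize uniformly, then expand. A minimal sketch of the compressor/expander pair:

```python
import math

def mu_law_compress(x, mu=255.0):
    # Logarithmic compressor: small amplitudes are stretched so that a
    # uniform quantizer after compression acts like a nonuniform one.
    return math.copysign(math.log1p(mu * abs(x)) / math.log1p(mu), x)

def mu_law_expand(y, mu=255.0):
    # Exact inverse of the compressor.
    return math.copysign(math.expm1(abs(y) * math.log1p(mu)) / mu, y)
```

Without an intervening quantizer the pair is an identity, which is a convenient sanity check; with one, the effective step size grows with amplitude.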
Low-rate quantization: Lloyd training algorithm for constrained-resolution and constrained-entropy cases. Structured vector quantization (tree-structured, multi-stage, gain-shape, lattice). Fast search methods.
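The Lloyd iteration for the constrained-resolution case alternates two steps: assign each sample to its nearest codeword, then move each codeword to the centroid of its cell. A minimal scalar sketch (a toy version for intuition, not the course's vector implementation):

```python
def lloyd(data, codebook, iters=50):
    # Alternate nearest-codeword assignment and centroid update.
    for _ in range(iters):
        cells = [[] for _ in codebook]
        for x in data:
            j = min(range(len(codebook)), key=lambda i: (x - codebook[i]) ** 2)
            cells[j].append(x)
        # Empty cells keep their old codeword (a common practical fix).
        codebook = [sum(c) / len(c) if c else codebook[i]
                    for i, c in enumerate(cells)]
    return codebook
```

On two well-separated clusters the codewords converge to the cluster means in a single iteration.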
Transforms and filter banks: bases and frames. Fixed transforms: DFT, DCT, MLT, Gabor frames, the Balian-Low theorem. A-priori adaptation: Karhunen-Loève transform, a-priori energy concentration. A-posteriori adaptation: a-posteriori energy concentration, best-basis search, matching pursuit.
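The energy-concentration idea behind fixed transforms can be seen directly with an orthonormal DCT-II: for smooth inputs, almost all the energy lands in the first few coefficients, while Parseval guarantees the total is preserved. A minimal sketch:

```python
import math

def dct2(x):
    # Orthonormal DCT-II: X[k] = a_k * sum_n x[n] cos(pi/N * (n + 0.5) * k),
    # with a_0 = sqrt(1/N) and a_k = sqrt(2/N) otherwise (so it preserves energy).
    N = len(x)
    out = []
    for k in range(N):
        a = math.sqrt(1.0 / N) if k == 0 else math.sqrt(2.0 / N)
        out.append(a * sum(x[n] * math.cos(math.pi / N * (n + 0.5) * k)
                           for n in range(N)))
    return out
```

A constant signal puts all its energy into the DC coefficient, and a linear ramp already has well over 90% of its energy in the first two coefficients.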
Linear prediction: closed-loop prediction, noise shaping, analysis-by-synthesis, spectral flatness, Kolmogorov's formula, redundancy, forward and backward prediction.
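The simplest instance of forward prediction is the order-1 case: the optimal coefficient is the normalized lag-1 autocorrelation, and the residual is what remains to be coded. A toy autocorrelation-method sketch (the function name is my own):

```python
def lpc_order1(x):
    # Optimal first-order forward predictor: minimize E[(x[n] - a*x[n-1])^2]
    # gives a = r(1) / r(0), with r the sample autocorrelation.
    r0 = sum(v * v for v in x)
    r1 = sum(x[n] * x[n - 1] for n in range(1, len(x)))
    a = r1 / r0
    residual = [x[n] - a * x[n - 1] for n in range(1, len(x))]
    return a, residual
```

The more predictable (less spectrally flat) the signal, the smaller the residual energy relative to the input, which is exactly the redundancy removed by the predictor.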