
Schedule

Preliminary schedule

Homework problems in parentheses are additional problems for FEL3202.

  1. Introduction (Friday 15/5, 15-17). Updated slides. Chapters 1-2 in Lecture Notes (LN). Chapters 1-2 in Ljung.
    • Signals and systems
    • The basic problem
    • Some examples
    • Introduction to parameter estimation
    • Some pitfalls
    • HW: 1.1a-d (1.1e), 2.1 (2.2, 2.5). Deadline: Tuesday 26/5.
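    • Sketch (Python, my own illustration; the model and numbers are assumptions, not course material): parameter estimation by least squares for a static gain y(t) = theta*u(t) + e(t).

      import numpy as np

      rng = np.random.default_rng(0)
      N, theta_true = 200, 2.5
      u = rng.standard_normal(N)                          # input signal
      y = theta_true * u + 0.1 * rng.standard_normal(N)   # noisy output

      # Least squares: theta_hat = (u^T y) / (u^T u)
      theta_hat = (u @ y) / (u @ u)
      print(f"true theta = {theta_true}, estimate = {theta_hat:.3f}")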
  2. Probabilistic models (Tuesday 19/5, 10-12). Updated slides. Chapter 3 in LN. Chapter 4 in Ljung.
    • Models and model structures
    • Estimators
    • A probabilistic toolshed
    • HW: 3.3a-f (g, h), 3.4a (b, c). Deadline: Friday 29/5.
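    • Sketch (Python, my own illustration): maximum-likelihood estimation of the mean and variance of i.i.d. Gaussian data; note that the ML variance estimate divides by N and is therefore biased.

      import numpy as np

      rng = np.random.default_rng(1)
      x = rng.normal(loc=1.0, scale=2.0, size=1000)

      mu_ml = x.mean()                    # ML estimate of the mean
      var_ml = ((x - mu_ml) ** 2).mean()  # ML estimate of the variance (divides by N)
      print(f"mu_ml = {mu_ml:.3f}, var_ml = {var_ml:.3f} (true values 1.0 and 4.0)")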
  3. Probabilistic models continued, and estimation theory (Tuesday 26/5, 10-12). Sections 3.4.7-4.2 in LN. Chapter 3 in Ljung.
    • A probabilistic toolshed continued
      • Stationary processes
      • Wide-sense stationarity
      • Quasi-stationarity
      • Frequency domain characterization
      • A swatch of building blocks
    • Estimation theory
      • Information content in random variables
      • Estimation of random variables
    • HW: 4.1, 4.3, 4.7f,h (3.6, 4.5, 4.7g). Deadline: see the Excel file.
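    • Sketch (Python, my own illustration): the sample autocovariance of a stationary AR(1) process x(t) = a*x(t-1) + e(t) compared with the theoretical value a^k/(1 - a^2).

      import numpy as np

      rng = np.random.default_rng(2)
      a, N = 0.8, 100_000
      e = rng.standard_normal(N)
      x = np.zeros(N)
      for t in range(1, N):
          x[t] = a * x[t - 1] + e[t]

      for k in range(4):
          r_hat = np.mean(x[:N - k] * x[k:])  # sample autocovariance at lag k
          r_true = a ** k / (1 - a ** 2)      # theoretical stationary value
          print(f"lag {k}: estimate {r_hat:.3f}, theory {r_true:.3f}")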
  4. Wold decomposition and unbiased parameter estimation (Friday 29/5, 15-17). Sections 4.3-5.9 in LN. Chapter 7 in Ljung.
    • Wold decomposition
      • Linearly regular processes
      • Wold decomposition
      • Multivariable considerations
      • Spectral distribution function
      • Spectral factorization
      • Full rank processes
    • Unbiased parameter estimation
      • The Cramér-Rao lower bound
      • Efficient estimators
      • The maximum likelihood estimator
      • Data compression
      • Uniform minimum variance unbiased estimators
      • Best linear unbiased estimator (BLUE)
    • HW: 5.1, 5.5, 5.7 (4.7, 5.3).
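    • Sketch (Python, my own illustration): Monte Carlo check that the sample mean attains the Cramér-Rao lower bound sigma^2/N for the mean of i.i.d. Gaussian data, i.e. it is an efficient estimator.

      import numpy as np

      rng = np.random.default_rng(3)
      sigma, N, runs = 2.0, 50, 20_000

      # Each row is one experiment; the estimator is the sample mean.
      estimates = rng.normal(0.0, sigma, size=(runs, N)).mean(axis=1)
      print(f"empirical variance: {estimates.var():.5f}")
      print(f"CRLB sigma^2 / N:   {sigma**2 / N:.5f}")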
  5. Biased parameter estimation (Tuesday 2/6, 10-12). Chapter 6 in LN.
    • The bias-variance trade-off
    • The Cramér-Rao lower bound
    • Average risk minimization
    • Minimax estimation
    • Pointwise risk minimization
    • HW: 6.2a,b (6.2c, 6.3).
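    • Sketch (Python, my own illustration): the bias-variance trade-off for a shrunk sample mean c*xbar; with c < 1 the estimator is biased but can have lower mean-squared error.

      import numpy as np

      rng = np.random.default_rng(4)
      theta, sigma, N, runs = 1.0, 3.0, 10, 50_000

      xbar = rng.normal(theta, sigma, size=(runs, N)).mean(axis=1)
      for c in (1.0, 0.8, 0.5):
          mse = np.mean((c * xbar - theta) ** 2)  # empirical MSE of c * xbar
          print(f"c = {c:.1f}: empirical MSE = {mse:.4f}")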
  6. Estimating LTI models (Friday 5/6, 15-17). Chapter 7 in LN. Chapter 7 in Ljung.
    • LTI models
    • Maximum likelihood estimation
    • Prediction error methods
    • HW: C.7.1, C.7.2, 7.1a-e (7.1f-g, 7.2). The remaining problems are also very illuminating for the intricacies of filtering; do more if you have time.
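    • Sketch (Python, my own illustration): for an ARX(1,1) model y(t) = a*y(t-1) + b*u(t-1) + e(t), the prediction error method with a quadratic criterion reduces to linear least squares.

      import numpy as np

      rng = np.random.default_rng(5)
      a, b, N = 0.7, 1.5, 500
      u = rng.standard_normal(N)
      y = np.zeros(N)
      for t in range(1, N):
          y[t] = a * y[t - 1] + b * u[t - 1] + 0.1 * rng.standard_normal()

      Phi = np.column_stack([y[:-1], u[:-1]])             # regressors [y(t-1), u(t-1)]
      theta = np.linalg.lstsq(Phi, y[1:], rcond=None)[0]  # minimize the summed squared prediction errors
      print(f"a_hat = {theta[0]:.3f} (true 0.7), b_hat = {theta[1]:.3f} (true 1.5)")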
  7. Asymptotic theory (Tuesday 9/6, 08-10). Chapter 8 in LN. Chapters 8-9 in Ljung (we will cover these chapters in Lecture 9, though).
    • Limits of random variables
    • Large sample properties of estimators
    • Large sample properties of biased estimators
    • HW: 8.1 (8.2).
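    • Sketch (Python, my own illustration): large-sample behavior of the sample mean; the variance of sqrt(N)*xbar settles at sigma^2, illustrating consistency and the 1/sqrt(N) convergence rate.

      import numpy as np

      rng = np.random.default_rng(6)
      sigma, runs = 1.0, 20_000
      for N in (10, 100, 1000):
          xbar = rng.normal(0.0, sigma, size=(runs, N)).mean(axis=1)
          print(f"N = {N:4d}: var of sqrt(N)*xbar = {N * xbar.var():.3f}")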
  8. Modeling and estimation using Gaussian processes (Friday 12/6, 15-17). Guest lecture by Dr. Riccardo Sven Risuleo, Klarna AB.
    • Basics, including Mercer's theorem
    • Impulse response estimation
    • Estimation of nonlinear systems, including Hammerstein and Wiener models
    • Modeling and estimation of uncertain input systems
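    • Sketch (Python, my own illustration; the kernel, length scale, and test function are assumptions): Gaussian-process regression with a squared-exponential kernel, computing the posterior mean of f given noisy observations y = f(x) + e.

      import numpy as np

      def kernel(A, B, ell=0.5):
          # Squared-exponential covariance between two sets of inputs.
          d = A[:, None] - B[None, :]
          return np.exp(-0.5 * (d / ell) ** 2)

      rng = np.random.default_rng(7)
      x = np.sort(rng.uniform(0, 5, 30))
      y = np.sin(x) + 0.1 * rng.standard_normal(30)
      xs = np.linspace(0, 5, 9)

      K = kernel(x, x) + 0.01 * np.eye(30)          # K(X, X) + sigma^2 * I
      mean = kernel(xs, x) @ np.linalg.solve(K, y)  # posterior mean at test points
      print(np.round(mean, 2))
      print(np.round(np.sin(xs), 2))                # ground truth for comparison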
  9. Asymptotic theory for the PEM (Friday 26/6, 10-12). Chapter 8 in LN. Chapter 8 and Sections 13.2, 13.4, and 13.5 in Ljung.
    • Identifiability
    • Informative experiments
    • Persistence of excitation
    • Consistency
    • Closed loop identification
    • HW: 9.2, 9.4, C.9.1 (9.1, 9.3).
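    • Sketch (Python, my own illustration): persistence of excitation; u is PE of order n when the n x n Toeplitz matrix of its autocovariances is nonsingular. A single sinusoid is PE of order 2 only, while white noise is PE of any order.

      import numpy as np

      def pe_matrix(u, n):
          # Toeplitz matrix of sample autocovariances r(0), ..., r(n-1).
          r = [np.mean(u[:len(u) - k] * u[k:]) for k in range(n)]
          return np.array([[r[abs(i - j)] for j in range(n)] for i in range(n)])

      rng = np.random.default_rng(8)
      t = np.arange(10_000)
      for name, u in [("sinusoid", np.sin(0.5 * t)),
                      ("white noise", rng.standard_normal(t.size))]:
          eig_min = np.linalg.eigvalsh(pe_matrix(u, 3)).min()
          print(f"{name}: smallest eigenvalue of R_3 = {eig_min:.4f}")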
  10. Asymptotic theory for the PEM, continued (Friday 28/8, 15-17). Section 9.5 in LN. Chapter 9 in Ljung.
    • Estimation criteria and the corresponding asymptotic covariance matrices
    • Geometric analysis
    • Reproducing kernel approach
    • SISO LTI systems
    • HW: 9.7, 9.8, C.9.2a-d (9.5, 9.6, C.9.2e-g).
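    • Sketch (Python, my own illustration): Monte Carlo check of the asymptotic covariance of a least-squares/PEM estimate, Cov(theta_hat) ~ sigma^2 * E[phi*phi^T]^(-1) / N, for y(t) = theta*u(t) + e(t) with unit-variance white input.

      import numpy as np

      rng = np.random.default_rng(9)
      theta0, sigma, N, runs = 1.0, 0.5, 200, 5000

      est = np.empty(runs)
      for i in range(runs):
          u = rng.standard_normal(N)
          y = theta0 * u + sigma * rng.standard_normal(N)
          est[i] = (u @ y) / (u @ u)          # least-squares estimate per run
      print(f"empirical variance: {est.var():.6f}")
      print(f"asymptotic formula: {sigma**2 / N:.6f}")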
  11. Experiment design (Wednesday 2/9). Slides + Chapter 13 in Ljung.
  12. Model structure selection and model validation (Friday 4/9). Slides + Chapter 16 in Ljung.
    • HW: C.9.3, C.9.4, C.9.6d-f, C.9.7, C.9.8 (C.9.6a-c). Data can be found here. These exercises will not be corrected; just make sure you understand what is going on.
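    • Sketch (Python, my own illustration): model order selection by AIC, N*log(V_N) + 2*dim(theta), for AR models fitted to data from an AR(2) process; the criterion should bottom out at order 2.

      import numpy as np

      rng = np.random.default_rng(10)
      N = 2000
      y = np.zeros(N)
      for t in range(2, N):
          y[t] = 1.2 * y[t - 1] - 0.5 * y[t - 2] + rng.standard_normal()

      for n in range(1, 5):
          # Regressors y(t-1), ..., y(t-n) for t = n, ..., N-1.
          Phi = np.column_stack([y[n - k - 1:N - k - 1] for k in range(n)])
          theta = np.linalg.lstsq(Phi, y[n:], rcond=None)[0]
          V = np.mean((y[n:] - Phi @ theta) ** 2)  # mean-squared prediction error
          print(f"order {n}: AIC = {(N - n) * np.log(V) + 2 * n:.1f}")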
  13. Computational aspects. Sections 10.1-10.3 and 10.5 in Ljung + slides.
    • Gradient based optimization
    • Convex relaxations
    • Integration by Markov Chain Monte Carlo (MCMC) methods
    • Nonlinear filtering using particle filters and smoothers
    • HW: None. Time to finish up. 
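    • Sketch (Python, my own illustration): a random-walk Metropolis sampler, a basic MCMC method, targeting the posterior of a Gaussian mean under a flat prior; the posterior mean should agree with the sample mean.

      import numpy as np

      rng = np.random.default_rng(11)
      x = rng.normal(2.0, 1.0, size=100)   # observed data

      def log_post(theta):                 # log-likelihood (flat prior)
          return -0.5 * np.sum((x - theta) ** 2)

      chain, theta = [], 0.0
      for _ in range(20_000):
          prop = theta + 0.3 * rng.standard_normal()   # random-walk proposal
          if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
              theta = prop                             # accept the proposal
          chain.append(theta)

      print(f"posterior mean ~ {np.mean(chain[5_000:]):.3f}, sample mean = {x.mean():.3f}")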
  14. Additional methods. Slides + Sections 7.3, 7.6, and 10.4-10.6 in Ljung. Own reading: Chapters 14-15 and 17 in Ljung.
    • Correlation-based methods
    • Subspace identification
    • Multi-step least-squares methods
    • Continuous time identification
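    • Sketch (Python, my own illustration): correlation analysis; with unit-variance white input, the cross-covariance between output and input recovers the impulse response, g(k) = R_yu(k)/R_u(0).

      import numpy as np

      rng = np.random.default_rng(12)
      g_true = np.array([1.0, 0.5, 0.25, 0.125])   # FIR system
      N = 100_000
      u = rng.standard_normal(N)
      y = np.convolve(u, g_true)[:N] + 0.1 * rng.standard_normal(N)

      # Sample cross-covariance between y(t) and u(t-k); R_u(0) = 1 here.
      g_hat = [np.mean(y[k:] * u[:N - k]) for k in range(6)]
      print(np.round(g_hat, 3))   # ~ [1, 0.5, 0.25, 0.125, 0, 0]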