
Changes between two versions

This page shows the changes to "Outline" between the version of 2018-10-06 16:27 by Håkan Hjalmarsson and the version of 2018-10-06 17:51 by Håkan Hjalmarsson.


Outline

0 Introduction

1 Essentials of Signals, Systems and Stochastic Processes
1.1 Probability Theory
1.1.1 Random Variables
1.1.2 Probability Distribution, Density and Events
    Bayesian Perspective
    Joint and Marginal Probability
    Conditional Probability and Bayes' Rule
    Operations on Random Variables
1.1.3 Expectation
    Variance, Covariance and Correlation
    Moments and the Moment Generating Function
1.1.4 Common Probability Density Functions
    Uniform Density
    Gaussian Density
    Multivariable Gaussian Density
    Chi-Squared Density
1.2 Stochastic Processes
1.2.1 Stationary Processes and the (Auto) Covariance and (Auto) Correlation Functions
1.2.2 Cross-Covariance and Cross-Correlation Functions
1.2.3 Power Spectral Density
1.2.4 Linear Systems Subject to Stochastic Input
1.3 Quasi-stationary Signals
1.4 Stochastic Convergence
1.4.1 Convergence in Mean
1.4.2 Convergence in Probability
1.4.3 Convergence with Probability 1
1.4.4 Convergence in Distribution
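
As a small illustration of 1.2.1 and 1.2.3, the sketch below (assuming Python with NumPy; the AR(1) coefficient and noise level are invented for the example) computes the sample autocovariance and a periodogram-based power spectral density estimate of a simulated stationary process.

import numpy as np

rng = np.random.default_rng(0)
N = 4096
a, sigma = 0.8, 1.0                       # AR(1): y[t] = a*y[t-1] + e[t], invented values
e = sigma * rng.standard_normal(N)
y = np.zeros(N)
for t in range(1, N):
    y[t] = a * y[t - 1] + e[t]

def sample_autocov(y, max_lag):
    # Biased sample autocovariance r(k) = (1/N) * sum_t y[t] * y[t-k]
    y = y - y.mean()
    n = len(y)
    return np.array([y[k:] @ y[:n - k] / n for k in range(max_lag + 1)])

r = sample_autocov(y, 20)
print("r(0), r(1):", r[0], r[1])          # theory: sigma^2/(1 - a^2) and a*r(0)

# Periodogram |Y_N(omega)|^2 / N as a crude power spectral density estimate
Y = np.fft.rfft(y - y.mean())
psd_estimate = np.abs(Y) ** 2 / N
freqs = np.fft.rfftfreq(N)                # normalized frequency, cycles per sample
print("estimated spectrum near frequency 0.1:", psd_estimate[np.argmin(np.abs(freqs - 0.1))])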

2 Estimation Methods
2.1 Minimum Mean Square Error Estimation
2.2 Maximum A Posteriori Estimation
2.3 Unbiased Parameter Estimation

Sufficient statistics, score function, Cramér-Rao lower bound, Maximum Likelihood, Rao-Blackwell
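
A minimal sketch of the topics in 2.3, assuming Python with NumPy and an invented Gaussian example: for i.i.d. Gaussian data with known variance, the maximum likelihood estimate of the mean is the sample mean, which is unbiased and attains the Cramér-Rao lower bound sigma^2 / N.

import numpy as np

rng = np.random.default_rng(1)
mu, sigma, N = 2.0, 1.5, 200              # invented true values and sample size
n_trials = 2000

# With sigma known, the ML estimate of mu is the sample mean.
estimates = np.array([rng.normal(mu, sigma, N).mean() for _ in range(n_trials)])
print("empirical variance of the ML estimate:", estimates.var())
print("Cramér-Rao lower bound               :", sigma ** 2 / N)
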
3 Minimum Mean Square Error Parameter Estimation
3.1 The Bias-Variance Error Trade-Off
3.2 Risk and Average Risk. The Bayes Estimator
Risk estimation methods

SURE, Empirical Bayes, Variational Bayes

3.3 Linear in the Parameters Models
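
One way to see the bias-variance trade-off of 3.1 in the linear in the parameters setting is a small Monte Carlo experiment with a ridge-regularized least-squares estimator. The sketch below assumes Python with NumPy; the dimensions, noise level and regularization values are invented for the example.

import numpy as np

rng = np.random.default_rng(2)
N, p, noise_std = 50, 10, 0.5             # invented problem size and noise level
theta_true = rng.standard_normal(p)
Phi = rng.standard_normal((N, p))         # fixed regressor matrix

def ridge(Phi, y, lam):
    # Regularized least squares: (Phi'Phi + lam*I)^{-1} Phi'y
    return np.linalg.solve(Phi.T @ Phi + lam * np.eye(Phi.shape[1]), Phi.T @ y)

for lam in (0.0, 1.0, 10.0):
    ests = np.array([ridge(Phi, Phi @ theta_true + noise_std * rng.standard_normal(N), lam)
                     for _ in range(500)])
    bias2 = np.sum((ests.mean(axis=0) - theta_true) ** 2)
    var = np.sum(ests.var(axis=0))
    # MSE decomposes as squared bias plus variance.
    print(f"lambda={lam:5.1f}  bias^2={bias2:.4f}  variance={var:.4f}  MSE={bias2 + var:.4f}")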

4 Numerical Algorithms
4.1 Optimization
4.2 Likelihood Optimization - the EM Method
4.3 Sampling

Markov Chain Monte Carlo methods - Metropolis-Hastings, Gibbs
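
The sampling topic in 4.3 can be illustrated with a random-walk Metropolis-Hastings chain. The sketch below assumes Python with NumPy; the target density (a standard Gaussian) and the step size are chosen only to keep the example short.

import numpy as np

rng = np.random.default_rng(3)

def log_target(x):
    return -0.5 * x ** 2                  # unnormalized log density of N(0, 1)

def metropolis_hastings(n_samples, step=1.0, x0=0.0):
    x, samples = x0, []
    for _ in range(n_samples):
        proposal = x + step * rng.standard_normal()
        # Accept with probability min(1, pi(proposal) / pi(x)); the symmetric
        # random-walk proposal density cancels in the acceptance ratio.
        if np.log(rng.uniform()) < log_target(proposal) - log_target(x):
            x = proposal
        samples.append(x)
    return np.array(samples)

samples = metropolis_hastings(20000)
print("sample mean and variance:", samples.mean(), samples.var())   # should approach 0 and 1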

5 Linear in the Parameters Models
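
A linear in the parameters model y = Phi * theta + e leads directly to a least-squares estimate. The sketch below assumes Python with NumPy and uses an invented second-order polynomial model as the regressor.

import numpy as np

rng = np.random.default_rng(4)
u = rng.uniform(-1, 1, 100)
theta_true = np.array([1.0, -2.0, 0.5])   # invented coefficients
y = theta_true[0] + theta_true[1] * u + theta_true[2] * u ** 2 + 0.1 * rng.standard_normal(100)

Phi = np.column_stack([np.ones_like(u), u, u ** 2])   # regressor matrix
theta_hat, *_ = np.linalg.lstsq(Phi, y, rcond=None)   # solves min ||y - Phi theta||^2
print("true parameters     :", theta_true)
print("estimated parameters:", theta_hat)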

6 Dynamical Models
6.1 Model Structures and Probabilistic Models
6.2 Estimation Methods
6.2.1 Maximum Likelihood Estimation
6.2.2 The Extended Invariance Principle
6.2.3 The Prediction Error Method

Optimal filtering, the Kalman filter, particle filtering and smoothing (see the sketch after this section)

6.2.4 Multi-Step Least-Squares Methods
6.2.5 Instrumental Variable Methods
6.2.6 Indirect Inference
6.3 Linear Models
6.3.1 Maximum Likelihood Estimation
6.3.2 The Prediction Error Method
6.4 Multi-Step Least-Squares Methods
6.5 Subspace Identification
6.6 Instrumental Variable Methods
6.7 Bayesian Methods
6.8 Time versus Frequency Domain Identification
6.9 Continuous Time Model Identification

6.10 Grey-Box Identification
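
The Kalman filter noted in the filtering item above can be illustrated with a minimal scalar example. The sketch below assumes Python with NumPy; the random-walk state model and the noise variances are invented for the example.

import numpy as np

rng = np.random.default_rng(5)
T, q, r = 200, 0.01, 0.25                 # horizon, process and measurement noise variances (invented)
x = np.cumsum(np.sqrt(q) * rng.standard_normal(T))      # true state: x[t+1] = x[t] + w[t]
y = x + np.sqrt(r) * rng.standard_normal(T)             # measurements: y[t] = x[t] + v[t]

x_hat, P = 0.0, 1.0                       # prior mean and variance
filtered = []
for t in range(T):
    P = P + q                             # time update: the random walk adds process noise
    K = P / (P + r)                       # Kalman gain
    x_hat = x_hat + K * (y[t] - x_hat)    # measurement update
    P = (1 - K) * P
    filtered.append(x_hat)

print("RMSE of raw measurements:", np.sqrt(np.mean((y - x) ** 2)))
print("RMSE of Kalman filter   :", np.sqrt(np.mean((np.array(filtered) - x) ** 2)))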

7 Model Quality
7.1 Variance Quantification
7.1.1 Fundamental Geometric Principles
7.1.2 Fundamental Structural Results
7.1.3 Variability of Estimated Frequency Response
7.1.4 Variability of Nonlinear System Estimates
7.1.5 Bootstrap Methods

8 Experiment Design
8.1 Identifiability
8.2 Persistence of Excitation
8.3 Input Signal Design
8.3.1 Common Input Signals
    PRBS
    Sums of Sine-Waves and Crest Factor Correction
8.4 Application Oriented Experiment Design
8.5 Adaptive Experiment Design
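
For the PRBS input under 8.3.1, the sketch below (assuming Python with NumPy) generates a maximum-length binary sequence with a 7-bit shift register and checks that its periodic autocorrelation is nearly impulse-like, which is one reason such signals are persistently exciting of high order (8.2).

import numpy as np

def prbs7(n_samples, seed=0b1111111):
    # 7-bit maximum-length shift register, period 2**7 - 1 = 127;
    # feedback is the XOR of the two most significant bits.
    state = seed & 0x7F                    # register contents, must be nonzero
    out = []
    for _ in range(n_samples):
        out.append(1.0 if (state & 1) else -1.0)          # map {0, 1} to {-1, +1}
        feedback = ((state >> 6) ^ (state >> 5)) & 1
        state = ((state << 1) | feedback) & 0x7F
    return np.array(out)

u = prbs7(127)                             # one full period
# The periodic autocorrelation is 1 at lag 0 and close to zero (-1/127) at all other lags.
rho = np.array([u @ np.roll(u, k) / len(u) for k in range(5)])
print("periodic autocorrelation at lags 0..4:", rho)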

9 Model Validation
9.1 Residual Whiteness Tests
9.2 Input to Residual Correlation Tests
9.3 Model Error Modelling
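
A basic residual whiteness test as in 9.1 checks whether the sample autocorrelations of the residuals stay within an approximate 95% confidence band. The sketch below assumes Python with NumPy and uses simulated white noise in place of actual model residuals.

import numpy as np

rng = np.random.default_rng(6)
N = 500
residuals = rng.standard_normal(N)         # stand-in for y - y_model; white by construction here

eps = residuals - residuals.mean()
r0 = eps @ eps / N
rho = np.array([(eps[k:] @ eps[:N - k] / N) / r0 for k in range(1, 21)])   # normalized autocorrelations

bound = 1.96 / np.sqrt(N)                  # approximate 95% band for white residuals
print("lags (of 20) outside the 95% band:", int(np.sum(np.abs(rho) > bound)))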

10 Application Examples
10.1 Closed Loop Identification
10.2 Network Models
10.3 Errors-in-Variables Models
10.4 Block-structured Nonlinear Models

10.5 Identification for Control