This is an introductory course in statistical estimation theory from a signal processing perspective. The aim is to provide the basic principles and tools needed to solve many estimation problems in signal processing and communications, and to serve as the necessary prerequisite for more advanced texts and research papers in the area. The course covers fundamental concepts such as sufficient statistics, the Rao-Blackwell theorem, and the Cramér-Rao lower bound on estimation accuracy. It also treats the most common estimation methods, including maximum likelihood, least squares, minimum variance, the method of moments, and Bayesian estimation. The course assumes some familiarity with basic matrix theory and statistics.
After the course the student should be able to:
1. Describe the difference between the classical and Bayesian approach to estimation; describe the notions of estimator bias, variance, and efficiency; and describe the notion of sufficient statistics and its meaning in minimum variance unbiased (MVU) estimation.
2. Formulate system models and parameter estimation problems and derive corresponding Cramer-Rao lower bounds and sufficient statistics. Prove optimality of estimators.
3. Apply appropriate estimators, including linear, least-squares, maximum-likelihood, and method-of-moments estimators, after considering estimation accuracy and complexity requirements.
4. Work with both real and complex valued data models.
5. Solve and analyze a real-world estimation problem.
Course main content: Introduction, Minimum Variance Unbiased Estimation, Cramér-Rao Lower Bound, Linear Estimators, Maximum Likelihood, Least Squares, The Method of Moments, Bayesian Methods, Extension to Complex Data.
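As a small taste of the topics listed above, the following sketch (written in Python with NumPy rather than the Matlab used in the course projects; all parameter values are illustrative) estimates a DC level A in white Gaussian noise. For this model the sample mean is both the maximum-likelihood and the minimum variance unbiased estimator, and its variance attains the Cramér-Rao lower bound sigma^2/N, which a Monte Carlo run can confirm empirically.

```python
import numpy as np

# Model: x[n] = A + w[n], with w[n] ~ N(0, sigma^2), n = 0, ..., N-1.
# The sample mean is the ML (and MVU) estimator of A; the CRLB is sigma^2 / N.
rng = np.random.default_rng(0)
A, sigma, N, trials = 1.0, 0.5, 100, 20000   # illustrative values

x = A + sigma * rng.standard_normal((trials, N))
A_hat = x.mean(axis=1)        # ML estimate of A in each Monte Carlo trial

crlb = sigma**2 / N           # Cramér-Rao lower bound on var(A_hat)
emp_var = A_hat.var()         # empirical variance across trials

print(f"CRLB               = {crlb:.5f}")
print(f"empirical variance = {emp_var:.5f}")
```

Because the sample mean is an efficient estimator here, the empirical variance printed at the end should closely match the bound; for most models no efficient estimator exists, which is one motivation for the bounds and methods studied in the course.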
Prerequisites: Proficiency in probability theory, calculus, and linear algebra (matrix analysis useful but not required).
Steven M. Kay, Fundamentals of Statistical Signal Processing: Estimation Theory, Prentice Hall, ISBN 0-13-345711-7.
Access to a computer with Matlab is required for the projects.