EE 527: Detection and Estimation Theory
(Spring 2012)
- Updates/Reminders
- Prerequisites: EE 224, EE 322, basic calculus & linear algebra. Suggested additional class: EE 523
- Location, Time: Marston 204, Tues-Thurs 2:10-3:30pm
- Instructor: Prof. Namrata Vaswani
- Office Hours: Monday 11-12, Tuesday 10-11, or by appointment, or stop by after 4pm to check if I'm free
- Office: 3121 Coover Hall
- Email: namrata AT iastate DOT edu
- Phone: 515-294-4012
- Grading policy
- Homeworks: 10%
- One midterm and one final exam: 30% x 2 = 60%
- One project / term paper: 30%
- Exam Dates, Project Details, and Deadlines
- Exam dates
- Midterm exam: Tuesday before Spring break
- Final exam: during finals week, see the exam schedule
- Project details: TBD
- Syllabus:
- Background material: recap of probability, calculus, linear algebra
- Estimation Theory
- Minimum variance unbiased estimation, best linear unbiased estimation
- Cramer-Rao lower bound (CRLB)
- Maximum Likelihood estimation (MLE): exact and approximate methods (EM, alternating max, etc.)
- Bayesian inference & Least Squares Estimation (from Kailath et al's Linear Estimation book)
- Basic ideas, adaptive techniques, Recursive LS, etc.
- Kalman filtering (sequential Bayes)
- Finite state Hidden Markov Models: forward-backward algorithm, Viterbi (ML state estimation), parameter estimation (forward-backward + EM)
- Graphical Models
- Applications: image processing, speech, communications (to be discussed with each topic)
- Sparse Recovery and Compressive Sensing introduction
- Monte Carlo methods: importance sampling, MCMC, particle filtering; applications in numerical integration (MMSE estimation or error probability computation) and in numerical optimization (e.g. annealing) (a small illustrative sketch appears at the end of this syllabus section)
- Detection Theory
- Likelihood Ratio testing, Bayes detectors
- Minimax detectors
- Multiple hypothesis tests
- Neyman-Pearson detectors (matched filter, estimator-correlator, etc.)
- Wald sequential test
- Generalized likelihood ratio tests (GLRTs), Wald and Rao scoring tests
- Applications
- The syllabus is similar to Prof. Dogandzic's EE 527, but I will cover least squares estimation, Kalman filtering, and Monte Carlo methods in more detail and will also discuss some image/video processing applications. Note that LSE and KF are also covered in EE 524, but different perspectives are always useful.
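- Note: the Monte Carlo topic above mentions numerical integration and error probability computation. The toy Python sketch below (my own illustration, not course material; the model and numbers are made up) shows the importance sampling idea on a small Gaussian tail probability, using a proposal shifted to the threshold.

      # Importance sampling sketch: estimate P(X > 4) for X ~ N(0,1).
      # Plain Monte Carlo rarely sees this event; a proposal centered at the
      # threshold concentrates samples where the integrand lives.
      import numpy as np
      from scipy.stats import norm

      rng = np.random.default_rng(0)
      N, threshold = 10_000, 4.0

      # Draw from the proposal q = N(threshold, 1).
      x = rng.normal(loc=threshold, scale=1.0, size=N)

      # Importance weights w = p(x) / q(x), with target p = N(0,1).
      w = norm.pdf(x, loc=0.0, scale=1.0) / norm.pdf(x, loc=threshold, scale=1.0)

      # IS estimate of E_p[ 1{X > threshold} ] = P(X > threshold).
      est = np.mean(w * (x > threshold))
      print(est, "vs exact", norm.sf(threshold))   # both around 3.2e-5

  The same weighted-sample idea, applied sequentially, is what the particle filtering handouts below build on.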
- Books:
- Textbook: S.M. Kay, Fundamentals of Statistical Signal Processing: Estimation Theory (Vol. 1) and Detection Theory (Vol. 2)
- References
- Kailath, Sayed, and Hassibi, Linear Estimation
- H.V. Poor, An Introduction to Signal Detection and Estimation
- H.L. Van Trees, Detection, Estimation, and Modulation Theory
- J.S. Liu, Monte Carlo Strategies in Scientific Computing. Springer-Verlag, 2001.
- B.D. Ripley, Stochastic Simulation. Wiley, 1987.
- Disability accommodation: If you have a documented disability and anticipate needing accommodations in this course, please make arrangements to meet with me soon. You will need to provide documentation of your disability to the Disability Resources (DR) office, located on the main floor of the Student Services Building, Room 1076, or call 515-294-7220.
- Homeworks
- Homework 8
- Homework 7: Due Thursday, March 29
- Homework 6: Due Thursday, March 2
- Problems 7.6, 7.7, 7.14, 7.18, 7.19
- Problems 8.24, 8.26, 8.27 (skip the Newton-Raphson part)
- Practice problems (I will suggest doing at least two): 8.4, 8.12, 8.28, 8.29
- Homework 5: Due Tuesday, Feb 28
- For each problem, also think about the implication and write it down - that's more interesting than just the algebra!
- Problems 4.2, 4.5, 4.6, 4.10, 4.13, 4.14
- Problems 6.1, 6.2, 6.5, 6.7, 6.9, 6.16
- Extra credit: 6.8, 6.14, 6.10
- Problem 4.6: What is the missed detection probability, assuming P_hat is Gaussian distributed with the computed mean and variance and you use a detection threshold of E[P_hat]/2? (A short note on the general form follows this homework.)
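- Note on the Problem 4.6 extension above (my reading of the question, not an official solution): if "missed detection" is taken to mean the event that P_hat falls below the threshold when the signal is present, then under the stated Gaussian assumption the answer is a standard normal tail value,

      P_{MD} = \Pr\big(\hat{P} < E[\hat{P}]/2\big)
             = \Phi\!\left(\frac{E[\hat{P}]/2 - E[\hat{P}]}{\sqrt{\operatorname{var}(\hat{P})}}\right)
             = Q\!\left(\frac{E[\hat{P}]}{2\sqrt{\operatorname{var}(\hat{P})}}\right),

  where \Phi is the standard normal CDF, Q = 1 - \Phi, and the mean and variance are the ones computed in the problem.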
- Homework 4: postponed, due Thursday, Feb 16 (originally due Tuesday, February 14)
- Problems 3.1, 3.3, 3.9, 3.11
- Do the following set of problems: practice set (will be graded for completion; ignore the deadlines written on it)
- Homework 3: DUE DATE POSTPONED TO THURSDAY FEB 9. (due Tuesday February 7)
- Problems 5.2, 5.3, 5.4, 5.5, 5.7, 5.13, 5.16
- Compute the covariance matrix of the MVUE estimators of mu and sigma^2 computed from N i.i.d. Gaussian observations and verify that it is not equal to the CRB (the standard result is sketched in the note after this homework)
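- Note on the covariance-vs-CRB item above: only the standard textbook result is sketched here (the derivation is the exercise), assuming the usual MVUEs, i.e. the sample mean and the unbiased sample variance,

      \hat{\mu} = \frac{1}{N}\sum_{n=1}^{N} x_n, \qquad
      \hat{\sigma}^2 = \frac{1}{N-1}\sum_{n=1}^{N} (x_n - \hat{\mu})^2,

      \operatorname{Cov}\!\begin{pmatrix} \hat{\mu} \\ \hat{\sigma}^2 \end{pmatrix}
        = \begin{pmatrix} \sigma^2/N & 0 \\ 0 & 2\sigma^4/(N-1) \end{pmatrix}
        \;\ne\;
        \mathrm{CRB} = \begin{pmatrix} \sigma^2/N & 0 \\ 0 & 2\sigma^4/N \end{pmatrix},

  so the sample-variance entry strictly exceeds its CRB even though the estimator is MVUE.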
- Homework 2: due Tues, Jan 31
- Problems 2.1, 2.4, 2.7, 2.9, 2.10 of Kay-I. Bonus: 2.8
- Homework 1: due Thurs, Jan 19 – postponed, due Mon, Jan 23.
- Handouts
- Introduction slides (do not use the grading policy from this one - refer to the grading policy given above)
- Review
- Classical Estimation
- Sparse Recovery / Compressive Sensing
- Bayesian estimation
- MMSE and linear MMSE estimation and Kalman filtering
- Some extra things
- Graphical models
- Graphical models (Prof. ALD's notes): an approach for handling conditional dependencies in Bayesian estimation
- Hidden Markov Models (HMM)
- Detection Theory:
- Monte Carlo
- Simple MC and Importance Sampling (IS)
- Markov Chain Monte Carlo (MCMC)
- Particle filtering
- Particle filtering
- Doucet et al's paper (2000)
- HMM model and other algorithms
- Importance sampling to approximate a PDF (sum of weighted Diracs)
- Sequential Importance Sampling (SIS)
- Resampling concept
- Particle filtering algorithm: SIS + Resample (a small illustrative sketch follows below)
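- Note: the "SIS + Resample" item above is the whole algorithm in outline. The toy Python sketch below (my own illustration on a made-up scalar model, not the Doucet et al. code or a course handout) shows how the two steps fit together in a bootstrap particle filter.

      # Bootstrap particle filter (SIS + resampling) for a toy scalar model:
      #   x_t = 0.9 x_{t-1} + w_t,  w_t ~ N(0,1)       (state; also the proposal)
      #   y_t = x_t + v_t,          v_t ~ N(0, 0.5^2)  (observation)
      import numpy as np
      from scipy.stats import norm

      rng = np.random.default_rng(1)
      T, Npart = 50, 500

      # Simulate data from the model.
      x_true = np.zeros(T)
      for t in range(1, T):
          x_true[t] = 0.9 * x_true[t - 1] + rng.normal()
      y = x_true + 0.5 * rng.normal(size=T)

      particles = rng.normal(size=Npart)            # samples of x_0
      weights = np.full(Npart, 1.0 / Npart)
      est = np.zeros(T)

      for t in range(T):
          if t > 0:
              # SIS: propose from the state transition (bootstrap proposal) ...
              particles = 0.9 * particles + rng.normal(size=Npart)
          # ... and reweight by the observation likelihood p(y_t | x_t).
          weights *= norm.pdf(y[t], loc=particles, scale=0.5)
          weights /= weights.sum()

          # The weighted-Dirac approximation of the posterior gives the MMSE estimate.
          est[t] = np.sum(weights * particles)

          # Resample when the effective sample size collapses (weight degeneracy).
          if 1.0 / np.sum(weights**2) < Npart / 2:
              idx = rng.choice(Npart, size=Npart, p=weights)
              particles, weights = particles[idx], np.full(Npart, 1.0 / Npart)

      print("filtering MSE:", np.mean((est - x_true) ** 2))

  Without the resampling step this is plain SIS and the weights degenerate onto a few particles after a handful of time steps; resampling is what keeps the weighted-Dirac posterior approximation usable.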
- Probability and Linear Algebra Recap
- Undergrad probability review (EE 322)
- Linear algebra review
- Readings
- Week 1: Probability handouts posted above
- Week 2: Chapters 2, 5 of Kay-I + Handout H1 (MVUE)
- Week 3: Chapter 5 (and Completeness handout in WebCT), Chapter 3 (and Handout H2 (CRLB))
- Week 4: Chapter 3, Chapter 4, and Chapter 7 (CRLB and MVUE, Linear Models, BLUE, ML Estimation) and Handouts H2, H3