EE 527: Detection and Estimation Theory (Spring 2008)

Syllabus | Updates/Reminders | Deadlines/Exam Dates | Homeworks | Exam Dates | Handouts/Reading Material | Readings/Class Schedule

- Updates/Reminders
- Homework 8 posted

- New reading material posted for the EM algorithm, alternating max, and least squares

- Deadlines/exam dates posted - please follow them

- Prerequisites: EE 224, EE 322, Basic calculus & linear algebra. Co-requisite: EE 523 (highly recommended if you take the exam option)
- Location, Time:
Howe 1242, Tues-Thurs 2:10-3:30

- Instructor: Prof Namrata Vaswani

- Office Hours: Tuesday, Wednesday 4-5pm, or by appointment, or stop by after 4pm to check if I'm free

- Office: 3121 Coover Hall

- Email: namrata AT iastate DOT edu Phone: 515-294-4012
- Grading policy
- Homeworks: 10%
- Two midterms (30% each) or One Exam-Project : 60%
- One mandatory project : 30%

- The Exam-Project option is available only to non-CSP students (and only if they want to take it)
- Project-related Deadlines and Exam Dates

- Exam-Project deadlines (for EDE and in-class)

- I need to know the topic (related to your research) and then I can help you find papers.
- This should get done by February 11
- Abstract with list of papers due (this should be approved by me): February 20
- Final Project report + presentation due by April 1
- Evaluation will be based on report, presentation, emailed code
- This needs to be a significant piece of work: survey + simulation testing + application on some real data for a problem from your research area + comparisons between 2-3 methods. I strongly recommend using Matlab.

- Mid-term Exams
- Mandatory Projects
- Should be finalized (abstract + papers submitted to me) by March 25
- Project will be due in dead week
- Evaluation will be based on report, presentation, emailed code

- Syllabus: See introduction slides

- Background material: recap of probability, calculus, linear algebra
- Estimation Theory
- Maximum Likelihood estimation (MLE): exact and approximate methods (EM, alternating max, etc)
- Cramer-Rao lower bound (CRLB)

- Minimum variance unbiased estimation, best linear unbiased estimation
- Bayesian inference & Least Squares Estimation (from Kailath et al's Linear Estimation book)

- Basic ideas, adaptive techniques, Recursive LS, etc

- Kalman filtering (sequential Bayes)
- Finite state Hidden Markov Models: forward-backward algorithm, Viterbi (ML state estimation), parameter estimation (f-b + EM)

- Monte Carlo methods: importance sampling, MCMC, particle filtering, applications in numerical integration (MMSE estimation or error probability computation) and in numerical optimization (e.g. annealing)

- Applications: image processing, speech, communications (to be discussed with each topic)

- Detection Theory
- Likelihood Ratio testing, Bayes detectors
- Minimax detectors
- Multiple hypothesis tests
- Neyman-Pearson detectors (matched filter, estimator-correlator etc)
- Wald sequential test
- Generalized likelihood ratio tests (GLRTs), Wald and Rao scoring tests
- Applications
- Syllabus is similar to Prof. Dogandzic's EE 527, but I will cover least squares estimation, Kalman filtering, and Monte Carlo methods in more detail, and will also discuss some image/video processing applications. Note that LSE and KF are also covered in EE 524, but different perspectives are always useful
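To make the detection-theory part of the syllabus concrete, here is a minimal illustrative sketch (not part of the course materials; Python rather than the recommended Matlab, and all numbers are made up). For a known DC level in white Gaussian noise, the likelihood ratio test reduces to a matched-filter (sample-mean) statistic, and the Neyman-Pearson threshold is set by the target false-alarm rate:

```python
import numpy as np

rng = np.random.default_rng(0)
A, sigma, N, trials = 1.0, 1.0, 25, 20_000

# H0: y = noise only; H1: y = A + noise. For a known DC level in white
# Gaussian noise, the LRT reduces to thresholding the matched-filter
# (sample-mean) statistic T(y).
y0 = rng.normal(0.0, sigma, (trials, N))
y1 = A + rng.normal(0.0, sigma, (trials, N))
T0, T1 = y0.mean(axis=1), y1.mean(axis=1)

# Neyman-Pearson: choose the threshold to meet a target false-alarm rate
gamma = np.quantile(T0, 0.99)        # target P_FA = 0.01
P_FA = float(np.mean(T0 > gamma))
P_D = float(np.mean(T1 > gamma))
```

With these (arbitrary) values, the detection probability at P_FA = 0.01 is close to one because the deflection A*sqrt(N)/sigma = 5 is large.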
- Books:
- Textbook: S.M. Kay's Fundamentals of Statistical Signal Processing: Estimation Theory (Vol 1), Detection Theory (Vol 2)
- References
- Kailath, Sayed and Hassibi, Linear Estimation
- V. Poor, An Introduction to Signal Detection and Estimation
- H. Van Trees, Detection, Estimation, and Modulation Theory
- J.S. Liu, Monte Carlo Strategies in Scientific Computing, Springer-Verlag, 2001
- B.D. Ripley, Stochastic Simulation, Wiley, 1987

- Disability accommodation: If you have a documented disability and anticipate needing accommodations in this course, please make arrangements to meet with me soon. You will need to provide documentation of your disability to the Disability Resources (DR) office, located on the main floor of the Student Services Building, Room 1076, or call 515-294-7220.
- Homeworks

- Homework 1:

- Problems 2.1, 2.4, 2.7, 2.9, 2.10 of Kay-I. Also think about 2.8 (if you have the background)
- Due: Tues Jan 29, Off campus: always get 1 extra week.
- Homework 2
- Problems 5.2, 5.3, 5.4, 5.5, 5.7, 5.13, 5.16

- Due: Tues Feb 5 (give me in Monday's makeup class or slip it into my office anytime on Tuesday), Off campus: one extra week
- Homework 3
- Problems 3.1, 3.3, 3.9, 3.11, plus the following problems:

- Compute the covariance matrix of the MVUE estimators of mu and sigma^2 computed from N i.i.d. Gaussian observations, and verify that it is not equal to the CRB

- Do the problems in this practice set (will be graded for completion; ignore the deadlines written on it)

- Due: Thurs Feb 13, Off campus: one extra week
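As a numerical sanity check on the covariance computation above, here is a hedged Monte Carlo sketch (Python rather than the recommended Matlab; the steps are the same, and the values of mu, sigma^2 and N are arbitrary). The MVUE of mu is the sample mean and the MVUE of sigma^2 is the unbiased sample variance; the simulation shows that var(sigma^2-hat) = 2 sigma^4/(N-1) exceeds the CRB value 2 sigma^4/N:

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma2, N, trials = 2.0, 3.0, 10, 200_000

x = rng.normal(mu, np.sqrt(sigma2), size=(trials, N))
mu_hat = x.mean(axis=1)            # MVUE of mu: sample mean
s2_hat = x.var(axis=1, ddof=1)     # MVUE of sigma^2: unbiased sample variance

emp_cov = np.cov(np.vstack([mu_hat, s2_hat]))    # empirical 2x2 covariance
crb = np.diag([sigma2 / N, 2 * sigma2**2 / N])   # CRB diagonal

# var(s2_hat) = 2*sigma^4/(N-1), strictly larger than the CRB 2*sigma^4/N,
# so this MVUE is not efficient even though it is minimum-variance unbiased
print(np.diag(emp_cov), np.diag(crb))
```

The mu component does meet its CRB (sigma^2/N); only the sigma^2 component exceeds it.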
- Homework 4

- For each problem, also think about the implication and write it up; that's more interesting than just the algebra!

- Part 1 (due Thurs Feb 21): 4.2, 4.5, 4.6, 4.10, 4.13, 4.14

- Problem 4.6: what is the missed detection probability, assuming P_hat is Gaussian distributed with the computed mean and variance and you use a detection threshold of E[P_hat]/2?

- Part 2 (due Mon Feb 25): 6.1, 6.2, 6.5, 6.7, 6.9, 6.16

- Extra (modified): 6.8, 6.14 (you can skip 6.10)

- In-class: Part 1 due Feb 21, Part 2 due Feb 25

- EDE: entire Homework 4 due Feb 28

- Homework 5: Due March 6 (on campus), March 10 (EDE)

- Take each exam problem and compute the ML estimate (I will email the exam once you've taken it; if I forget, please get it from me).

- Problems 7.6, 7.7, 7.14, 7.18, 7.19
- Homework 6: Due Thursday March 27 (for both EDE and oncampus)
- Homework 7: Good to submit by April 3 (if taking exam). Due April 7 (for both EDE and oncampus)
- Homework 8
- Kay-II (detection book), Problems 2.2, 2.6, 3.1, 3.8, 3.18, 3.20
- Practice set: Kay-II, Problems 2.4, 3.7, 3.9
- Homework 9 : Due Tuesday April 29

- Kay-II (detection book), Problems 4.8, 4.24, 4.28, 5.11, 5.17, 5.19

- Practice set: 4.21, 5.23, 5.18, 5.2
- If you find any typos or something missing, please email me. I will check and email you in a few days' time.

- Exam Dates
- Exam 1: Feb 26, Tuesday
- Syllabus: Chapters 2,3,4,5,6, and homeworks 1, 2, 3, 4. Also handouts 1,2,3 and part of 4 (skip what I did not teach)
- Exam 2: April 3
- Syllabus: ML estimation (Chapter 7), Least Squares estimation (my notes), Bayesian estimation (my notes, Poor's book)

- Handouts

- Introduction of material, syllabus, grading etc

- Probability, Signal Processing and Linear Algebra Recap:

- Introduction and Chapter 1 slides

- Single Random Variable: Discrete & Continuous
- Multiple Random Variables: Discrete, Multiple Random Variables: Continuous
- Some EE 322 homework problems to refresh your minds

- Signal processing review
- Linear algebra review: http://www.maths.mq.edu.au/~wchen/lnlafolder/lnla.html (Chapters 5-10, Chap 11,12 may also be useful)
- Classical Estimation Theory

- H1: Minimum Variance Unbiased Estimation (Prof. ALD's notes), Chapters 2, 3, 5 of Kay-I

- MVUE: Addendum on Complete Sufficient Statistics and RBLS Theorem (Sec 5.5, 5.6 of Kay-I): see handouts in WebCT

- Jointly Gaussian Random Variables: see handout in WebCT from Poor.

- H2: Cramer-Rao Bound and Efficient Estimators (Prof. ALD's notes), Chapter 5 of Kay-I

- H3: Linear models, Best Linear Unbiased Estimator (Prof ALD's notes), Chapters 4, 6 of Kay-I
- Maximum Likelihood (ML) Estimation:

- Chapter 7 of Kay-I

- Expectation Maximization: review paper 1, intuition for its convergence, a more rigorous version, the original Dempster et al. paper

- K-means clustering (a popular application of alternating maximization): K-means algorithm, K-means demo
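For a feel of alternating maximization in action, here is a small self-contained K-means sketch (illustrative only, in Python rather than the linked demo; the blob data and the farthest-point initialization are made up for this example). It alternates a nearest-centroid assignment step with a centroid-update step until the centers stop moving:

```python
import numpy as np

def kmeans(x, k, iters=100, seed=0):
    """Plain K-means via alternating steps: assign each point to its
    nearest centroid, then recompute each centroid as the cluster mean."""
    rng = np.random.default_rng(seed)
    # farthest-point initialization (a simple k-means++-style heuristic)
    centers = [x[rng.integers(len(x))]]
    for _ in range(k - 1):
        d = np.min([np.linalg.norm(x - c, axis=1) for c in centers], axis=0)
        centers.append(x[np.argmax(d)])
    centers = np.array(centers)
    for _ in range(iters):
        # assignment step: nearest centroid for every point
        d = np.linalg.norm(x[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # update step: each centroid becomes the mean of its points
        new = np.array([x[labels == j].mean(axis=0) for j in range(k)])
        if np.allclose(new, centers):   # converged: no center moved
            break
        centers = new
    return centers, labels

# two well-separated synthetic blobs around (0,0) and (5,5)
rng = np.random.default_rng(1)
x = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(5, 0.3, (50, 2))])
centers, labels = kmeans(x, k=2)
```

Each step can only decrease the within-cluster sum of squares, which is exactly the alternating-maximization argument made in class for EM-style methods.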

- Bayesian Estimation Theory

- Least squares (non-Bayesian and Bayesian): Least squares estimation handout
- MMSE estimation and jointly Gaussian random variables (covered in class): Poor Chapter IV-B

- Kalman filtering: Kalman Filtering handout, Poor Chapter V


- Hidden Markov Models: Hidden Markov Models (HMM) tutorial by Rabiner
- H4: Bayesian inference (Prof. ALD's notes)
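To connect the Kalman filtering handout with the "sequential Bayes" view, here is a minimal scalar sketch (illustrative only; the random-walk model and the noise variances q and r are made up). Each step does a predict (prior for x_k) and an update (posterior given the new measurement), which is recursive Bayesian estimation for a linear-Gaussian model:

```python
import numpy as np

rng = np.random.default_rng(0)
T, q, r = 200, 0.01, 1.0      # steps, process noise var, measurement noise var

# simulate a scalar random walk x_k = x_{k-1} + w_k, observed as y_k = x_k + v_k
x = np.cumsum(rng.normal(0, np.sqrt(q), T))
y = x + rng.normal(0, np.sqrt(r), T)

xh, P = 0.0, 1.0              # prior mean and variance of x_0
est = []
for yk in y:
    # predict: push the posterior through the state equation
    xh_pred, P_pred = xh, P + q
    # update: condition on the new measurement (posterior of x_k | y_1..y_k)
    K = P_pred / (P_pred + r)            # Kalman gain
    xh = xh_pred + K * (yk - xh_pred)
    P = (1 - K) * P_pred
    est.append(xh)
est = np.array(est)
```

The filtered estimates have much lower mean-squared error than the raw measurements, and P converges to the steady-state error variance.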
- Detection Theory:

- Chapters 2,3,4,5,6 of Kay-II, Chapter 3 of Poor, plus additional material (Chernoff bounds etc)

- Monte Carlo Methods:
- Importance Sampling, Bayesian IS: class notes, H4b (Prof ALD's notes)
- Sequential (Bayesian) Importance Sampling, Importance Resampling and the idea of particle filtering: PF handout, class notes

- Generating a random variable using a Unif (0,1) r.v. : H4b
- Rejection sampling: H4b
- Markov Chains: link to EE 523 handout 5, handout 6

- Markov Chain Monte Carlo (MCMC) idea: Metropolis Hastings and why it works: parts of H4c (Prof ALD's notes)
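As a concrete instance of the importance-sampling idea from the handouts (illustrative only; the target tail probability and proposal are chosen for the example), here is a short sketch estimating P(X > 4) for X ~ N(0,1). Naive Monte Carlo wastes nearly all samples on the bulk of the density, while sampling from a proposal centered in the tail and reweighting by p/q gives an accurate estimate:

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(0)
n, t = 100_000, 4.0

# naive Monte Carlo: almost no samples land beyond t = 4
x = rng.standard_normal(n)
naive = np.mean(x > t)

# importance sampling with proposal q = N(t, 1); weight w = p(y)/q(y)
y = rng.normal(t, 1.0, n)
log_w = -0.5 * y**2 + 0.5 * (y - t)**2     # log p(y) - log q(y); constants cancel
est = np.mean((y > t) * np.exp(log_w))

exact = 0.5 * (1 - erf(t / sqrt(2)))       # P(X > t), about 3.17e-5
```

Shifting the proposal into the region of interest is what makes the estimator usable here; the same reweighting idea underlies Bayesian IS and particle filtering.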

- Compressed Sensing or Compressive Sampling (CS):
- Compressed Sensing (my slides)
- Compressed Sensing: my notes (just the basic idea)

- Link to CS papers and tutorials (Rice univ. collection)
- IEEE Signal Proc Mag, March 2008: special issue on CS

- Key applications: MRI or CT reconstruction, single-pixel camera, coherent sensing of distributed phenomena, and many more appear every day!

- Some of my old handouts (prepared for other classes)

- Just for you to get a feel of some topics; I will modify these and post them when I teach the topic

- Particle Filtering (or Sequential Importance Sampling)
- Calculus of variations: basic idea
- Other estimation related image processing/computer vision topics
- Optical flow estimation (estimate motion of each pixel)
- Registration
- Readings
- Week 1: Probability handouts posted above

- Week 2: Chapters 2, 5 of Kay-I + Handout H1 (MVUE)

- Week 3: Chapter 3 of Kay-I + Handout H2

- Week 4: Chapter 4 of Kay-I
- Weeks 5, 6: ML estimation, Chapter 7.

- .....

- Next week: HMM (paper posted), Exam.
- Weeks of April 7 and April 14: Monte Carlo introduction, detection theory.

Back to Namrata Vaswani's homepage