**Spring 2013, EE 520: Special Topics
class on Sparse Recovery and Matrix Completion**

This will be a special topics course in which we will discuss recent work on (i) sparse recovery / compressive sensing, (ii) low-rank matrix completion, (iii) robust low-rank matrix completion (also referred to as robust PCA), and (iv) their applications. In the first month, we will review the background needed to understand these papers: (i) linear algebra, (ii) convex optimization, and (iii) probability. The rest of the semester will involve a discussion of the key papers on these topics.

Matrix completion solves the following problem: given a matrix with some missing or corrupted entries, how do I recover the original uncorrupted matrix if I know that it is low-rank? In matrix completion, the locations of the missing or corrupted entries are known, whereas in robust matrix completion (robust PCA), the corrupted locations are unknown as well. The corruptions (outliers) can be very large in magnitude, but occur in only a few rows or columns and thus can be modeled using a sparse matrix.
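The low-rank completion idea can be sketched numerically. The snippet below uses simple alternating projections (re-impose the observed entries, then truncate the SVD to rank r) as a toy stand-in for the nuclear-norm programs discussed in the course papers; the matrix size, rank, and sampling rate are illustrative choices, not from any specific paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 30, 2
# Rank-r ground-truth matrix (product of two Gaussian factors).
M = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))
# Observed-entry locations are known in (non-robust) matrix completion.
mask = rng.random((n, n)) < 0.6

X = np.zeros((n, n))
for _ in range(300):
    X[mask] = M[mask]                    # agree with the observed entries
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    X = (U[:, :r] * s[:r]) @ Vt[:r]      # project onto rank-r matrices

err = np.linalg.norm(X - M) / np.linalg.norm(M)
```

With 60% of entries observed and rank 2, the relative error `err` is driven close to zero, even though many entries of `M` were never seen.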

Sparse recovery (compressive sensing) answers the following question: how and when can I reconstruct a signal from fewer measurements than the signal length, by exploiting the fact that the signal is sparse or approximately sparse in some domain?
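As a concrete illustration, the snippet below recovers a sparse signal from compressive measurements using Orthogonal Matching Pursuit, one of the greedy methods on the topics list; the problem sizes (m, n, k) are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
m, n, k = 40, 100, 5                       # m measurements << n signal length
A = rng.standard_normal((m, n)) / np.sqrt(m)   # random measurement matrix
x = np.zeros(n)
support = rng.choice(n, k, replace=False)
x[support] = rng.standard_normal(k)        # k-sparse signal
y = A @ x                                  # compressive measurements

# Orthogonal Matching Pursuit: grow the support greedily,
# re-fitting by least squares at each step.
S, resid = [], y.copy()
for _ in range(k):
    S.append(int(np.argmax(np.abs(A.T @ resid))))  # best-matching column
    coef, *_ = np.linalg.lstsq(A[:, S], y, rcond=None)
    resid = y - A[:, S] @ coef

x_hat = np.zeros(n)
x_hat[S] = coef
err = np.linalg.norm(x_hat - x)
```

Although only 40 measurements of a length-100 signal are taken, the 5-sparse signal is recovered essentially exactly, which is the phenomenon the papers below make precise.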

**Class time and Location:** Tues-Thurs 9:30-10:50, 1041 Coover

**Instructor:** Prof. Namrata Vaswani

**Office Hours:** Tues 2-3, Wed 10-11, or by appointment

**Email:** namrata AT iastate.edu, **Phone:** 515-294-4012, **Office:** 3121 Coover Hall

**Grading:**

- 10% class participation
- 40% scribe notes for one paper that I will present
- 50% term paper (read and present on a paper/topic for one lecture, submit slides and a short report)

**Prerequisites/Corequisites:** EE 523 (or at least EE 322 level knowledge); EE 570 is also helpful.

**Disability accommodation:** If you have a documented disability and anticipate needing accommodations in this course, please make arrangements to meet with me soon. You will need to provide documentation of your disability to the Disability Resources (DR) office, located on the main floor of the Student Services Building, Room 1076, 515-294-7220.

**Papers that can be presented by students: see the bottom of this page.**

**Tentative List of Topics/Papers (suggestions from students are welcome)**

- Recap of linear algebra, probability and optimization
  - Linear algebra: parts of Chapters 0, 1, 2, 4, 5 of Matrix Analysis, Horn and Johnson
  - Optimization: subset of slides of Vandenberghe and Boyd: EE364a, EE364b
  - Probability: quick recap of EE322 notes, law of large numbers, high-probability tail bounds for random matrix eigenvalues
- Sparse Recovery / Compressive Sensing
  - Introduction
  - Convex relaxation methods
    - Decoding by Linear Programming (basis pursuit)
    - The Restricted Isometry Property and Its Implications for Compressed Sensing (constrained BPDN)
    - Just Relax: Convex Programming Methods for Identifying Sparse Signals (BPDN with penalty)
  - Greedy methods
  - CS resource page
  - Some applications
- Low-rank Matrix Completion
- Robust Low-rank Matrix Completion / Robust PCA

**Partial list of papers that can be presented**

- Greedy algorithms:
  - Subspace Pursuit paper
  - CoSaMP paper
- Structured sparse recovery
  - Richard G. Baraniuk, Volkan Cevher, Marco F. Duarte, Chinmay Hegde, "Model-based Compressive Sensing", IEEE Trans. Info. Th., April 2010
    - Renliang Gu
- MMV problem: MUSIC-based algorithms
  - Jong Min Kim, Ok Kyun Lee, Jong Chul Ye, "Compressive MUSIC: Revisiting the Link Between Compressive Sensing and Array Signal Processing", IEEE Trans. Info. Th., Jan 2012
    - Han Guo
  - Kiryung Lee, Yoram Bresler, Marius Junge, "Subspace Methods for Joint Sparse Recovery", IEEE Trans. Info. Th., 58(6): 3613-3641, 2012
- Sparse recovery in sparse noise (outliers)
  - J. Wright and Y. Ma, "Dense error correction via l1-minimization", IEEE Trans. Info. Th., vol. 56, no. 7, pp. 3540-3560, 2010
    - Animesh Biswas
- Matrix completion, greedy algorithm
  - K. Lee and Y. Bresler, "ADMiRA: Atomic Decomposition for Minimum Rank Approximation", IEEE Trans. Info. Th., vol. 56, no. 9, Sep. 2010
- Robust Matrix Completion / Robust PCA
  - V. Chandrasekaran, S. Sanghavi, P. A. Parrilo, and A. S. Willsky, "Rank-sparsity incoherence for matrix decomposition", SIAM Journal on Optimization, vol. 21, 2011
    - Brian Lois
  - H. Xu, C. Caramanis, and S. Sanghavi, "Robust PCA via outlier pursuit", IEEE Trans. Info. Th., vol. 58, no. 5, 2012
    - Kevin Palmowski
  - M. McCoy and J. Tropp, "Two proposals for robust PCA using semidefinite programming", arXiv:1012.1086v3, 2010
    - Nicole Kingsley
- J. A. Tropp, "User-friendly tail bounds for sums of random matrices", Foundations of Computational Mathematics, vol. 12, no. 4, 2012