Abstract

We investigate the behavior of the mean-square error (MSE) of low-rank and sparse matrix decomposition, in particular the special case of robust principal component analysis (RPCA) and its generalization, matrix completion and correction (MCC). We derive a constrained Cramér-Rao bound (CRB) for any locally unbiased estimator of the low-rank matrix and of the sparse matrix. We analyze the typical behavior of the constrained CRB for MCC, where a subset of the entries of the underlying matrix are randomly observed, some of which are grossly corrupted. We obtain approximate constrained CRBs using a concentration of measure argument. We design an alternating minimization procedure to compute the maximum-likelihood estimator (MLE) for the low-rank matrix and the sparse matrix, assuming knowledge of the rank and the sparsity level. For relatively small rank and sparsity level, we demonstrate numerically that the performance of the MLE approaches the constrained CRB when the signal-to-noise ratio is high. We discuss the implications of these bounds and compare them with the empirical performance of the accelerated proximal gradient algorithm as well as other existing bounds in the literature.
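The alternating minimization procedure described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: under Gaussian noise, maximizing the likelihood of Y = L + S + noise given the rank r and the sparsity level s reduces to least squares, so each step has a closed form — a truncated SVD for L with S fixed, and hard thresholding to the s largest-magnitude residual entries for S with L fixed. The function name and iteration count are illustrative assumptions.

```python
import numpy as np

def rpca_mle_altmin(Y, r, s, n_iters=100):
    """Alternating minimization sketch for the Gaussian MLE of a
    low-rank-plus-sparse model Y = L + S + noise, assuming the rank r
    and the sparsity level s (number of nonzeros of S) are known."""
    L = np.zeros_like(Y)
    S = np.zeros_like(Y)
    for _ in range(n_iters):
        # L-step: with S fixed, the MLE of L is the best rank-r
        # approximation of Y - S, given by a truncated SVD.
        U, sig, Vt = np.linalg.svd(Y - S, full_matrices=False)
        L = (U[:, :r] * sig[:r]) @ Vt[:r, :]
        # S-step: with L fixed, the MLE of S keeps the s
        # largest-magnitude entries of the residual Y - L.
        R = Y - L
        S = np.zeros_like(Y)
        if s > 0:
            idx = np.unravel_index(
                np.argsort(np.abs(R), axis=None)[-s:], R.shape)
            S[idx] = R[idx]
    return L, S
```

On a small synthetic problem with a rank-1 matrix plus a few large outliers and low noise, the iteration typically recovers both components; like other alternating schemes, it is a local method and is not guaranteed to find the global maximizer.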

Title: Constrained Cramér-Rao bound on robust principal component analysis
Original language: English
Article number: 5953532
Pages (from-to): 5070-5076
Number of pages: 7
Journal: IEEE Transactions on Signal Processing
Volume: 59
Issue number: 10
State: Published - Oct 2011

Keywords

  • Accelerated proximal gradient algorithm
  • constrained Cramér-Rao bound
  • matrix completion and correction
  • maximum likelihood estimation
  • mean-square error
  • robust principal component analysis
