2011 | OriginalPaper | Chapter
Sparse Recovery with Partial Support Knowledge
Authors: Khanh Do Ba, Piotr Indyk
Published in: Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques
Publisher: Springer Berlin Heidelberg
The goal of sparse recovery is to recover the (approximately) best $k$-sparse approximation $\hat{x}$ of an $n$-dimensional vector $x$ from linear measurements $Ax$ of $x$. We consider a variant of the problem which takes into account partial knowledge about the signal. In particular, we focus on the scenario where, after the measurements are taken, we are given a set $S$ of size $s$ that is supposed to contain most of the "large" coefficients of $x$. The goal is then to find $\hat{x}$ such that
$$ \| x-\hat{x}\|_p \le C \min_{\substack{k\text{-sparse }x'\\ \textrm{supp}(x') \subseteq S }} \|x-x'\|_q\enspace.$$
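To make the benchmark on the right-hand side concrete, here is a minimal sketch (ours, not from the paper) of how, for $p = q = 2$, the minimizer is obtained: the best $k$-sparse vector supported on $S$ keeps the $k$ largest-magnitude coordinates of $x$ inside $S$ and zeroes everything else. The function name and example values are illustrative.

```python
import math

def srpsk_benchmark(x, S, k):
    """Return (x_prime, error), where x_prime is the best k-sparse
    approximation of x with supp(x_prime) a subset of S, and
    error = ||x - x_prime||_2."""
    # Rank the candidate coordinates in S by magnitude and keep the top k.
    keep = set(sorted(S, key=lambda i: abs(x[i]), reverse=True)[:k])
    x_prime = [x[i] if i in keep else 0.0 for i in range(len(x))]
    err = math.sqrt(sum((a - b) ** 2 for a, b in zip(x, x_prime)))
    return x_prime, err

# Example (our own numbers): the large coefficients 5 and 4 lie in S,
# so a 2-sparse x' supported on S captures them; only the entries
# outside the kept support contribute to the error.
x = [5.0, 0.1, 4.0, 0.2, 3.0]
S = {0, 2, 3}
x_prime, err = srpsk_benchmark(x, S, k=2)
# x_prime == [5.0, 0.0, 4.0, 0.0, 0.0]
```

Note that this computes the *benchmark* given full access to $x$; the recovery problem itself must achieve an error within factor $C$ of it using only the measurements $Ax$.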
We refer to this formulation as the sparse recovery with partial support knowledge problem ($\textrm{SRPSK}$). We show that $\textrm{SRPSK}$ can be solved, up to an approximation factor of $C = 1 + \varepsilon$, using $O((k/\varepsilon) \log(s/k))$ measurements, for $p = q = 2$. Moreover, this bound is tight as long as $s = O(\varepsilon n / \log(n/\varepsilon))$. This completely resolves the asymptotic measurement complexity of the problem except for a very small range of the parameter $s$.
To the best of our knowledge, this is the first variant of $(1+\varepsilon)$-approximate sparse recovery for which the asymptotic measurement complexity has been determined.