Sparse signal recovery (nonlinear case)

Asked by Sébastien Loisel on MathOverflow, January 24, 2021

Let $K \subset \mathbb{R}^n$; it may be that $K$ is "very thin" (e.g. $K$ is a $k$-dimensional affine subset of $\mathbb{R}^n$, with $k \ll n$). I’m interested in the case where $K$ is nonconvex, but for theoretical elegance one can begin with the case where $K$ is indeed convex. Let $s \ll n$ (the "sparsity"). Consider the sparse signal recovery problem:
$$\tag{1} \label{e:genprob} \text{Find } x^* \in K \text{ such that } \|x^*\|_0 \leq s,$$
where $\|\cdot\|_0$ counts the number of nonzero entries.

My question: Although \eqref{e:genprob} is NP-hard in general, in some cases (almost any affine $K$, using LASSO/compressed sensing/convex optimization/$\ell^1$ regularization) one can solve \eqref{e:genprob} in polynomial time. As far as I can tell, these efficient algorithms do not work when $K$ is a general convex set (or worse, a fully general set!). Can anyone point me to some literature/algorithms for solving \eqref{e:genprob}, e.g. when $K$ is convex, with the same or similar astonishing efficiency as LASSO/compressed sensing? Are there any books or papers on this? Is there a theory for it? Is it an open problem? Aside from the convex optimization/compressed sensing/LASSO literature, I also looked in the "random sampling" (linear algebra) literature, because I use that, but didn’t find it there either.
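To make the affine case concrete, here is a minimal numerical sketch (a toy example of my own, assuming NumPy and SciPy are available; the names `A`, `b`, `x_star` are illustrative). For $K = \{x : Ax = b\}$ with a generic Gaussian $A$, the sparse point can be recovered by basis pursuit, $\min \|x\|_1$ subject to $Ax = b$, written as a linear program:

```python
import numpy as np
from scipy.optimize import linprog

# Toy instance: K = {x : Ax = b} with a generic Gaussian A and a planted
# s-sparse solution x*.  (All names here are illustrative only.)
rng = np.random.default_rng(0)
n, m, s = 200, 60, 5
A = rng.standard_normal((m, n))
x_star = np.zeros(n)
support = rng.choice(n, size=s, replace=False)
x_star[support] = rng.standard_normal(s)
b = A @ x_star

# Basis pursuit:  min ||x||_1  s.t.  Ax = b,  as a linear program.
# Split x = u - v with u, v >= 0, so that ||x||_1 = 1^T (u + v).
c = np.ones(2 * n)
A_eq = np.hstack([A, -A])
res = linprog(c, A_eq=A_eq, b_eq=b, bounds=(0, None), method="highs")
x_hat = res.x[:n] - res.x[n:]

print("recovery error   :", np.linalg.norm(x_hat - x_star))
print("nonzeros in x_hat:", int(np.sum(np.abs(x_hat) > 1e-8)), "(true sparsity:", s, ")")
```

With $m$ on the order of $s \log(n/s)$ generic linear measurements, the linear program above recovers $x^*$ exactly with high probability, and LP solvers run in polynomial time; this is the efficiency I would like to reproduce beyond the affine case.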

Mathematical discussion

This is related to sparse signal recovery, in the following way. Assume that $f(x)$ is some function, and consider the problem:
$$\tag{2} \label{e:optprob} \alpha^* = \inf_{x \in \mathbb{R}^n} f(x).$$
One assumes there is some $x^*$ with $\|x^*\|_0 = s$, such that $f(x^*)$ is minimal. Then, problem \eqref{e:optprob} can be converted to problem \eqref{e:genprob} by setting
$$\tag{3} \label{e:Kset} K = \{ x \in \mathbb{R}^n \;:\; f(x) = \alpha^* \}.$$
Consider the following relaxation of \eqref{e:genprob}, \eqref{e:Kset}:
$$\tag{4} \label{e:lasso} \inf_{z \in \mathbb{R}^n \;:\; \|z\|_1 \leq t} f(z).$$
When $f(x) = \|b-Ax\|_2$, the set $K$ is an affine subspace of $\mathbb{R}^n$ and \eqref{e:lasso} is called the LASSO method. It is well known that, for almost every $b,A$ admitting a sparse solution $x^*$ of \eqref{e:genprob}, the minimizer of \eqref{e:lasso} satisfies $z^*=x^*$ when $t = \|x^*\|_1$. However, this quickly breaks down in the nonlinear case. For example, let $K \subset \mathbb{R}^n$ be a closed convex set that contains an $s$-sparse point $x^*$, with $0 \notin K$, and define $f(x) = d(x,K)$, the Euclidean distance between $x$ and $K$. For almost any such $K$, \eqref{e:lasso} fails to recover a sparse solution of \eqref{e:genprob} for any $t \geq 0$.
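For concreteness, here is a minimal numerical sketch of this failure mode (a toy example, assuming $K$ is a Euclidean ball with a dense centre so that $x^*$ lies in the interior of $K$, and using only NumPy). With $t = \|x^*\|_1$, the point $x^*$ lies in both $K$ and the $\ell^1$ ball, so every point of their intersection attains the optimal value $d(z,K)=0$ in \eqref{e:lasso}; alternating projections returns one such minimizer, and nothing in \eqref{e:lasso} forces it to be the sparse one:

```python
import numpy as np

rng = np.random.default_rng(1)

def project_l1_ball(v, t):
    """Euclidean projection of v onto {z : ||z||_1 <= t} (Duchi et al., 2008)."""
    if np.sum(np.abs(v)) <= t:
        return v.copy()
    u = np.sort(np.abs(v))[::-1]
    css = np.cumsum(u)
    rho = np.nonzero(u * np.arange(1, v.size + 1) > css - t)[0][-1]
    theta = (css[rho] - t) / (rho + 1.0)
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def project_ball(v, c, r):
    """Euclidean projection of v onto K = {x : ||x - c||_2 <= r}."""
    d = v - c
    nd = np.linalg.norm(d)
    return v if nd <= r else c + (r / nd) * d

# Toy K: a ball of radius r around a dense centre c, containing the sparse x*
# in its interior, and far from the origin (so 0 is not in K).
n, s, r = 50, 3, 0.8
x_star = np.zeros(n)
x_star[:s] = [3.0, -2.0, 1.5]
direction = rng.standard_normal(n)
direction /= np.linalg.norm(direction)
c = x_star + 0.7 * direction              # ||c - x*||_2 = 0.7 < r, so x* is in K

# Solve (4) with t = ||x*||_1 by alternating projections: since x* lies in both
# sets, any point of the intersection attains the optimal value d(z, K) = 0.
t = np.sum(np.abs(x_star))
z = np.zeros(n)
for _ in range(5000):
    z = project_l1_ball(project_ball(z, c, r), t)

print("d(z, K)       :", max(np.linalg.norm(z - c) - r, 0.0))
print("||z||_1 / t   :", np.sum(np.abs(z)) / t)
print("nonzeros in z :", int(np.sum(np.abs(z) > 1e-6)), "(true sparsity:", s, ")")
```

Since $x^*$ is interior to $K$, the intersection of $K$ with the $\ell^1$ ball is full-dimensional, so one should expect the returned minimizer $z$ to have many more than $s$ nonzero entries.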

I have been working on an alternate convex relaxation of \eqref{e:genprob}, and I believe that my relaxation recovers the solution of \eqref{e:genprob} with high probability even if $K$ is nonaffine. Because my relaxation is convex, it admits highly efficient solvers. I tried to find references on this, since it seems like a very canonical problem, but to my surprise I have failed.
