Friday, March 20, 2015

Infinite-dimensional $\ell^1$ minimization and function approximation from pointwise data


Infinite-dimensional $\ell^1$ minimization and function approximation from pointwise data, by Ben Adcock
We consider the problem of approximating a function from finitely many pointwise samples using $\ell^1$ minimization techniques. In the first part of this paper, we introduce an infinite-dimensional approach to this problem. Three advantages of this approach are as follows. First, it provides interpolatory approximations in the absence of noise. Second, it does not require a priori bounds on the expansion tail in order to be implemented. In particular, the truncation strategy we introduce as part of this framework is completely independent of the function being approximated. Third, it allows one to explain the crucial role weights play in the minimization, namely, that of regularizing the problem and removing so-called aliasing phenomena. In the second part of this paper, we present a worst-case error analysis for this approach. We provide a general recipe for analyzing the performance of such techniques for arbitrary deterministic sets of points. Finally, we apply this recipe to show that weighted $\ell^1$ minimization with Jacobi polynomials leads to an optimal, convergent method for approximating smooth, one-dimensional functions from scattered data.
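To make the setup concrete, here is a minimal finite-dimensional sketch of the kind of weighted $\ell^1$ interpolation the abstract describes, written in Python with NumPy and CVXPY. Legendre polynomials (Jacobi polynomials with $\alpha = \beta = 0$) stand in for the general Jacobi family, and the truncation size N, the degree-growing weights, and the test function are illustrative assumptions rather than the paper's prescriptions; in particular, the paper's infinite-dimensional framework picks its truncation independently of the function, which this fixed-N sketch does not capture.

```python
import numpy as np
import cvxpy as cp

# Smooth test function sampled at scattered points in [-1, 1].
f = lambda x: 1.0 / (1.0 + 16.0 * x**2)
rng = np.random.default_rng(0)
m = 40                                   # number of pointwise samples
x = np.sort(rng.uniform(-1.0, 1.0, m))  # scattered sample points
y = f(x)                                 # noiseless samples

# Truncated Legendre dictionary: A[i, n] = P_n(x_i), with N > m columns.
N = 120
A = np.polynomial.legendre.legvander(x, N - 1)

# Degree-growing weights w_n; sqrt(2n + 1) is an illustrative choice
# in the spirit of the paper's weighted minimization, not its exact weights.
n = np.arange(N)
w = np.sqrt(2.0 * n + 1.0)

# Weighted l1 minimization with exact interpolation constraints
# (the noiseless, interpolatory case mentioned in the abstract).
c = cp.Variable(N)
problem = cp.Problem(cp.Minimize(cp.norm(cp.multiply(w, c), 1)),
                     [A @ c == y])
problem.solve()

# Evaluate the recovered expansion on a fine grid and report the sup error.
t = np.linspace(-1.0, 1.0, 1000)
approx = np.polynomial.legendre.legvander(t, N - 1) @ c.value
print("max pointwise error:", np.max(np.abs(approx - f(t))))
```

Letting the weights grow with the degree penalizes high-order coefficients, which is precisely the regularizing, anti-aliasing role the abstract attributes to them: with more columns than samples (N > m), many expansions interpolate the data exactly, and the weights bias the minimizer toward a smooth, low-degree one.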
 
 
